Oct 09 15:18:13 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 09 15:18:13 crc restorecon[4669]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 09 15:18:13 crc restorecon[4669]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc 
restorecon[4669]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 15:18:13 crc 
restorecon[4669]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 
15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 15:18:13 crc 
restorecon[4669]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 15:18:13 crc 
restorecon[4669]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:13
crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 
15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 09 15:18:13 crc 
restorecon[4669]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc 
restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc 
restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:13 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 
09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 
crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc 
restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc 
restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 15:18:14 crc restorecon[4669]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 15:18:14 crc 
restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 15:18:14 crc restorecon[4669]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 15:18:14 crc restorecon[4669]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 15:18:14 crc restorecon[4669]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 09 15:18:14 crc kubenswrapper[4719]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 09 15:18:14 crc kubenswrapper[4719]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 09 15:18:14 crc kubenswrapper[4719]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 09 15:18:14 crc kubenswrapper[4719]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 09 15:18:14 crc kubenswrapper[4719]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 09 15:18:14 crc kubenswrapper[4719]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.951584 4719 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955167 4719 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955182 4719 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955187 4719 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955190 4719 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955196 4719 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955201 4719 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955206 4719 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955211 4719 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955216 4719 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955220 4719 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955225 4719 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955230 4719 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955234 4719 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955238 4719 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955241 4719 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955245 4719 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955257 4719 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955261 4719 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955264 4719 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955268 4719 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955271 4719 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955275 4719 
feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955278 4719 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955282 4719 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955286 4719 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955289 4719 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955293 4719 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955296 4719 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955300 4719 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955303 4719 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955307 4719 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955311 4719 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955314 4719 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955318 4719 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955321 4719 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955325 4719 feature_gate.go:330] unrecognized 
feature gate: UpgradeStatus Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955328 4719 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955332 4719 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955335 4719 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955340 4719 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955343 4719 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955359 4719 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955363 4719 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955367 4719 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955371 4719 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955375 4719 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955378 4719 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955382 4719 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955385 4719 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955389 4719 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 
15:18:14.955392 4719 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955396 4719 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955401 4719 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955405 4719 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955409 4719 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955414 4719 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955417 4719 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955421 4719 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955424 4719 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955428 4719 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955431 4719 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955435 4719 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955438 4719 feature_gate.go:330] unrecognized feature gate: Example Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955442 4719 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955445 4719 feature_gate.go:330] 
unrecognized feature gate: ExternalOIDC Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955449 4719 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955452 4719 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955457 4719 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955461 4719 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955466 4719 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.955469 4719 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955567 4719 flags.go:64] FLAG: --address="0.0.0.0" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955579 4719 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955585 4719 flags.go:64] FLAG: --anonymous-auth="true" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955590 4719 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955597 4719 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955601 4719 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955606 4719 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955611 4719 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955615 4719 flags.go:64] FLAG: 
--authorization-webhook-cache-unauthorized-ttl="30s" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955620 4719 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955624 4719 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955628 4719 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955632 4719 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955636 4719 flags.go:64] FLAG: --cgroup-root="" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955640 4719 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955644 4719 flags.go:64] FLAG: --client-ca-file="" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955648 4719 flags.go:64] FLAG: --cloud-config="" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955652 4719 flags.go:64] FLAG: --cloud-provider="" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955656 4719 flags.go:64] FLAG: --cluster-dns="[]" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955661 4719 flags.go:64] FLAG: --cluster-domain="" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955665 4719 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955669 4719 flags.go:64] FLAG: --config-dir="" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955674 4719 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955678 4719 flags.go:64] FLAG: --container-log-max-files="5" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955683 4719 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955687 4719 flags.go:64] 
FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955692 4719 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955696 4719 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955700 4719 flags.go:64] FLAG: --contention-profiling="false" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955704 4719 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955708 4719 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955712 4719 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955716 4719 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955721 4719 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955726 4719 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955730 4719 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955736 4719 flags.go:64] FLAG: --enable-load-reader="false" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955741 4719 flags.go:64] FLAG: --enable-server="true" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955745 4719 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955751 4719 flags.go:64] FLAG: --event-burst="100" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955755 4719 flags.go:64] FLAG: --event-qps="50" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955759 4719 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 
15:18:14.955763 4719 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955767 4719 flags.go:64] FLAG: --eviction-hard="" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955772 4719 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955776 4719 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955781 4719 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955785 4719 flags.go:64] FLAG: --eviction-soft="" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955789 4719 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955792 4719 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955796 4719 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955800 4719 flags.go:64] FLAG: --experimental-mounter-path="" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955804 4719 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955808 4719 flags.go:64] FLAG: --fail-swap-on="true" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955812 4719 flags.go:64] FLAG: --feature-gates="" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955817 4719 flags.go:64] FLAG: --file-check-frequency="20s" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955821 4719 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955825 4719 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955829 4719 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 09 15:18:14 crc 
kubenswrapper[4719]: I1009 15:18:14.955835 4719 flags.go:64] FLAG: --healthz-port="10248" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955839 4719 flags.go:64] FLAG: --help="false" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955844 4719 flags.go:64] FLAG: --hostname-override="" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955848 4719 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955852 4719 flags.go:64] FLAG: --http-check-frequency="20s" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955856 4719 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955861 4719 flags.go:64] FLAG: --image-credential-provider-config="" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955865 4719 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955869 4719 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955874 4719 flags.go:64] FLAG: --image-service-endpoint="" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955879 4719 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955883 4719 flags.go:64] FLAG: --kube-api-burst="100" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955887 4719 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955892 4719 flags.go:64] FLAG: --kube-api-qps="50" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955896 4719 flags.go:64] FLAG: --kube-reserved="" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955900 4719 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955904 4719 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 09 15:18:14 crc 
kubenswrapper[4719]: I1009 15:18:14.955908 4719 flags.go:64] FLAG: --kubelet-cgroups="" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955912 4719 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955917 4719 flags.go:64] FLAG: --lock-file="" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955920 4719 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955924 4719 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955928 4719 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955935 4719 flags.go:64] FLAG: --log-json-split-stream="false" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955938 4719 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955942 4719 flags.go:64] FLAG: --log-text-split-stream="false" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955946 4719 flags.go:64] FLAG: --logging-format="text" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955950 4719 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955955 4719 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955959 4719 flags.go:64] FLAG: --manifest-url="" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955963 4719 flags.go:64] FLAG: --manifest-url-header="" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955968 4719 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955973 4719 flags.go:64] FLAG: --max-open-files="1000000" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955978 4719 flags.go:64] FLAG: --max-pods="110" Oct 09 15:18:14 crc 
kubenswrapper[4719]: I1009 15:18:14.955982 4719 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955986 4719 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955990 4719 flags.go:64] FLAG: --memory-manager-policy="None" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955994 4719 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955998 4719 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956002 4719 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956006 4719 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956023 4719 flags.go:64] FLAG: --node-status-max-images="50" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956027 4719 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956032 4719 flags.go:64] FLAG: --oom-score-adj="-999" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956036 4719 flags.go:64] FLAG: --pod-cidr="" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956040 4719 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956046 4719 flags.go:64] FLAG: --pod-manifest-path="" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956049 4719 flags.go:64] FLAG: --pod-max-pids="-1" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956053 4719 flags.go:64] FLAG: --pods-per-core="0" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956057 4719 
flags.go:64] FLAG: --port="10250" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956062 4719 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956066 4719 flags.go:64] FLAG: --provider-id="" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956070 4719 flags.go:64] FLAG: --qos-reserved="" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956074 4719 flags.go:64] FLAG: --read-only-port="10255" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956092 4719 flags.go:64] FLAG: --register-node="true" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956096 4719 flags.go:64] FLAG: --register-schedulable="true" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956100 4719 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956107 4719 flags.go:64] FLAG: --registry-burst="10" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956111 4719 flags.go:64] FLAG: --registry-qps="5" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956115 4719 flags.go:64] FLAG: --reserved-cpus="" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956119 4719 flags.go:64] FLAG: --reserved-memory="" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956125 4719 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956129 4719 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956133 4719 flags.go:64] FLAG: --rotate-certificates="false" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956137 4719 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956141 4719 flags.go:64] FLAG: --runonce="false" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956146 4719 flags.go:64] FLAG: 
--runtime-cgroups="/system.slice/crio.service" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956151 4719 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956156 4719 flags.go:64] FLAG: --seccomp-default="false" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956160 4719 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956164 4719 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956169 4719 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956174 4719 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956178 4719 flags.go:64] FLAG: --storage-driver-password="root" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956183 4719 flags.go:64] FLAG: --storage-driver-secure="false" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956187 4719 flags.go:64] FLAG: --storage-driver-table="stats" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956191 4719 flags.go:64] FLAG: --storage-driver-user="root" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956195 4719 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956199 4719 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956203 4719 flags.go:64] FLAG: --system-cgroups="" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956207 4719 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956213 4719 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956217 4719 flags.go:64] FLAG: --tls-cert-file="" Oct 09 15:18:14 crc 
kubenswrapper[4719]: I1009 15:18:14.956220 4719 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956226 4719 flags.go:64] FLAG: --tls-min-version="" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956230 4719 flags.go:64] FLAG: --tls-private-key-file="" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956234 4719 flags.go:64] FLAG: --topology-manager-policy="none" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956238 4719 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956242 4719 flags.go:64] FLAG: --topology-manager-scope="container" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956246 4719 flags.go:64] FLAG: --v="2" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956251 4719 flags.go:64] FLAG: --version="false" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956257 4719 flags.go:64] FLAG: --vmodule="" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956262 4719 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956266 4719 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956380 4719 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956386 4719 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956392 4719 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956398 4719 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956402 4719 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 09 15:18:14 crc 
kubenswrapper[4719]: W1009 15:18:14.956408 4719 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956412 4719 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956417 4719 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956421 4719 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956425 4719 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956428 4719 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956432 4719 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956435 4719 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956439 4719 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956443 4719 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956446 4719 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956449 4719 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956453 4719 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956456 4719 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956460 4719 feature_gate.go:330] unrecognized feature gate: 
MetricsCollectionProfiles Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956463 4719 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956467 4719 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956472 4719 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956476 4719 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956480 4719 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956484 4719 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956488 4719 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956491 4719 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956496 4719 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
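The long `flags.go:64` run above dumps every kubelet flag with its effective value, one entry per log record. Turning a capture like this into a plain flag/value list is a small pipeline; the sketch below uses a heredoc with two entries copied from the log to stand in for a real capture (e.g. `journalctl -u kubelet > kubelet.log`):

```shell
# Sample standing in for a real journal capture of the kubelet unit.
cat <<'EOF' > /tmp/flags-sample.log
Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.955632 4719 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956006 4719 flags.go:64] FLAG: --node-ip="192.168.126.11"
EOF

# Keep only the flag=value part of each flags.go:64 entry.
sed -n 's/.*FLAG: //p' /tmp/flags-sample.log
# Prints:
#   --cgroup-driver="cgroupfs"
#   --node-ip="192.168.126.11"
```

The same `sed` expression works on the full journal; non-`FLAG:` records are simply not printed because of `-n` plus the `p` flag on the substitution.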
Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956500 4719 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956504 4719 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956508 4719 feature_gate.go:330] unrecognized feature gate: Example Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956512 4719 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956515 4719 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956519 4719 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956522 4719 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956525 4719 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956529 4719 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956532 4719 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956536 4719 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956542 4719 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956545 4719 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956549 4719 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956553 4719 feature_gate.go:330] unrecognized feature gate: 
SigstoreImageVerification Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956556 4719 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956559 4719 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956563 4719 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956567 4719 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956570 4719 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956573 4719 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956577 4719 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956581 4719 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
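The `feature_gate.go:330` "unrecognized feature gate" warnings repeat for every configuration pass, so the same gate name appears several times in one startup. Rather than reading them entry by entry, they can be deduplicated and counted; a minimal sketch, again assuming the journal has been saved to a file (the heredoc stands in for that file):

```shell
# Sample capture: the same gate can recur across passes, as in the real log.
cat <<'EOF' > /tmp/gates-sample.log
Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956504 4719 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956508 4719 feature_gate.go:330] unrecognized feature gate: Example
Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966508 4719 feature_gate.go:330] unrecognized feature gate: Example
EOF

# Gate name is the last field of the matched text; count distinct gates.
grep -o 'unrecognized feature gate: .*' /tmp/gates-sample.log \
  | awk '{print $NF}' | sort | uniq -c | sort -rn
```

On a full capture this collapses hundreds of warning lines into one line per distinct gate, with the repeat count in front.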
Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956586 4719 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956590 4719 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956593 4719 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956597 4719 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956600 4719 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956604 4719 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956607 4719 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956612 4719 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
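Entries like the `Setting GA feature gate CloudDualStackNodeIPs=true` warning above flag gates that are still set explicitly even though they have graduated and will be removed. Listing just those assignments from a saved capture is a one-line `sed` (illustrative pipeline, same heredoc stand-in as before):

```shell
# Sample with two GA-gate warnings copied from the log.
cat <<'EOF' > /tmp/ga-sample.log
Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956581 4719 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956612 4719 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
EOF

# Capture "Name=value" up to the first period after the assignment.
sed -n 's/.*Setting GA feature gate \([^.]*\)\..*/\1/p' /tmp/ga-sample.log | sort -u
# Prints:
#   CloudDualStackNodeIPs=true
#   DisableKubeletCloudCredentialProviders=true
```

`sort -u` removes the duplicates produced by the repeated configuration passes, leaving one line per explicitly set GA gate.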
Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956616 4719 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956620 4719 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956624 4719 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956629 4719 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956633 4719 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956636 4719 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956640 4719 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956643 4719 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956647 4719 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956650 4719 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.956654 4719 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.956665 4719 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false 
UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.966050 4719 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.966066 4719 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966119 4719 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966124 4719 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966129 4719 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966134 4719 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966140 4719 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966144 4719 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966148 4719 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966153 4719 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966157 4719 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966161 4719 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966165 4719 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 09 15:18:14 crc 
kubenswrapper[4719]: W1009 15:18:14.966169 4719 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966173 4719 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966178 4719 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966182 4719 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966185 4719 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966188 4719 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966192 4719 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966195 4719 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966199 4719 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966202 4719 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966206 4719 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966209 4719 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966212 4719 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966216 4719 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966219 4719 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 09 15:18:14 crc 
kubenswrapper[4719]: W1009 15:18:14.966223 4719 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966227 4719 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966230 4719 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966234 4719 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966237 4719 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966241 4719 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966246 4719 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966251 4719 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966255 4719 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966259 4719 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966263 4719 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966267 4719 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966270 4719 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966274 4719 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy 
Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966279 4719 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966283 4719 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966287 4719 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966291 4719 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966295 4719 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966298 4719 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966302 4719 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966305 4719 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966309 4719 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966313 4719 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966316 4719 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966320 4719 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966324 4719 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966327 4719 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 09 15:18:14 crc 
kubenswrapper[4719]: W1009 15:18:14.966331 4719 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966334 4719 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966338 4719 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966341 4719 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966345 4719 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966360 4719 feature_gate.go:330] unrecognized feature gate: Example Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966365 4719 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966368 4719 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966372 4719 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966377 4719 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966381 4719 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966386 4719 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966390 4719 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966393 4719 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966397 4719 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966401 4719 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966406 4719 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.966411 4719 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966510 4719 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966516 4719 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966519 4719 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966523 4719 feature_gate.go:330] 
unrecognized feature gate: MetricsCollectionProfiles Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966526 4719 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966530 4719 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966533 4719 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966537 4719 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966541 4719 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966545 4719 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966548 4719 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966551 4719 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966555 4719 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966559 4719 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966562 4719 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966566 4719 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966569 4719 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966573 4719 feature_gate.go:330] unrecognized feature gate: Example Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 
15:18:14.966576 4719 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966580 4719 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966583 4719 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966587 4719 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966590 4719 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966594 4719 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966599 4719 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966603 4719 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966606 4719 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966609 4719 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966613 4719 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966616 4719 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966620 4719 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966623 4719 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966626 4719 
feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966630 4719 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966634 4719 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966638 4719 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966641 4719 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966645 4719 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966649 4719 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966654 4719 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966658 4719 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966662 4719 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966666 4719 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966670 4719 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966674 4719 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966678 4719 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966682 4719 feature_gate.go:330] 
unrecognized feature gate: GCPLabelsTags Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966686 4719 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966689 4719 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966693 4719 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966696 4719 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966700 4719 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966703 4719 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966708 4719 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966713 4719 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966717 4719 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966722 4719 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966726 4719 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966730 4719 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966733 4719 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966737 4719 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966740 4719 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966744 4719 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966748 4719 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966752 4719 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966756 4719 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966760 4719 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966764 4719 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966767 4719 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966771 4719 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 09 15:18:14 crc kubenswrapper[4719]: W1009 15:18:14.966775 4719 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.966781 4719 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.969370 4719 server.go:940] "Client rotation is on, will bootstrap in background" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.972858 4719 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.972937 4719 certificate_store.go:130] Loading cert/key pair from 
"/var/lib/kubelet/pki/kubelet-client-current.pem". Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.974507 4719 server.go:997] "Starting client certificate rotation" Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.974526 4719 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.974660 4719 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-01 22:26:33.902603997 +0000 UTC Oct 09 15:18:14 crc kubenswrapper[4719]: I1009 15:18:14.974704 4719 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 2023h8m18.927902093s for next certificate rotation Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.003419 4719 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.004844 4719 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.023293 4719 log.go:25] "Validated CRI v1 runtime API" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.053087 4719 log.go:25] "Validated CRI v1 image API" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.054671 4719 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.061853 4719 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-09-15-13-31-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.061881 4719 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} 
/dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.079034 4719 manager.go:217] Machine: {Timestamp:2025-10-09 15:18:15.073854899 +0000 UTC m=+0.583566214 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:d18dc188-15d4-4547-94df-d9149082a3a0 BootID:7d273987-9d8a-4a77-9956-ccb64e9e22c3 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:40:d6:3f Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 
Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:40:d6:3f Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:97:5d:fe Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:f2:a3:06 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:51:18:6f Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:6e:90:22 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:52:7c:41:a5:6f:40 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:12:c9:c2:16:db:af Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] 
SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.079336 4719 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.079529 4719 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.083489 4719 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.083684 4719 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.083720 4719 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSR
eserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.083924 4719 topology_manager.go:138] "Creating topology manager with none policy" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.083936 4719 container_manager_linux.go:303] "Creating device plugin manager" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.084503 4719 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.084533 4719 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.084729 4719 state_mem.go:36] "Initialized new in-memory state store" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.084820 4719 server.go:1245] "Using root directory" path="/var/lib/kubelet" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.090280 4719 kubelet.go:418] "Attempting to sync node with API server" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.090321 4719 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.090453 4719 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.090491 4719 kubelet.go:324] "Adding apiserver pod source" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.090516 4719 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 
15:18:15.096545 4719 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.097569 4719 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Oct 09 15:18:15 crc kubenswrapper[4719]: W1009 15:18:15.097703 4719 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.59:6443: connect: connection refused Oct 09 15:18:15 crc kubenswrapper[4719]: W1009 15:18:15.097688 4719 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.59:6443: connect: connection refused Oct 09 15:18:15 crc kubenswrapper[4719]: E1009 15:18:15.097988 4719 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.59:6443: connect: connection refused" logger="UnhandledError" Oct 09 15:18:15 crc kubenswrapper[4719]: E1009 15:18:15.097841 4719 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.59:6443: connect: connection refused" logger="UnhandledError" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.099406 4719 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 09 15:18:15 crc 
kubenswrapper[4719]: I1009 15:18:15.102810 4719 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.102905 4719 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.102959 4719 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.103013 4719 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.103074 4719 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.103131 4719 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.103184 4719 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.103236 4719 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.103285 4719 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.103334 4719 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.103424 4719 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.103490 4719 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.105391 4719 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.106033 4719 server.go:1280] "Started kubelet" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 
15:18:15.107341 4719 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.107448 4719 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.59:6443: connect: connection refused Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.107339 4719 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 09 15:18:15 crc systemd[1]: Started Kubernetes Kubelet. Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.108452 4719 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.108494 4719 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.108520 4719 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 20:04:21.264968621 +0000 UTC Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.108560 4719 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 796h46m6.15641112s for next certificate rotation Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.108746 4719 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.108762 4719 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.109056 4719 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 09 15:18:15 crc kubenswrapper[4719]: E1009 15:18:15.108911 4719 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.109378 4719 server.go:236] 
"Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 09 15:18:15 crc kubenswrapper[4719]: W1009 15:18:15.109737 4719 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.59:6443: connect: connection refused Oct 09 15:18:15 crc kubenswrapper[4719]: E1009 15:18:15.109911 4719 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.59:6443: connect: connection refused" logger="UnhandledError" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.110747 4719 server.go:460] "Adding debug handlers to kubelet server" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.110981 4719 factory.go:55] Registering systemd factory Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.111099 4719 factory.go:221] Registration of the systemd container factory successfully Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.111447 4719 factory.go:153] Registering CRI-O factory Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.111470 4719 factory.go:221] Registration of the crio container factory successfully Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.111537 4719 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.111557 4719 factory.go:103] Registering Raw factory Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.111570 4719 manager.go:1196] Started watching for new ooms in manager Oct 09 
15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.112049 4719 manager.go:319] Starting recovery of all containers Oct 09 15:18:15 crc kubenswrapper[4719]: E1009 15:18:15.112877 4719 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.59:6443: connect: connection refused" interval="200ms" Oct 09 15:18:15 crc kubenswrapper[4719]: E1009 15:18:15.116441 4719 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.59:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186cdbae1446daa8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-09 15:18:15.106001576 +0000 UTC m=+0.615712871,LastTimestamp:2025-10-09 15:18:15.106001576 +0000 UTC m=+0.615712871,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.121323 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.121501 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 
15:18:15.121603 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.121682 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.121747 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.121809 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.121862 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.121923 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.122002 4719 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.122061 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.122118 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.122173 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.122241 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.122310 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.122381 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.122437 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.122520 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.122575 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.122633 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.122692 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.122744 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.122799 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.122860 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.122920 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.122979 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123032 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123097 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" 
Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123184 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123205 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123221 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123233 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123247 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123258 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123268 4719 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123281 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123293 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123325 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123337 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123363 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123382 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123394 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123407 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123417 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123431 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123443 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123456 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123472 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123488 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123516 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123534 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123547 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123559 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123580 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123595 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123607 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123631 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123646 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123658 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123671 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123681 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123691 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123703 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123712 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123724 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123733 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123742 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123754 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123765 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123778 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123865 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123875 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123889 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123899 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123909 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123924 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123934 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123946 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123955 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123966 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123978 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.123988 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124001 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124011 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124022 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124035 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124045 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124059 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124073 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124084 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124096 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124107 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124119 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124129 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124139 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124152 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124163 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124200 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124210 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124220 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124234 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" 
seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124244 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124255 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124270 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124282 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124307 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124323 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124336 4719 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124370 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124387 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124399 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124413 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124432 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124443 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124456 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124467 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124478 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124489 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124502 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124512 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124558 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124604 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124614 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124624 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124637 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124685 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124699 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124709 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124819 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124881 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124896 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124939 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124964 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.124994 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.125006 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.125050 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.125083 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.125116 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.125125 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.125137 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.125146 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.125157 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.125166 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.125176 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" 
seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.125191 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.125221 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.129488 4719 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.129527 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.129546 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.129566 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.129580 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.129604 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.129618 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.129631 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.129643 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.129693 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" 
seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.129715 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.129729 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.129742 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.129755 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.129769 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.129780 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 09 15:18:15 crc 
kubenswrapper[4719]: I1009 15:18:15.129790 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.129801 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.129811 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.129822 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.129833 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.129843 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.129854 4719 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.129865 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.129875 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.129886 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.129905 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.129915 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.129924 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.129934 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.129944 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.129953 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.129963 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.129972 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.129982 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.130149 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.130158 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.130169 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.130178 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.130187 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.130197 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" 
seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.130206 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.130215 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.130225 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.130234 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.130243 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.130253 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.130261 4719 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.130270 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.130279 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.130289 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.130299 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.130309 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.130319 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.130328 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.130337 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.130369 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.130391 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.130401 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.130411 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.130421 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.130431 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.130441 4719 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.130449 4719 reconstruct.go:97] "Volume reconstruction finished" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.130456 4719 reconciler.go:26] "Reconciler: start to sync state" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.144711 4719 manager.go:324] Recovery completed Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.156957 4719 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.157334 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.158711 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.158751 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.158763 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.159416 4719 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.159442 4719 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.159466 4719 state_mem.go:36] "Initialized new in-memory state store" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.159703 4719 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.159789 4719 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.159859 4719 kubelet.go:2335] "Starting kubelet main sync loop" Oct 09 15:18:15 crc kubenswrapper[4719]: E1009 15:18:15.160026 4719 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 09 15:18:15 crc kubenswrapper[4719]: W1009 15:18:15.161574 4719 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.59:6443: connect: connection refused Oct 09 15:18:15 crc kubenswrapper[4719]: E1009 15:18:15.161698 4719 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.59:6443: connect: connection refused" logger="UnhandledError" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.176808 4719 policy_none.go:49] "None policy: Start" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.177795 4719 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.177874 4719 state_mem.go:35] "Initializing new in-memory state store" Oct 09 15:18:15 crc kubenswrapper[4719]: E1009 15:18:15.209143 4719 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.233750 4719 manager.go:334] "Starting Device Plugin manager" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.233802 4719 manager.go:513] "Failed to read data from 
checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.233813 4719 server.go:79] "Starting device plugin registration server" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.234284 4719 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.234308 4719 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.234520 4719 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.234640 4719 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.234661 4719 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 09 15:18:15 crc kubenswrapper[4719]: E1009 15:18:15.241471 4719 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.261051 4719 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.261202 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.262599 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.262636 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.262648 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.262841 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.263200 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.263275 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.263982 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.264012 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.264024 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.264379 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.264531 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.264572 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.264582 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.264609 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.264622 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.265923 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.265945 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.265954 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.265972 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.266003 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.266015 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.266069 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.266238 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.266305 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.267217 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.267248 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.267254 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.267285 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.267284 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.267389 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.267430 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.267700 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.267736 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.268130 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.268336 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.268468 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.268732 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.268770 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.268793 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.268856 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.268992 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.270493 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.270517 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.270527 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:15 crc kubenswrapper[4719]: E1009 15:18:15.314483 4719 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.59:6443: connect: connection refused" interval="400ms" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.332164 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.332304 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.332416 4719 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.332491 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.332576 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.332650 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.332724 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.332815 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" 
(UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.332944 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.333017 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.333090 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.333154 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.333216 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod 
\"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.333283 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.333378 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.334811 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.335868 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.335910 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.335920 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.335945 4719 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 09 15:18:15 crc kubenswrapper[4719]: E1009 15:18:15.336493 4719 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.59:6443: connect: connection refused" node="crc" Oct 09 15:18:15 crc 
kubenswrapper[4719]: I1009 15:18:15.434396 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.434747 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.434769 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.434681 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.434885 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.434813 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" 
(UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.434912 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.434917 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.434960 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.434969 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.434941 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 15:18:15 crc 
kubenswrapper[4719]: I1009 15:18:15.434992 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.435009 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.435037 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.435058 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.435078 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.435099 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod 
\"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.435104 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.435135 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.435144 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.435167 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.435144 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 15:18:15 crc 
kubenswrapper[4719]: I1009 15:18:15.435117 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.435202 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.435203 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.435234 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.435249 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.435271 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" 
(UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.435296 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.435381 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.536607 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.538040 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.538073 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.538086 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.538110 4719 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 09 15:18:15 crc kubenswrapper[4719]: E1009 15:18:15.538496 4719 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.59:6443: connect: connection refused" node="crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 
15:18:15.587917 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.596522 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.618835 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: W1009 15:18:15.625158 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-3141f38b3dc29b2301d552642ae603b0715ea0bac506c0d4f1cbb24c5eaf16ae WatchSource:0}: Error finding container 3141f38b3dc29b2301d552642ae603b0715ea0bac506c0d4f1cbb24c5eaf16ae: Status 404 returned error can't find the container with id 3141f38b3dc29b2301d552642ae603b0715ea0bac506c0d4f1cbb24c5eaf16ae Oct 09 15:18:15 crc kubenswrapper[4719]: W1009 15:18:15.629754 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-5aae14ef230474067484d171f341af8196ba0e8d9c31a2c80cabbe4141702331 WatchSource:0}: Error finding container 5aae14ef230474067484d171f341af8196ba0e8d9c31a2c80cabbe4141702331: Status 404 returned error can't find the container with id 5aae14ef230474067484d171f341af8196ba0e8d9c31a2c80cabbe4141702331 Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.640404 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: W1009 15:18:15.644545 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-aeae88b79e8a63b0b9fbba7577547e977285ceebfc89e85172d93fcffdae522f WatchSource:0}: Error finding container aeae88b79e8a63b0b9fbba7577547e977285ceebfc89e85172d93fcffdae522f: Status 404 returned error can't find the container with id aeae88b79e8a63b0b9fbba7577547e977285ceebfc89e85172d93fcffdae522f Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.647631 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 09 15:18:15 crc kubenswrapper[4719]: W1009 15:18:15.654126 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-b25666197fa649c7fc466fee8bf5dbf75ce39ea75eab1ddf4f93f321f0bbd431 WatchSource:0}: Error finding container b25666197fa649c7fc466fee8bf5dbf75ce39ea75eab1ddf4f93f321f0bbd431: Status 404 returned error can't find the container with id b25666197fa649c7fc466fee8bf5dbf75ce39ea75eab1ddf4f93f321f0bbd431 Oct 09 15:18:15 crc kubenswrapper[4719]: W1009 15:18:15.663846 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-1067ab81a2029caeb62fe4cb8d41e2d562e45e3c7c219299f32f1167e11ba353 WatchSource:0}: Error finding container 1067ab81a2029caeb62fe4cb8d41e2d562e45e3c7c219299f32f1167e11ba353: Status 404 returned error can't find the container with id 1067ab81a2029caeb62fe4cb8d41e2d562e45e3c7c219299f32f1167e11ba353 Oct 09 15:18:15 crc kubenswrapper[4719]: E1009 15:18:15.715874 4719 controller.go:145] "Failed to 
ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.59:6443: connect: connection refused" interval="800ms" Oct 09 15:18:15 crc kubenswrapper[4719]: W1009 15:18:15.920420 4719 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.59:6443: connect: connection refused Oct 09 15:18:15 crc kubenswrapper[4719]: E1009 15:18:15.920507 4719 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.59:6443: connect: connection refused" logger="UnhandledError" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.938790 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.940980 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.941026 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.941038 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:15 crc kubenswrapper[4719]: I1009 15:18:15.941067 4719 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 09 15:18:15 crc kubenswrapper[4719]: E1009 15:18:15.941538 4719 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": 
dial tcp 38.102.83.59:6443: connect: connection refused" node="crc" Oct 09 15:18:16 crc kubenswrapper[4719]: W1009 15:18:16.016134 4719 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.59:6443: connect: connection refused Oct 09 15:18:16 crc kubenswrapper[4719]: E1009 15:18:16.016242 4719 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.59:6443: connect: connection refused" logger="UnhandledError" Oct 09 15:18:16 crc kubenswrapper[4719]: W1009 15:18:16.084825 4719 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.59:6443: connect: connection refused Oct 09 15:18:16 crc kubenswrapper[4719]: E1009 15:18:16.084914 4719 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.59:6443: connect: connection refused" logger="UnhandledError" Oct 09 15:18:16 crc kubenswrapper[4719]: I1009 15:18:16.108960 4719 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.59:6443: connect: connection refused Oct 09 15:18:16 crc kubenswrapper[4719]: I1009 15:18:16.166078 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b628c71ffc4577dac4247fca1780e229a260bd382075e7eeb15d7f71fa688c27"} Oct 09 15:18:16 crc kubenswrapper[4719]: I1009 15:18:16.166174 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1067ab81a2029caeb62fe4cb8d41e2d562e45e3c7c219299f32f1167e11ba353"} Oct 09 15:18:16 crc kubenswrapper[4719]: I1009 15:18:16.166261 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:16 crc kubenswrapper[4719]: I1009 15:18:16.167267 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b37949ed51a379d34fab6bf766fd7e35d376af137b55b6f12e8bef8495ab5281"} Oct 09 15:18:16 crc kubenswrapper[4719]: I1009 15:18:16.167591 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b25666197fa649c7fc466fee8bf5dbf75ce39ea75eab1ddf4f93f321f0bbd431"} Oct 09 15:18:16 crc kubenswrapper[4719]: I1009 15:18:16.167531 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:16 crc kubenswrapper[4719]: I1009 15:18:16.167623 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:16 crc kubenswrapper[4719]: I1009 15:18:16.167635 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:16 crc kubenswrapper[4719]: I1009 15:18:16.169063 4719 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d"} Oct 09 15:18:16 crc kubenswrapper[4719]: I1009 15:18:16.169088 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"aeae88b79e8a63b0b9fbba7577547e977285ceebfc89e85172d93fcffdae522f"} Oct 09 15:18:16 crc kubenswrapper[4719]: I1009 15:18:16.169174 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:16 crc kubenswrapper[4719]: I1009 15:18:16.169850 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:16 crc kubenswrapper[4719]: I1009 15:18:16.169869 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:16 crc kubenswrapper[4719]: I1009 15:18:16.169877 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:16 crc kubenswrapper[4719]: I1009 15:18:16.170623 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284"} Oct 09 15:18:16 crc kubenswrapper[4719]: I1009 15:18:16.170674 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3141f38b3dc29b2301d552642ae603b0715ea0bac506c0d4f1cbb24c5eaf16ae"} Oct 09 15:18:16 crc kubenswrapper[4719]: I1009 15:18:16.170811 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:16 crc kubenswrapper[4719]: 
I1009 15:18:16.171711 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:16 crc kubenswrapper[4719]: I1009 15:18:16.171733 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:16 crc kubenswrapper[4719]: I1009 15:18:16.171741 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:16 crc kubenswrapper[4719]: I1009 15:18:16.172790 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"a73e5700d7dbee6fac767db433a82521b7af9241107369d3be4aa00593128763"} Oct 09 15:18:16 crc kubenswrapper[4719]: I1009 15:18:16.172897 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"5aae14ef230474067484d171f341af8196ba0e8d9c31a2c80cabbe4141702331"} Oct 09 15:18:16 crc kubenswrapper[4719]: I1009 15:18:16.173008 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:16 crc kubenswrapper[4719]: I1009 15:18:16.173830 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:16 crc kubenswrapper[4719]: I1009 15:18:16.173921 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:16 crc kubenswrapper[4719]: I1009 15:18:16.174005 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:16 crc kubenswrapper[4719]: W1009 15:18:16.250418 4719 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.59:6443: connect: connection refused Oct 09 15:18:16 crc kubenswrapper[4719]: E1009 15:18:16.250487 4719 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.59:6443: connect: connection refused" logger="UnhandledError" Oct 09 15:18:16 crc kubenswrapper[4719]: E1009 15:18:16.516858 4719 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.59:6443: connect: connection refused" interval="1.6s" Oct 09 15:18:16 crc kubenswrapper[4719]: I1009 15:18:16.742704 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:16 crc kubenswrapper[4719]: I1009 15:18:16.744715 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:16 crc kubenswrapper[4719]: I1009 15:18:16.744757 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:16 crc kubenswrapper[4719]: I1009 15:18:16.744767 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:16 crc kubenswrapper[4719]: I1009 15:18:16.744819 4719 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 09 15:18:16 crc kubenswrapper[4719]: E1009 15:18:16.745314 4719 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.59:6443: connect: connection refused" node="crc" Oct 09 15:18:17 crc 
kubenswrapper[4719]: I1009 15:18:17.109098 4719 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.59:6443: connect: connection refused Oct 09 15:18:17 crc kubenswrapper[4719]: I1009 15:18:17.178528 4719 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d" exitCode=0 Oct 09 15:18:17 crc kubenswrapper[4719]: I1009 15:18:17.178606 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d"} Oct 09 15:18:17 crc kubenswrapper[4719]: I1009 15:18:17.178649 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:17 crc kubenswrapper[4719]: I1009 15:18:17.179571 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:17 crc kubenswrapper[4719]: I1009 15:18:17.179604 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:17 crc kubenswrapper[4719]: I1009 15:18:17.179617 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:17 crc kubenswrapper[4719]: I1009 15:18:17.180157 4719 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284" exitCode=0 Oct 09 15:18:17 crc kubenswrapper[4719]: I1009 15:18:17.180217 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284"} Oct 09 15:18:17 crc kubenswrapper[4719]: I1009 15:18:17.180334 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:17 crc kubenswrapper[4719]: I1009 15:18:17.181011 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:17 crc kubenswrapper[4719]: I1009 15:18:17.181037 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:17 crc kubenswrapper[4719]: I1009 15:18:17.181050 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:17 crc kubenswrapper[4719]: I1009 15:18:17.181117 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:17 crc kubenswrapper[4719]: I1009 15:18:17.182185 4719 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="a73e5700d7dbee6fac767db433a82521b7af9241107369d3be4aa00593128763" exitCode=0 Oct 09 15:18:17 crc kubenswrapper[4719]: I1009 15:18:17.182274 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"a73e5700d7dbee6fac767db433a82521b7af9241107369d3be4aa00593128763"} Oct 09 15:18:17 crc kubenswrapper[4719]: I1009 15:18:17.182379 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:17 crc kubenswrapper[4719]: I1009 15:18:17.182770 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:17 crc kubenswrapper[4719]: I1009 15:18:17.182793 4719 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:17 crc kubenswrapper[4719]: I1009 15:18:17.182804 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:17 crc kubenswrapper[4719]: I1009 15:18:17.183860 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:17 crc kubenswrapper[4719]: I1009 15:18:17.183906 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:17 crc kubenswrapper[4719]: I1009 15:18:17.183920 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:17 crc kubenswrapper[4719]: I1009 15:18:17.185616 4719 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="b628c71ffc4577dac4247fca1780e229a260bd382075e7eeb15d7f71fa688c27" exitCode=0 Oct 09 15:18:17 crc kubenswrapper[4719]: I1009 15:18:17.185686 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"b628c71ffc4577dac4247fca1780e229a260bd382075e7eeb15d7f71fa688c27"} Oct 09 15:18:17 crc kubenswrapper[4719]: I1009 15:18:17.185771 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:17 crc kubenswrapper[4719]: I1009 15:18:17.186391 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:17 crc kubenswrapper[4719]: I1009 15:18:17.186415 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:17 crc kubenswrapper[4719]: I1009 15:18:17.186425 4719 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:17 crc kubenswrapper[4719]: I1009 15:18:17.193704 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ed935aaa4f5122234731f8c22ec3d4ffeba8b500bfb51bf97414f39438da2f68"} Oct 09 15:18:17 crc kubenswrapper[4719]: I1009 15:18:17.193778 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d7d165d88c0d88fb4b080bf594e5258fb74f33c521332c85bb9f5ef5b5d9fdab"} Oct 09 15:18:17 crc kubenswrapper[4719]: I1009 15:18:17.193801 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e15a1b9cee40ae4a30df34bde2f4dd9436cf3ff915293ea1e1431e8abd581423"} Oct 09 15:18:17 crc kubenswrapper[4719]: I1009 15:18:17.193928 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:17 crc kubenswrapper[4719]: I1009 15:18:17.195441 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:17 crc kubenswrapper[4719]: I1009 15:18:17.195488 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:17 crc kubenswrapper[4719]: I1009 15:18:17.195510 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:18 crc kubenswrapper[4719]: I1009 15:18:18.108876 4719 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 
38.102.83.59:6443: connect: connection refused Oct 09 15:18:18 crc kubenswrapper[4719]: E1009 15:18:18.117479 4719 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.59:6443: connect: connection refused" interval="3.2s" Oct 09 15:18:18 crc kubenswrapper[4719]: I1009 15:18:18.200856 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"ff1066e8910a7aa889e7cc5c7b2735a240197a60b66c9471b4fda297dba4176f"} Oct 09 15:18:18 crc kubenswrapper[4719]: I1009 15:18:18.201041 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:18 crc kubenswrapper[4719]: I1009 15:18:18.202067 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:18 crc kubenswrapper[4719]: I1009 15:18:18.202117 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:18 crc kubenswrapper[4719]: I1009 15:18:18.202129 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:18 crc kubenswrapper[4719]: I1009 15:18:18.208136 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"db4a7a60b0336fb0e1a046f59f9d60cc55a056b70959c7e6a33b6b15b879bd76"} Oct 09 15:18:18 crc kubenswrapper[4719]: I1009 15:18:18.208198 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ecd0662699e43951e6e139dbe8bb44c36a0120144c90a7f21010cbf68a2abcf7"} Oct 09 15:18:18 crc kubenswrapper[4719]: I1009 15:18:18.208213 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7137edca40a10e85d3116f62b5dbe6ffea35d9473164173af2dea55f1794397c"} Oct 09 15:18:18 crc kubenswrapper[4719]: I1009 15:18:18.208335 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:18 crc kubenswrapper[4719]: I1009 15:18:18.209383 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:18 crc kubenswrapper[4719]: I1009 15:18:18.209418 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:18 crc kubenswrapper[4719]: I1009 15:18:18.209428 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:18 crc kubenswrapper[4719]: I1009 15:18:18.215906 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8544f7060b0b2c2885dcbdffbd744be5f028d8df543732ba79eb7cd3911afca6"} Oct 09 15:18:18 crc kubenswrapper[4719]: I1009 15:18:18.215950 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fc62bf1b49b2a4b402b2fcca31f9fe1663b36f463a0722a5876b2ca2a8e023ca"} Oct 09 15:18:18 crc kubenswrapper[4719]: I1009 15:18:18.215966 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"51b618cce898bc89b4b07b6f7fd73567d719ad9c9dc3a2a3959074bc2c2fe11a"} Oct 09 15:18:18 crc kubenswrapper[4719]: I1009 15:18:18.215979 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"01dc78fd80a15fa8151128108a351c6af42928695fdd745dea50e08fae6570ac"} Oct 09 15:18:18 crc kubenswrapper[4719]: I1009 15:18:18.219395 4719 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9" exitCode=0 Oct 09 15:18:18 crc kubenswrapper[4719]: I1009 15:18:18.219442 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9"} Oct 09 15:18:18 crc kubenswrapper[4719]: I1009 15:18:18.219642 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:18 crc kubenswrapper[4719]: I1009 15:18:18.219677 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:18 crc kubenswrapper[4719]: I1009 15:18:18.221058 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:18 crc kubenswrapper[4719]: I1009 15:18:18.221068 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:18 crc kubenswrapper[4719]: I1009 15:18:18.221095 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:18 crc kubenswrapper[4719]: I1009 15:18:18.221099 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 09 15:18:18 crc kubenswrapper[4719]: I1009 15:18:18.221107 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:18 crc kubenswrapper[4719]: I1009 15:18:18.221116 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:18 crc kubenswrapper[4719]: W1009 15:18:18.242425 4719 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.59:6443: connect: connection refused Oct 09 15:18:18 crc kubenswrapper[4719]: E1009 15:18:18.242502 4719 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.59:6443: connect: connection refused" logger="UnhandledError" Oct 09 15:18:18 crc kubenswrapper[4719]: I1009 15:18:18.346010 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:18 crc kubenswrapper[4719]: I1009 15:18:18.347696 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:18 crc kubenswrapper[4719]: I1009 15:18:18.347743 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:18 crc kubenswrapper[4719]: I1009 15:18:18.347752 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:18 crc kubenswrapper[4719]: I1009 15:18:18.347781 4719 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 09 15:18:18 crc kubenswrapper[4719]: E1009 15:18:18.348374 4719 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.59:6443: connect: connection refused" node="crc" Oct 09 15:18:18 crc kubenswrapper[4719]: W1009 15:18:18.678035 4719 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.59:6443: connect: connection refused Oct 09 15:18:18 crc kubenswrapper[4719]: E1009 15:18:18.678197 4719 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.59:6443: connect: connection refused" logger="UnhandledError" Oct 09 15:18:19 crc kubenswrapper[4719]: I1009 15:18:19.227554 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6bb563b23f59ad2cd83e71016c9f1497905e250c15aaabbedafa95973a5646c0"} Oct 09 15:18:19 crc kubenswrapper[4719]: I1009 15:18:19.227711 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:19 crc kubenswrapper[4719]: I1009 15:18:19.228949 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:19 crc kubenswrapper[4719]: I1009 15:18:19.229018 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:19 crc kubenswrapper[4719]: I1009 15:18:19.229044 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:19 crc kubenswrapper[4719]: I1009 
15:18:19.232317 4719 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641" exitCode=0 Oct 09 15:18:19 crc kubenswrapper[4719]: I1009 15:18:19.232369 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641"} Oct 09 15:18:19 crc kubenswrapper[4719]: I1009 15:18:19.232514 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:19 crc kubenswrapper[4719]: I1009 15:18:19.233713 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:19 crc kubenswrapper[4719]: I1009 15:18:19.234688 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:19 crc kubenswrapper[4719]: I1009 15:18:19.234703 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:19 crc kubenswrapper[4719]: I1009 15:18:19.798744 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 15:18:19 crc kubenswrapper[4719]: I1009 15:18:19.798959 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:19 crc kubenswrapper[4719]: I1009 15:18:19.800087 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:19 crc kubenswrapper[4719]: I1009 15:18:19.800117 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:19 crc kubenswrapper[4719]: I1009 15:18:19.800127 4719 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:19 crc kubenswrapper[4719]: I1009 15:18:19.866728 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 15:18:20 crc kubenswrapper[4719]: I1009 15:18:20.244038 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76"} Oct 09 15:18:20 crc kubenswrapper[4719]: I1009 15:18:20.244107 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2"} Oct 09 15:18:20 crc kubenswrapper[4719]: I1009 15:18:20.244122 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac"} Oct 09 15:18:20 crc kubenswrapper[4719]: I1009 15:18:20.244131 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:20 crc kubenswrapper[4719]: I1009 15:18:20.244134 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209"} Oct 09 15:18:20 crc kubenswrapper[4719]: I1009 15:18:20.245016 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:20 crc kubenswrapper[4719]: I1009 15:18:20.245050 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:20 crc kubenswrapper[4719]: I1009 
15:18:20.245061 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:21 crc kubenswrapper[4719]: I1009 15:18:21.052676 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 09 15:18:21 crc kubenswrapper[4719]: I1009 15:18:21.053000 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:21 crc kubenswrapper[4719]: I1009 15:18:21.054678 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:21 crc kubenswrapper[4719]: I1009 15:18:21.054730 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:21 crc kubenswrapper[4719]: I1009 15:18:21.054746 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:21 crc kubenswrapper[4719]: I1009 15:18:21.253690 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e"} Oct 09 15:18:21 crc kubenswrapper[4719]: I1009 15:18:21.253796 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:21 crc kubenswrapper[4719]: I1009 15:18:21.253826 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:21 crc kubenswrapper[4719]: I1009 15:18:21.254786 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:21 crc kubenswrapper[4719]: I1009 15:18:21.254832 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:21 crc 
kubenswrapper[4719]: I1009 15:18:21.254848 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:21 crc kubenswrapper[4719]: I1009 15:18:21.254786 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:21 crc kubenswrapper[4719]: I1009 15:18:21.254910 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:21 crc kubenswrapper[4719]: I1009 15:18:21.254922 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:21 crc kubenswrapper[4719]: I1009 15:18:21.548602 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:21 crc kubenswrapper[4719]: I1009 15:18:21.550957 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:21 crc kubenswrapper[4719]: I1009 15:18:21.551014 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:21 crc kubenswrapper[4719]: I1009 15:18:21.551029 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:21 crc kubenswrapper[4719]: I1009 15:18:21.551074 4719 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 09 15:18:22 crc kubenswrapper[4719]: I1009 15:18:22.054463 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 15:18:22 crc kubenswrapper[4719]: I1009 15:18:22.212166 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 15:18:22 crc kubenswrapper[4719]: I1009 15:18:22.218919 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 15:18:22 crc kubenswrapper[4719]: I1009 15:18:22.219212 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:22 crc kubenswrapper[4719]: I1009 15:18:22.220910 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:22 crc kubenswrapper[4719]: I1009 15:18:22.220951 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:22 crc kubenswrapper[4719]: I1009 15:18:22.220964 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:22 crc kubenswrapper[4719]: I1009 15:18:22.256331 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:22 crc kubenswrapper[4719]: I1009 15:18:22.256465 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:22 crc kubenswrapper[4719]: I1009 15:18:22.258031 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:22 crc kubenswrapper[4719]: I1009 15:18:22.258081 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:22 crc kubenswrapper[4719]: I1009 15:18:22.258094 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:22 crc kubenswrapper[4719]: I1009 15:18:22.258169 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:22 crc kubenswrapper[4719]: I1009 15:18:22.258219 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:22 crc kubenswrapper[4719]: I1009 
15:18:22.258241 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:22 crc kubenswrapper[4719]: I1009 15:18:22.799222 4719 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 09 15:18:22 crc kubenswrapper[4719]: I1009 15:18:22.799318 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 09 15:18:23 crc kubenswrapper[4719]: I1009 15:18:23.260608 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:23 crc kubenswrapper[4719]: I1009 15:18:23.261750 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:23 crc kubenswrapper[4719]: I1009 15:18:23.262036 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:23 crc kubenswrapper[4719]: I1009 15:18:23.262133 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:24 crc kubenswrapper[4719]: I1009 15:18:24.509733 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 15:18:24 crc kubenswrapper[4719]: I1009 15:18:24.509906 4719 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Oct 09 15:18:24 crc kubenswrapper[4719]: I1009 15:18:24.511149 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:24 crc kubenswrapper[4719]: I1009 15:18:24.511192 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:24 crc kubenswrapper[4719]: I1009 15:18:24.511206 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:24 crc kubenswrapper[4719]: I1009 15:18:24.871005 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 09 15:18:24 crc kubenswrapper[4719]: I1009 15:18:24.871234 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:24 crc kubenswrapper[4719]: I1009 15:18:24.872620 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:24 crc kubenswrapper[4719]: I1009 15:18:24.872690 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:24 crc kubenswrapper[4719]: I1009 15:18:24.872705 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:25 crc kubenswrapper[4719]: E1009 15:18:25.241784 4719 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 09 15:18:25 crc kubenswrapper[4719]: I1009 15:18:25.985604 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 09 15:18:25 crc kubenswrapper[4719]: I1009 15:18:25.985768 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:25 crc kubenswrapper[4719]: I1009 15:18:25.986900 4719 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:25 crc kubenswrapper[4719]: I1009 15:18:25.986983 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:25 crc kubenswrapper[4719]: I1009 15:18:25.987007 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:26 crc kubenswrapper[4719]: I1009 15:18:26.016550 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 15:18:26 crc kubenswrapper[4719]: I1009 15:18:26.016693 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:26 crc kubenswrapper[4719]: I1009 15:18:26.017554 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:26 crc kubenswrapper[4719]: I1009 15:18:26.017705 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:26 crc kubenswrapper[4719]: I1009 15:18:26.017790 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:26 crc kubenswrapper[4719]: I1009 15:18:26.024083 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 15:18:26 crc kubenswrapper[4719]: I1009 15:18:26.270556 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:26 crc kubenswrapper[4719]: I1009 15:18:26.273123 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:26 crc kubenswrapper[4719]: I1009 15:18:26.273159 4719 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:26 crc kubenswrapper[4719]: I1009 15:18:26.273169 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:26 crc kubenswrapper[4719]: I1009 15:18:26.277127 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 15:18:27 crc kubenswrapper[4719]: I1009 15:18:27.271335 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:27 crc kubenswrapper[4719]: I1009 15:18:27.272309 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:27 crc kubenswrapper[4719]: I1009 15:18:27.272367 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:27 crc kubenswrapper[4719]: I1009 15:18:27.272378 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:28 crc kubenswrapper[4719]: W1009 15:18:28.911886 4719 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 09 15:18:28 crc kubenswrapper[4719]: I1009 15:18:28.911973 4719 trace.go:236] Trace[1891769126]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Oct-2025 15:18:18.910) (total time: 10001ms): Oct 09 15:18:28 crc kubenswrapper[4719]: Trace[1891769126]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (15:18:28.911) Oct 09 15:18:28 crc kubenswrapper[4719]: Trace[1891769126]: [10.001811205s] [10.001811205s] END Oct 09 15:18:28 crc 
kubenswrapper[4719]: E1009 15:18:28.911995 4719 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 09 15:18:29 crc kubenswrapper[4719]: I1009 15:18:29.109574 4719 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Oct 09 15:18:29 crc kubenswrapper[4719]: W1009 15:18:29.245410 4719 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 09 15:18:29 crc kubenswrapper[4719]: I1009 15:18:29.245517 4719 trace.go:236] Trace[383963276]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Oct-2025 15:18:19.243) (total time: 10001ms): Oct 09 15:18:29 crc kubenswrapper[4719]: Trace[383963276]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (15:18:29.245) Oct 09 15:18:29 crc kubenswrapper[4719]: Trace[383963276]: [10.001683601s] [10.001683601s] END Oct 09 15:18:29 crc kubenswrapper[4719]: E1009 15:18:29.245543 4719 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 09 15:18:29 crc kubenswrapper[4719]: I1009 15:18:29.277513 4719 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 09 15:18:29 crc kubenswrapper[4719]: I1009 15:18:29.279745 4719 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6bb563b23f59ad2cd83e71016c9f1497905e250c15aaabbedafa95973a5646c0" exitCode=255 Oct 09 15:18:29 crc kubenswrapper[4719]: I1009 15:18:29.279822 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6bb563b23f59ad2cd83e71016c9f1497905e250c15aaabbedafa95973a5646c0"} Oct 09 15:18:29 crc kubenswrapper[4719]: I1009 15:18:29.280050 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:29 crc kubenswrapper[4719]: I1009 15:18:29.281512 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:29 crc kubenswrapper[4719]: I1009 15:18:29.281559 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:29 crc kubenswrapper[4719]: I1009 15:18:29.281573 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:29 crc kubenswrapper[4719]: I1009 15:18:29.282389 4719 scope.go:117] "RemoveContainer" containerID="6bb563b23f59ad2cd83e71016c9f1497905e250c15aaabbedafa95973a5646c0" Oct 09 15:18:29 crc kubenswrapper[4719]: I1009 15:18:29.468479 4719 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path 
\"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 09 15:18:29 crc kubenswrapper[4719]: I1009 15:18:29.468571 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 09 15:18:29 crc kubenswrapper[4719]: I1009 15:18:29.472581 4719 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 09 15:18:29 crc kubenswrapper[4719]: I1009 15:18:29.472663 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 09 15:18:30 crc kubenswrapper[4719]: I1009 15:18:30.283134 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 09 15:18:30 crc kubenswrapper[4719]: I1009 15:18:30.284472 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30"} Oct 09 15:18:30 crc kubenswrapper[4719]: I1009 15:18:30.284608 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:30 crc kubenswrapper[4719]: I1009 15:18:30.285272 4719 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:30 crc kubenswrapper[4719]: I1009 15:18:30.285295 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:30 crc kubenswrapper[4719]: I1009 15:18:30.285304 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:32 crc kubenswrapper[4719]: I1009 15:18:32.220756 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 15:18:32 crc kubenswrapper[4719]: I1009 15:18:32.220928 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:32 crc kubenswrapper[4719]: I1009 15:18:32.221656 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 15:18:32 crc kubenswrapper[4719]: I1009 15:18:32.222311 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:32 crc kubenswrapper[4719]: I1009 15:18:32.222375 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:32 crc kubenswrapper[4719]: I1009 15:18:32.222392 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:32 crc kubenswrapper[4719]: I1009 15:18:32.228294 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 15:18:32 crc kubenswrapper[4719]: I1009 15:18:32.290230 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:32 crc kubenswrapper[4719]: I1009 15:18:32.291056 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:32 crc 
kubenswrapper[4719]: I1009 15:18:32.291088 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:32 crc kubenswrapper[4719]: I1009 15:18:32.291100 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:32 crc kubenswrapper[4719]: I1009 15:18:32.800373 4719 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 09 15:18:32 crc kubenswrapper[4719]: I1009 15:18:32.800439 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 09 15:18:33 crc kubenswrapper[4719]: I1009 15:18:33.169454 4719 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 09 15:18:33 crc kubenswrapper[4719]: I1009 15:18:33.291972 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:33 crc kubenswrapper[4719]: I1009 15:18:33.292734 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:33 crc kubenswrapper[4719]: I1009 15:18:33.292763 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:33 crc kubenswrapper[4719]: I1009 15:18:33.292774 4719 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:34 crc kubenswrapper[4719]: I1009 15:18:34.306493 4719 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 09 15:18:34 crc kubenswrapper[4719]: E1009 15:18:34.462568 4719 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 09 15:18:34 crc kubenswrapper[4719]: E1009 15:18:34.465506 4719 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 09 15:18:34 crc kubenswrapper[4719]: I1009 15:18:34.466058 4719 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 09 15:18:34 crc kubenswrapper[4719]: I1009 15:18:34.466187 4719 trace.go:236] Trace[1397320277]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Oct-2025 15:18:22.523) (total time: 11942ms): Oct 09 15:18:34 crc kubenswrapper[4719]: Trace[1397320277]: ---"Objects listed" error: 11942ms (15:18:34.466) Oct 09 15:18:34 crc kubenswrapper[4719]: Trace[1397320277]: [11.942414516s] [11.942414516s] END Oct 09 15:18:34 crc kubenswrapper[4719]: I1009 15:18:34.466217 4719 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 09 15:18:34 crc kubenswrapper[4719]: I1009 15:18:34.466370 4719 trace.go:236] Trace[854268629]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Oct-2025 15:18:22.640) (total time: 11825ms): Oct 09 15:18:34 crc kubenswrapper[4719]: Trace[854268629]: ---"Objects listed" error: 11825ms (15:18:34.466) Oct 09 15:18:34 crc kubenswrapper[4719]: Trace[854268629]: [11.825770422s] [11.825770422s] END Oct 09 15:18:34 crc kubenswrapper[4719]: I1009 15:18:34.466394 4719 
reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 09 15:18:34 crc kubenswrapper[4719]: I1009 15:18:34.892207 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 09 15:18:34 crc kubenswrapper[4719]: I1009 15:18:34.911886 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.104858 4719 apiserver.go:52] "Watching apiserver" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.106727 4719 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.107275 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-etcd/etcd-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.107714 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.107774 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:18:35 crc kubenswrapper[4719]: E1009 15:18:35.107837 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.107909 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.107986 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:18:35 crc kubenswrapper[4719]: E1009 15:18:35.108081 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:18:35 crc kubenswrapper[4719]: E1009 15:18:35.107983 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.108276 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.108433 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.109589 4719 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.110321 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.110386 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.111419 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.111444 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.111794 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.112504 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.112566 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.112718 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.112898 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.116933 4719 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-p9kwh"] Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.117262 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-sc5bv"] Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.117587 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.117827 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.121888 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zv8jk"] Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.127281 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.127406 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.127489 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.128920 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.132313 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.132422 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" 
Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.132556 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.132572 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.132587 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.132599 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.132774 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.132929 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-kmbvp"] Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.133105 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.133927 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-j5mdb"] Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.134101 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.134225 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-j5mdb" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.137178 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.137702 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.137921 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.138183 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.138416 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.138544 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.138618 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.138652 4719 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.138706 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.138660 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.139085 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.139284 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.148162 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.160911 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.170588 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.170627 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.170651 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.170674 4719 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.170694 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.170714 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.170738 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.170757 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.170775 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: 
\"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.170794 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.170813 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.170831 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.170849 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.170870 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: 
\"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.170887 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.170903 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.170921 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.170945 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.170963 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.170980 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.170996 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.170998 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.171015 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.171097 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.171178 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod 
\"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.171216 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.171240 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.171260 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.171269 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.171279 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.171381 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.171404 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.171500 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.171527 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.171548 4719 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.171571 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.171589 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.171606 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.171595 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.171646 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.171665 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.171680 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.171696 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.171711 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.171727 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.171745 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.171783 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.171799 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.171817 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.171840 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.171870 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.171907 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.171923 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.171939 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.171954 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.171969 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.171945 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.171984 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172001 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172017 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172043 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172058 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172092 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172114 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172137 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172159 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172176 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172192 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172210 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172226 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172241 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172258 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172274 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172288 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172303 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172371 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172395 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" 
(UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172415 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172433 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172450 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172466 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172483 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 09 15:18:35 crc 
kubenswrapper[4719]: I1009 15:18:35.172502 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172519 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172535 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172552 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172568 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172587 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: 
\"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172602 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172618 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172633 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172647 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172664 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172679 4719 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172696 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172712 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172727 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172742 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172780 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172795 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172810 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172825 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172841 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172857 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172872 4719 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172890 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172908 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172929 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172950 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172974 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: 
\"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172998 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173018 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173034 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173051 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173067 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173083 4719 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173100 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173115 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173130 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173146 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173162 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: 
\"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173177 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173195 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173211 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173228 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173244 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173262 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173278 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173294 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173310 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173327 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173367 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173390 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173407 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173424 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173438 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173454 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173470 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: 
\"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173488 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173504 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173522 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173539 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173559 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 09 15:18:35 crc kubenswrapper[4719]: 
I1009 15:18:35.173581 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173597 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173614 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173633 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173650 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173667 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: 
\"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173684 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173701 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173724 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173749 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173768 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 
15:18:35.173785 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173802 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173818 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173834 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173856 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173873 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod 
\"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173896 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173913 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173931 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173950 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173966 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173982 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173999 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.174016 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.174032 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.174048 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.174066 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 09 
15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.174083 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.174100 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.174117 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.174138 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.174157 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.174174 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.174190 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.174209 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.174225 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.174244 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.174261 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 09 15:18:35 crc 
kubenswrapper[4719]: I1009 15:18:35.174304 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.174324 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.174340 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.174379 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172128 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172416 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172479 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172568 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172837 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172948 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.172987 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173180 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173493 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.188418 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173637 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173838 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.173952 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.174073 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.174324 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.174407 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.174912 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.175380 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.175382 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.175599 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.175708 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.175777 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.175980 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.176091 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.176276 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.176297 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.176384 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.176538 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.176542 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.176674 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.176724 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.176735 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.176862 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.176998 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.177093 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.177263 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.177348 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.178112 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.178168 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: E1009 15:18:35.178316 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:18:35.677442886 +0000 UTC m=+21.187154171 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.191136 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name
\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387
3ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272
e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.178764 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.178916 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.178956 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.179131 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.179149 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.179289 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.179410 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.179532 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.179890 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.180031 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.194175 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.194300 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.194339 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.194375 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.194397 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.180130 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.194420 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.194442 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.194464 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.194482 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.194502 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: 
\"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.194523 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.194544 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.194565 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.194619 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.194645 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.194674 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.194701 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.194721 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.194777 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.194802 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" 
(UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.194820 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.194842 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.194860 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.194879 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 15:18:35 crc 
kubenswrapper[4719]: I1009 15:18:35.194899 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.194919 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.194940 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195051 4719 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195065 4719 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195076 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: 
\"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195088 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195099 4719 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195109 4719 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195120 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195130 4719 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195141 4719 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195151 4719 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 09 
15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195163 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195173 4719 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195186 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195195 4719 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195205 4719 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195215 4719 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195227 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195238 4719 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195248 4719 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195258 4719 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195268 4719 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195279 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195291 4719 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195303 4719 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195330 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: 
\"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195342 4719 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195366 4719 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195378 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195389 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195399 4719 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195410 4719 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195421 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node 
\"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195431 4719 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195441 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195453 4719 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195465 4719 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195476 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195488 4719 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195501 4719 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195512 4719 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195524 4719 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195533 4719 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195543 4719 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195552 4719 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195631 4719 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195717 4719 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195744 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: 
\"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195764 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195836 4719 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195890 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195919 4719 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195973 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.180213 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.180042 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.180279 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.180366 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.180386 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.180447 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.180677 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.180891 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.180945 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.181030 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.181229 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.181239 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.181261 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.181277 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.181340 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.183804 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.183939 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.184075 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.184281 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.184326 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.184370 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.184443 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.184608 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.184643 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.184664 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.184680 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.185073 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.185321 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.185438 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.185452 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.185562 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.185555 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.185628 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.185648 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.196539 4719 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.197073 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.185764 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.185963 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.186186 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.186287 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.186859 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.186889 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.187332 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.187459 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.188200 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.188955 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.188961 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.189006 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.189002 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.189426 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.189513 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.190241 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.190446 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.190468 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.190494 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.190525 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.190580 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.190602 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.190597 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.190605 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.189266 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.191029 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.191148 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.191237 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.191249 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.191363 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.191478 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.191525 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.189517 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.191576 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.191587 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.191891 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.191934 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.192025 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.192049 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.192103 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.192159 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.192162 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.192258 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.192468 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.192508 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.192561 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.192565 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.192591 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.192612 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.192620 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.192696 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.192820 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.193653 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.193681 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.193775 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.193962 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.193986 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.193981 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.194128 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.194482 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195681 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195709 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195843 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.195966 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.196161 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.196165 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.196542 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.196877 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.196895 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: E1009 15:18:35.196984 4719 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 15:18:35 crc kubenswrapper[4719]: E1009 15:18:35.197216 4719 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.197334 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.197887 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.198114 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.198121 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.198129 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.197720 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.198474 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.198684 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.199035 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.199211 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.200299 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.200492 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.200598 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: E1009 15:18:35.201071 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 15:18:35.701047101 +0000 UTC m=+21.210758386 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 15:18:35 crc kubenswrapper[4719]: E1009 15:18:35.201506 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-09 15:18:35.701482945 +0000 UTC m=+21.211194300 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.198546 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.201838 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.202056 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.202070 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.202522 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.204424 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.206557 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.209112 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.210110 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.212703 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.213238 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.213324 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.213754 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.213840 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.214533 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.214752 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: E1009 15:18:35.215987 4719 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 15:18:35 crc kubenswrapper[4719]: E1009 15:18:35.216017 4719 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 15:18:35 crc kubenswrapper[4719]: E1009 15:18:35.216034 4719 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 15:18:35 crc kubenswrapper[4719]: E1009 15:18:35.216246 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-09 15:18:35.716152021 +0000 UTC m=+21.225863306 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 15:18:35 crc kubenswrapper[4719]: E1009 15:18:35.218785 4719 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 15:18:35 crc kubenswrapper[4719]: E1009 15:18:35.218814 4719 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 15:18:35 crc kubenswrapper[4719]: E1009 15:18:35.218827 4719 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 15:18:35 crc kubenswrapper[4719]: E1009 15:18:35.218883 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-09 15:18:35.718870739 +0000 UTC m=+21.228582024 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.226724 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.228287 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\
\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.234741 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.234824 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.235009 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.235025 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.234762 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.235326 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.235842 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.236129 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.236818 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.240274 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.242395 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.245571 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.246401 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.248086 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.248248 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.248290 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.253281 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.261559 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.262147 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.270818 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.285521 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.296921 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-multus-cni-dir\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.297068 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-host-var-lib-cni-multus\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.297140 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5w9w\" (UniqueName: \"kubernetes.io/projected/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-kube-api-access-h5w9w\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.297191 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.297242 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-log-socket\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.297281 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-cni-bin\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.297310 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fea6a48c-769c-41bf-95ce-649cc31eb4e5-ovnkube-script-lib\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.297342 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-run-ovn\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.297386 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/99353559-5b0b-4a9e-b759-0321ef3a8a71-rootfs\") pod \"machine-config-daemon-p9kwh\" (UID: \"99353559-5b0b-4a9e-b759-0321ef3a8a71\") " pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.297376 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.297408 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-host-run-k8s-cni-cncf-io\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.297431 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk5rs\" (UniqueName: \"kubernetes.io/projected/e7db0861-5252-4efa-9464-e64b6d069d8e-kube-api-access-lk5rs\") pod \"node-resolver-j5mdb\" (UID: \"e7db0861-5252-4efa-9464-e64b6d069d8e\") " pod="openshift-dns/node-resolver-j5mdb" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.297480 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/09d0ca53-1333-4d50-948a-81d97d3182f6-os-release\") pod \"multus-additional-cni-plugins-sc5bv\" (UID: \"09d0ca53-1333-4d50-948a-81d97d3182f6\") " pod="openshift-multus/multus-additional-cni-plugins-sc5bv" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.297505 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09d0ca53-1333-4d50-948a-81d97d3182f6-system-cni-dir\") pod \"multus-additional-cni-plugins-sc5bv\" (UID: \"09d0ca53-1333-4d50-948a-81d97d3182f6\") " pod="openshift-multus/multus-additional-cni-plugins-sc5bv" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.297554 4719 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99353559-5b0b-4a9e-b759-0321ef3a8a71-proxy-tls\") pod \"machine-config-daemon-p9kwh\" (UID: \"99353559-5b0b-4a9e-b759-0321ef3a8a71\") " pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.297576 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-multus-conf-dir\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.297617 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-run-ovn-kubernetes\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.297647 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.297663 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-node-log\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.297714 4719 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-slash\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.297727 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.297731 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-cni-binary-copy\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.297776 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-cni-netd\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.297801 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/09d0ca53-1333-4d50-948a-81d97d3182f6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sc5bv\" (UID: \"09d0ca53-1333-4d50-948a-81d97d3182f6\") " pod="openshift-multus/multus-additional-cni-plugins-sc5bv" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.297843 4719 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-system-cni-dir\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.297864 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-host-var-lib-cni-bin\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.297887 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-host-run-multus-certs\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.297932 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-run-netns\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.297954 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-cnibin\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.297976 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-host-run-netns\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.297998 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e7db0861-5252-4efa-9464-e64b6d069d8e-hosts-file\") pod \"node-resolver-j5mdb\" (UID: \"e7db0861-5252-4efa-9464-e64b6d069d8e\") " pod="openshift-dns/node-resolver-j5mdb" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.298034 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-etc-openvswitch\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.298056 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.298100 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4klsz\" (UniqueName: \"kubernetes.io/projected/99353559-5b0b-4a9e-b759-0321ef3a8a71-kube-api-access-4klsz\") pod \"machine-config-daemon-p9kwh\" (UID: \"99353559-5b0b-4a9e-b759-0321ef3a8a71\") " pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.298125 4719 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-run-openvswitch\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.298145 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-os-release\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.298166 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-hostroot\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.298186 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fea6a48c-769c-41bf-95ce-649cc31eb4e5-ovn-node-metrics-cert\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.298221 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-kubelet\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.298241 4719 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-run-systemd\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.298260 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fea6a48c-769c-41bf-95ce-649cc31eb4e5-ovnkube-config\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.298318 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-multus-socket-dir-parent\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.298341 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-multus-daemon-config\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.298380 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/09d0ca53-1333-4d50-948a-81d97d3182f6-cnibin\") pod \"multus-additional-cni-plugins-sc5bv\" (UID: \"09d0ca53-1333-4d50-948a-81d97d3182f6\") " pod="openshift-multus/multus-additional-cni-plugins-sc5bv" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.298416 4719 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-var-lib-openvswitch\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.298443 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lszx6\" (UniqueName: \"kubernetes.io/projected/09d0ca53-1333-4d50-948a-81d97d3182f6-kube-api-access-lszx6\") pod \"multus-additional-cni-plugins-sc5bv\" (UID: \"09d0ca53-1333-4d50-948a-81d97d3182f6\") " pod="openshift-multus/multus-additional-cni-plugins-sc5bv" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.298472 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/09d0ca53-1333-4d50-948a-81d97d3182f6-cni-binary-copy\") pod \"multus-additional-cni-plugins-sc5bv\" (UID: \"09d0ca53-1333-4d50-948a-81d97d3182f6\") " pod="openshift-multus/multus-additional-cni-plugins-sc5bv" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.298493 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/09d0ca53-1333-4d50-948a-81d97d3182f6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sc5bv\" (UID: \"09d0ca53-1333-4d50-948a-81d97d3182f6\") " pod="openshift-multus/multus-additional-cni-plugins-sc5bv" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.298514 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84cf8\" (UniqueName: \"kubernetes.io/projected/fea6a48c-769c-41bf-95ce-649cc31eb4e5-kube-api-access-84cf8\") pod \"ovnkube-node-zv8jk\" (UID: 
\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.298535 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/99353559-5b0b-4a9e-b759-0321ef3a8a71-mcd-auth-proxy-config\") pod \"machine-config-daemon-p9kwh\" (UID: \"99353559-5b0b-4a9e-b759-0321ef3a8a71\") " pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.298557 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-systemd-units\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.298589 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-host-var-lib-kubelet\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.298609 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-etc-kubernetes\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.298634 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fea6a48c-769c-41bf-95ce-649cc31eb4e5-env-overrides\") pod 
\"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.298975 4719 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.299001 4719 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.299022 4719 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.299042 4719 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.299117 4719 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.299172 4719 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.299191 4719 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.299204 4719 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.299217 4719 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.299228 4719 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.299241 4719 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.299253 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.299265 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.299276 4719 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" 
DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.299287 4719 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.299299 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.299312 4719 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.299323 4719 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.299253 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.299334 4719 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.299821 4719 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.299841 4719 reconciler_common.go:293] "Volume detached for volume 
\"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.299871 4719 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.299882 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.299893 4719 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.299904 4719 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.299915 4719 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300021 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300031 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: 
\"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300043 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300077 4719 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300091 4719 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300103 4719 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300113 4719 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300124 4719 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300132 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node 
\"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300153 4719 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300162 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300131 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300171 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300248 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300262 4719 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300274 4719 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300286 4719 reconciler_common.go:293] "Volume detached for 
volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300297 4719 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300329 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300342 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300392 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300401 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300410 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300421 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300432 4719 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300469 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300480 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300491 4719 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300502 4719 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300512 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300546 4719 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300559 4719 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300577 4719 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300587 4719 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300598 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300635 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300655 4719 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300668 4719 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc 
kubenswrapper[4719]: I1009 15:18:35.300680 4719 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300715 4719 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300727 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300739 4719 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300775 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300785 4719 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300796 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300809 
4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300820 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300851 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300863 4719 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300874 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300883 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300892 4719 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300901 4719 reconciler_common.go:293] "Volume detached for volume 
\"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300927 4719 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300937 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300946 4719 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300954 4719 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300963 4719 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.300972 4719 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301019 4719 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301031 4719 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301043 4719 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301054 4719 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301068 4719 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301098 4719 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301109 4719 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301117 4719 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301127 4719 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301135 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301144 4719 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301170 4719 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301180 4719 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301191 4719 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301202 4719 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301214 4719 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301225 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301257 4719 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301266 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301274 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301282 4719 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301290 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: 
\"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301299 4719 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301307 4719 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301338 4719 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301379 4719 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301405 4719 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301419 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301429 4719 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" 
DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301440 4719 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301478 4719 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301489 4719 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301497 4719 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301508 4719 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301519 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301529 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 
15:18:35.301564 4719 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301575 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301587 4719 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301598 4719 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301609 4719 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301644 4719 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301653 4719 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301661 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: 
\"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301670 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301678 4719 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301687 4719 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301695 4719 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301730 4719 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301743 4719 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301754 4719 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc 
kubenswrapper[4719]: I1009 15:18:35.301764 4719 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301775 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301783 4719 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301791 4719 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301799 4719 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301807 4719 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301815 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.301882 4719 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628
a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.304232 4719 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30" exitCode=255 Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.304309 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30"} Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.304380 4719 scope.go:117] "RemoveContainer" containerID="6bb563b23f59ad2cd83e71016c9f1497905e250c15aaabbedafa95973a5646c0" Oct 09 15:18:35 crc kubenswrapper[4719]: E1009 15:18:35.315124 4719 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.318074 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.330928 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.339331 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.347432 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.368079 4719 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.368253 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.376114 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.385542 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.395491 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.402383 4719 scope.go:117] "RemoveContainer" containerID="5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30" Oct 09 15:18:35 crc kubenswrapper[4719]: E1009 15:18:35.402610 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.402760 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.403151 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.403159 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/09d0ca53-1333-4d50-948a-81d97d3182f6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sc5bv\" (UID: \"09d0ca53-1333-4d50-948a-81d97d3182f6\") " pod="openshift-multus/multus-additional-cni-plugins-sc5bv" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.403300 4719 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-cni-netd\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.403332 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-host-var-lib-cni-bin\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.403381 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-host-run-multus-certs\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.403431 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-run-netns\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.403452 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-system-cni-dir\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.403474 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-host-run-netns\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.403469 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-cni-netd\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.403510 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-host-run-multus-certs\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.403492 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e7db0861-5252-4efa-9464-e64b6d069d8e-hosts-file\") pod \"node-resolver-j5mdb\" (UID: \"e7db0861-5252-4efa-9464-e64b6d069d8e\") " pod="openshift-dns/node-resolver-j5mdb" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.403482 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-run-netns\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.403450 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-host-var-lib-cni-bin\") pod \"multus-kmbvp\" (UID: 
\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.403562 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e7db0861-5252-4efa-9464-e64b6d069d8e-hosts-file\") pod \"node-resolver-j5mdb\" (UID: \"e7db0861-5252-4efa-9464-e64b6d069d8e\") " pod="openshift-dns/node-resolver-j5mdb" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.403567 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-system-cni-dir\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.403604 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-etc-openvswitch\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.403647 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-host-run-netns\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.403698 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-etc-openvswitch\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.403714 4719 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.403791 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4klsz\" (UniqueName: \"kubernetes.io/projected/99353559-5b0b-4a9e-b759-0321ef3a8a71-kube-api-access-4klsz\") pod \"machine-config-daemon-p9kwh\" (UID: \"99353559-5b0b-4a9e-b759-0321ef3a8a71\") " pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.403801 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.403845 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-cnibin\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.403913 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-cnibin\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.403917 4719 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/09d0ca53-1333-4d50-948a-81d97d3182f6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sc5bv\" (UID: \"09d0ca53-1333-4d50-948a-81d97d3182f6\") " pod="openshift-multus/multus-additional-cni-plugins-sc5bv" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.403942 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-run-openvswitch\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.403978 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-run-openvswitch\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.403998 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-os-release\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404026 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-hostroot\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404047 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-os-release\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404055 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-kubelet\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404078 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-run-systemd\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404081 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-kubelet\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404096 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fea6a48c-769c-41bf-95ce-649cc31eb4e5-ovn-node-metrics-cert\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404091 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-hostroot\") pod \"multus-kmbvp\" (UID: 
\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404118 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-multus-socket-dir-parent\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404121 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-run-systemd\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404136 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-multus-daemon-config\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404156 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/09d0ca53-1333-4d50-948a-81d97d3182f6-cnibin\") pod \"multus-additional-cni-plugins-sc5bv\" (UID: \"09d0ca53-1333-4d50-948a-81d97d3182f6\") " pod="openshift-multus/multus-additional-cni-plugins-sc5bv" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404176 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-var-lib-openvswitch\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" 
Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404198 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fea6a48c-769c-41bf-95ce-649cc31eb4e5-ovnkube-config\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404214 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lszx6\" (UniqueName: \"kubernetes.io/projected/09d0ca53-1333-4d50-948a-81d97d3182f6-kube-api-access-lszx6\") pod \"multus-additional-cni-plugins-sc5bv\" (UID: \"09d0ca53-1333-4d50-948a-81d97d3182f6\") " pod="openshift-multus/multus-additional-cni-plugins-sc5bv" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404235 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/09d0ca53-1333-4d50-948a-81d97d3182f6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sc5bv\" (UID: \"09d0ca53-1333-4d50-948a-81d97d3182f6\") " pod="openshift-multus/multus-additional-cni-plugins-sc5bv" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404253 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84cf8\" (UniqueName: \"kubernetes.io/projected/fea6a48c-769c-41bf-95ce-649cc31eb4e5-kube-api-access-84cf8\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404272 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/99353559-5b0b-4a9e-b759-0321ef3a8a71-mcd-auth-proxy-config\") pod \"machine-config-daemon-p9kwh\" (UID: \"99353559-5b0b-4a9e-b759-0321ef3a8a71\") " 
pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404290 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/09d0ca53-1333-4d50-948a-81d97d3182f6-cni-binary-copy\") pod \"multus-additional-cni-plugins-sc5bv\" (UID: \"09d0ca53-1333-4d50-948a-81d97d3182f6\") " pod="openshift-multus/multus-additional-cni-plugins-sc5bv" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404305 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-systemd-units\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404323 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-etc-kubernetes\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404338 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fea6a48c-769c-41bf-95ce-649cc31eb4e5-env-overrides\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404371 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-host-var-lib-kubelet\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 
crc kubenswrapper[4719]: I1009 15:18:35.404388 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-multus-cni-dir\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404406 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-host-var-lib-cni-multus\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404425 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5w9w\" (UniqueName: \"kubernetes.io/projected/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-kube-api-access-h5w9w\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404441 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-log-socket\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404459 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-cni-bin\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404483 4719 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fea6a48c-769c-41bf-95ce-649cc31eb4e5-ovnkube-script-lib\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404501 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-run-ovn\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404517 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/99353559-5b0b-4a9e-b759-0321ef3a8a71-rootfs\") pod \"machine-config-daemon-p9kwh\" (UID: \"99353559-5b0b-4a9e-b759-0321ef3a8a71\") " pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404533 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-host-run-k8s-cni-cncf-io\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404553 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk5rs\" (UniqueName: \"kubernetes.io/projected/e7db0861-5252-4efa-9464-e64b6d069d8e-kube-api-access-lk5rs\") pod \"node-resolver-j5mdb\" (UID: \"e7db0861-5252-4efa-9464-e64b6d069d8e\") " pod="openshift-dns/node-resolver-j5mdb" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404569 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/09d0ca53-1333-4d50-948a-81d97d3182f6-os-release\") pod \"multus-additional-cni-plugins-sc5bv\" (UID: \"09d0ca53-1333-4d50-948a-81d97d3182f6\") " pod="openshift-multus/multus-additional-cni-plugins-sc5bv" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404590 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99353559-5b0b-4a9e-b759-0321ef3a8a71-proxy-tls\") pod \"machine-config-daemon-p9kwh\" (UID: \"99353559-5b0b-4a9e-b759-0321ef3a8a71\") " pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404607 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-multus-conf-dir\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404624 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09d0ca53-1333-4d50-948a-81d97d3182f6-system-cni-dir\") pod \"multus-additional-cni-plugins-sc5bv\" (UID: \"09d0ca53-1333-4d50-948a-81d97d3182f6\") " pod="openshift-multus/multus-additional-cni-plugins-sc5bv" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404640 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-node-log\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404657 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-run-ovn-kubernetes\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404673 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-slash\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404688 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-cni-binary-copy\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404784 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-multus-daemon-config\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404824 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-host-var-lib-cni-multus\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.405032 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/09d0ca53-1333-4d50-948a-81d97d3182f6-cnibin\") pod \"multus-additional-cni-plugins-sc5bv\" (UID: 
\"09d0ca53-1333-4d50-948a-81d97d3182f6\") " pod="openshift-multus/multus-additional-cni-plugins-sc5bv" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.405081 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-var-lib-openvswitch\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.404176 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-multus-socket-dir-parent\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.405127 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09d0ca53-1333-4d50-948a-81d97d3182f6-system-cni-dir\") pod \"multus-additional-cni-plugins-sc5bv\" (UID: \"09d0ca53-1333-4d50-948a-81d97d3182f6\") " pod="openshift-multus/multus-additional-cni-plugins-sc5bv" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.405258 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-log-socket\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.405287 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/09d0ca53-1333-4d50-948a-81d97d3182f6-os-release\") pod \"multus-additional-cni-plugins-sc5bv\" (UID: \"09d0ca53-1333-4d50-948a-81d97d3182f6\") " 
pod="openshift-multus/multus-additional-cni-plugins-sc5bv" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.405293 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-cni-bin\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.405421 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/99353559-5b0b-4a9e-b759-0321ef3a8a71-rootfs\") pod \"machine-config-daemon-p9kwh\" (UID: \"99353559-5b0b-4a9e-b759-0321ef3a8a71\") " pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.405467 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-run-ovn\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.405484 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-cni-binary-copy\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.405498 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-run-ovn-kubernetes\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.405530 4719 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-node-log\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.405550 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-slash\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.405559 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-etc-kubernetes\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.405638 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/99353559-5b0b-4a9e-b759-0321ef3a8a71-mcd-auth-proxy-config\") pod \"machine-config-daemon-p9kwh\" (UID: \"99353559-5b0b-4a9e-b759-0321ef3a8a71\") " pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.405833 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-multus-conf-dir\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.405919 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/fea6a48c-769c-41bf-95ce-649cc31eb4e5-ovnkube-config\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.405974 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-host-var-lib-kubelet\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.405996 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-systemd-units\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.406035 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-host-run-k8s-cni-cncf-io\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.406129 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/09d0ca53-1333-4d50-948a-81d97d3182f6-cni-binary-copy\") pod \"multus-additional-cni-plugins-sc5bv\" (UID: \"09d0ca53-1333-4d50-948a-81d97d3182f6\") " pod="openshift-multus/multus-additional-cni-plugins-sc5bv" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.406149 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-multus-cni-dir\") pod \"multus-kmbvp\" 
(UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.406483 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/09d0ca53-1333-4d50-948a-81d97d3182f6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sc5bv\" (UID: \"09d0ca53-1333-4d50-948a-81d97d3182f6\") " pod="openshift-multus/multus-additional-cni-plugins-sc5bv" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.406589 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fea6a48c-769c-41bf-95ce-649cc31eb4e5-ovnkube-script-lib\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.406845 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fea6a48c-769c-41bf-95ce-649cc31eb4e5-env-overrides\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.407985 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99353559-5b0b-4a9e-b759-0321ef3a8a71-proxy-tls\") pod \"machine-config-daemon-p9kwh\" (UID: \"99353559-5b0b-4a9e-b759-0321ef3a8a71\") " pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.409246 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fea6a48c-769c-41bf-95ce-649cc31eb4e5-ovn-node-metrics-cert\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.419219 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.420715 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.422109 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lszx6\" (UniqueName: \"kubernetes.io/projected/09d0ca53-1333-4d50-948a-81d97d3182f6-kube-api-access-lszx6\") pod \"multus-additional-cni-plugins-sc5bv\" (UID: \"09d0ca53-1333-4d50-948a-81d97d3182f6\") " pod="openshift-multus/multus-additional-cni-plugins-sc5bv" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.422774 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk5rs\" (UniqueName: \"kubernetes.io/projected/e7db0861-5252-4efa-9464-e64b6d069d8e-kube-api-access-lk5rs\") pod \"node-resolver-j5mdb\" (UID: \"e7db0861-5252-4efa-9464-e64b6d069d8e\") " pod="openshift-dns/node-resolver-j5mdb" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.424086 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4klsz\" (UniqueName: \"kubernetes.io/projected/99353559-5b0b-4a9e-b759-0321ef3a8a71-kube-api-access-4klsz\") pod \"machine-config-daemon-p9kwh\" (UID: \"99353559-5b0b-4a9e-b759-0321ef3a8a71\") " pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.424517 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84cf8\" (UniqueName: \"kubernetes.io/projected/fea6a48c-769c-41bf-95ce-649cc31eb4e5-kube-api-access-84cf8\") pod \"ovnkube-node-zv8jk\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.429553 4719 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.429960 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5w9w\" (UniqueName: \"kubernetes.io/projected/6a7f4c67-0335-4c58-896a-b3059d9a9a3f-kube-api-access-h5w9w\") pod \"multus-kmbvp\" (UID: \"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\") " pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: W1009 15:18:35.431417 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-00edb9a622f3305a1985973d6fcea71223839bbf6ad0e4f3ac804461d0577392 WatchSource:0}: Error finding container 00edb9a622f3305a1985973d6fcea71223839bbf6ad0e4f3ac804461d0577392: Status 404 returned error can't find the container with id 00edb9a622f3305a1985973d6fcea71223839bbf6ad0e4f3ac804461d0577392 Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.433799 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.440492 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.443087 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.450610 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.454937 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.459066 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.463752 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.464210 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.469680 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-kmbvp" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.475132 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.477528 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-j5mdb" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.489592 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: W1009 15:18:35.494453 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09d0ca53_1333_4d50_948a_81d97d3182f6.slice/crio-a2ae2605ecec87c6b67d405d385c9b857dc59fc9fccda0e3f3fc03c3347b831a WatchSource:0}: Error finding container a2ae2605ecec87c6b67d405d385c9b857dc59fc9fccda0e3f3fc03c3347b831a: Status 404 returned error can't find the container with id a2ae2605ecec87c6b67d405d385c9b857dc59fc9fccda0e3f3fc03c3347b831a Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.498547 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.512961 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.541542 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: W1009 15:18:35.557406 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7db0861_5252_4efa_9464_e64b6d069d8e.slice/crio-2b89bb3151de8d0479355c0b357d862b2ce6fde1b39709ba2de0ab9b31b9015b WatchSource:0}: Error finding container 2b89bb3151de8d0479355c0b357d862b2ce6fde1b39709ba2de0ab9b31b9015b: Status 404 returned error can't find the container with id 2b89bb3151de8d0479355c0b357d862b2ce6fde1b39709ba2de0ab9b31b9015b Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.571624 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.613714 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.632565 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.643641 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.676281 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40166218-2855-45ef-b0e1-0fed4e3e2fde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01dc78fd80a15fa8151128108a351c6af42928695fdd745dea50e08fae6570ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc62bf1b49b2a4b402b2fcca31f9fe1663b36f463a0722a5876b2ca2a8e023ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51b618cce898bc89b4b07b6f7fd73567d719ad9c9dc3a2a3959074bc2c2fe11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6bb563b23f59ad2cd83e71016c9f1497905e250c15aaabbedafa95973a5646c0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:28Z\\\",\\\"message\\\":\\\"W1009 15:18:18.334783 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1009 15:18:18.335163 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760023098 cert, and key in /tmp/serving-cert-3158143578/serving-signer.crt, /tmp/serving-cert-3158143578/serving-signer.key\\\\nI1009 15:18:18.512278 1 observer_polling.go:159] Starting file observer\\\\nW1009 15:18:18.515605 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1009 15:18:18.515745 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:18.518399 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3158143578/tls.crt::/tmp/serving-cert-3158143578/tls.key\\\\\\\"\\\\nF1009 15:18:28.859833 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 15:18:34.791014 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 15:18:34.791138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:34.792247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2043225324/tls.crt::/tmp/serving-cert-2043225324/tls.key\\\\\\\"\\\\nI1009 15:18:35.029901 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 15:18:35.033427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 15:18:35.033448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 15:18:35.033473 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 15:18:35.033481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 15:18:35.045206 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 15:18:35.045257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045266 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045273 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 
15:18:35.045277 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 15:18:35.045280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 15:18:35.045285 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1009 15:18:35.045414 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1009 15:18:35.048459 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8544f7060b0b2c2885dcbdffbd744be5f028d8df543732ba79eb7cd3911afca6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd79
1fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.701526 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.706644 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.706757 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.706789 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:18:35 crc kubenswrapper[4719]: E1009 15:18:35.706855 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-09 15:18:36.706836689 +0000 UTC m=+22.216547974 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:18:35 crc kubenswrapper[4719]: E1009 15:18:35.706892 4719 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 15:18:35 crc kubenswrapper[4719]: E1009 15:18:35.706931 4719 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 15:18:35 crc kubenswrapper[4719]: E1009 15:18:35.706946 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 15:18:36.706929382 +0000 UTC m=+22.216640727 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 15:18:35 crc kubenswrapper[4719]: E1009 15:18:35.706962 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 15:18:36.706955843 +0000 UTC m=+22.216667128 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.723250 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.742477 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.759267 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.769524 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.807209 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:18:35 crc kubenswrapper[4719]: I1009 15:18:35.807253 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:18:35 crc kubenswrapper[4719]: E1009 15:18:35.807380 4719 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 15:18:35 crc kubenswrapper[4719]: E1009 15:18:35.807379 4719 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 15:18:35 crc kubenswrapper[4719]: E1009 15:18:35.807413 4719 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 15:18:35 crc kubenswrapper[4719]: E1009 15:18:35.807425 4719 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 15:18:35 crc kubenswrapper[4719]: E1009 15:18:35.807477 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-09 15:18:36.807461533 +0000 UTC m=+22.317172818 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 15:18:35 crc kubenswrapper[4719]: E1009 15:18:35.807394 4719 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 15:18:35 crc kubenswrapper[4719]: E1009 15:18:35.807745 4719 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 15:18:35 crc kubenswrapper[4719]: E1009 15:18:35.807772 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-09 15:18:36.807765373 +0000 UTC m=+22.317476658 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.308213 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"5c5861be17edd543ad7d5ce25d59c4781b5a7cd9cd6e861602104cbccd6ed91d"} Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.309476 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-j5mdb" event={"ID":"e7db0861-5252-4efa-9464-e64b6d069d8e","Type":"ContainerStarted","Data":"a62b8142b6b6fd0cf9028590f2abce788d8e381c2303d7a824dd055ab02b94db"} Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.309529 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-j5mdb" event={"ID":"e7db0861-5252-4efa-9464-e64b6d069d8e","Type":"ContainerStarted","Data":"2b89bb3151de8d0479355c0b357d862b2ce6fde1b39709ba2de0ab9b31b9015b"} Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.310579 4719 generic.go:334] "Generic (PLEG): container finished" podID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerID="7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f" exitCode=0 Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.310651 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" event={"ID":"fea6a48c-769c-41bf-95ce-649cc31eb4e5","Type":"ContainerDied","Data":"7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f"} Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.310704 
4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" event={"ID":"fea6a48c-769c-41bf-95ce-649cc31eb4e5","Type":"ContainerStarted","Data":"27dad12d3d4a004efdc84622336987da28adafc12291f6e1ae7aadd3b5a54473"} Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.312097 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kmbvp" event={"ID":"6a7f4c67-0335-4c58-896a-b3059d9a9a3f","Type":"ContainerStarted","Data":"11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783"} Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.312124 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kmbvp" event={"ID":"6a7f4c67-0335-4c58-896a-b3059d9a9a3f","Type":"ContainerStarted","Data":"8d90d58c0dadcef53d22b2e6cb4dadf3a5be4581d1324044df04f1b211b90db5"} Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.313634 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"6ab52b5e80f5f2de90ce76b34b21de83b3880ed13436c566f2c460bed1908576"} Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.313663 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"00edb9a622f3305a1985973d6fcea71223839bbf6ad0e4f3ac804461d0577392"} Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.315161 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerStarted","Data":"76ae19d921bad282d96efffc7f2f7cfdc4b70f95932e69b9955ad1439a936d5c"} Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.315182 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerStarted","Data":"b8b3908283c24f180df8f6a04d52c46e7252cdfd4f0587f7cccf3e9a0f37127a"} Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.315193 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerStarted","Data":"a4302c269999e1a2d2a34d2a0d9cdc00b75dfaf9d02b68ac5ca3870b8fe1d403"} Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.316761 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"fcba5218f1503f2b3776c66a92350381ee11aee043429d72c70b7ae63d7bb29f"} Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.316784 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"63d557f6902338a7aa577f2bbee6a159369d62be9724425a6e6a355f08586601"} Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.316793 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"bb2c7590e2515b79d6adf372ae69de431c50cbe5d8dad93d55aa2f696037f4ee"} Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.318131 4719 generic.go:334] "Generic (PLEG): container finished" podID="09d0ca53-1333-4d50-948a-81d97d3182f6" containerID="3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa" exitCode=0 Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.318178 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" 
event={"ID":"09d0ca53-1333-4d50-948a-81d97d3182f6","Type":"ContainerDied","Data":"3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa"} Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.318193 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" event={"ID":"09d0ca53-1333-4d50-948a-81d97d3182f6","Type":"ContainerStarted","Data":"a2ae2605ecec87c6b67d405d385c9b857dc59fc9fccda0e3f3fc03c3347b831a"} Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.320292 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.320900 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 09 15:18:36 crc kubenswrapper[4719]: E1009 15:18:36.331282 4719 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.331572 4719 scope.go:117] "RemoveContainer" containerID="5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30" Oct 09 15:18:36 crc kubenswrapper[4719]: E1009 15:18:36.331752 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.334938 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.344109 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.362610 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.385808 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:36Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.399330 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:36Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.413710 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:36Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.426918 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:36Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.437504 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a62b8142b6b6fd0cf9028590f2abce788d8e381c2303d7a824dd055ab02b94db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:36Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.454998 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40166218-2855-45ef-b0e1-0fed4e3e2fde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01dc78fd80a15fa8151128108a351c6af42928695fdd745dea50e08fae6570ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc62bf1b49b2a4b402b2fcca31f9fe1663b36f463a0722a5876b2ca2a8e023ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://51b618cce898bc89b4b07b6f7fd73567d719ad9c9dc3a2a3959074bc2c2fe11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6bb563b23f59ad2cd83e71016c9f1497905e250c15aaabbedafa95973a5646c0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:28Z\\\",\\\"message\\\":\\\"W1009 15:18:18.334783 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1009 15:18:18.335163 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760023098 cert, and key in /tmp/serving-cert-3158143578/serving-signer.crt, /tmp/serving-cert-3158143578/serving-signer.key\\\\nI1009 15:18:18.512278 1 observer_polling.go:159] Starting file observer\\\\nW1009 15:18:18.515605 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1009 15:18:18.515745 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:18.518399 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3158143578/tls.crt::/tmp/serving-cert-3158143578/tls.key\\\\\\\"\\\\nF1009 15:18:28.859833 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 15:18:34.791014 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 15:18:34.791138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:34.792247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2043225324/tls.crt::/tmp/serving-cert-2043225324/tls.key\\\\\\\"\\\\nI1009 15:18:35.029901 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 15:18:35.033427 1 maxinflight.go:139] 
\\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 15:18:35.033448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 15:18:35.033473 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 15:18:35.033481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 15:18:35.045206 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 15:18:35.045257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045266 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045273 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 15:18:35.045277 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 15:18:35.045280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 15:18:35.045285 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1009 15:18:35.045414 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1009 15:18:35.048459 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8544f7060b0b2c2885dcbdffbd744be5f028d8df543732ba79eb7cd3911afca6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:36Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.468459 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:36Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.484073 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:36Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.496146 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:36Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.511270 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab52b5e80f5f2de90ce76b34b21de83b3880ed13436c566f2c460bed1908576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d6
08d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:36Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.524855 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:36Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.536525 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcba5218f1503f2b3776c66a92350381ee11aee043429d72c70b7ae63d7bb29f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63d557f6902338a7aa577f2bbee6a159369d62be9724425a6e6a355f08586601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:36Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.552488 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:36Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.565082 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a62b8142b6b6fd0cf9028590f2abce788d8e381c2303d7a824dd055ab02b94db\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:36Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.587539 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:36Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.604318 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:36Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.617207 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:36Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.634522 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:36Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.656474 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40166218-2855-45ef-b0e1-0fed4e3e2fde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01dc78fd80a15fa8151128108a351c6af42928695fdd745dea50e08fae6570ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc62bf1b49b2a4b402b2fcca31f9fe1663b36f463a0722a5876b2ca2a8e023ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51b618cce898bc89b4b07b6f7fd73567d719ad9c9dc3a2a3959074bc2c2fe11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 15:18:34.791014 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 15:18:34.791138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:34.792247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2043225324/tls.crt::/tmp/serving-cert-2043225324/tls.key\\\\\\\"\\\\nI1009 15:18:35.029901 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 15:18:35.033427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 15:18:35.033448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 15:18:35.033473 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 15:18:35.033481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 15:18:35.045206 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 15:18:35.045257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045266 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045273 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 15:18:35.045277 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 15:18:35.045280 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 15:18:35.045285 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1009 15:18:35.045414 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1009 15:18:35.048459 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8544f7060b0b2c2885dcbdffbd744be5f028d8df543732ba79eb7cd3911afca6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:36Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.694395 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ae19d921bad282d96efffc7f2f7cfdc4b70f95932e69b9955ad1439a936d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b3908283c24f180df8f6a04d52c46e7252cdfd
4f0587f7cccf3e9a0f37127a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:36Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.714801 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:18:36 crc kubenswrapper[4719]: E1009 15:18:36.715057 4719 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:18:38.715031955 +0000 UTC m=+24.224743240 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.715396 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:18:36 crc kubenswrapper[4719]: E1009 15:18:36.715626 4719 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 15:18:36 crc kubenswrapper[4719]: E1009 15:18:36.715693 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 15:18:38.715675225 +0000 UTC m=+24.225386570 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.715641 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:18:36 crc kubenswrapper[4719]: E1009 15:18:36.715913 4719 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 15:18:36 crc kubenswrapper[4719]: E1009 15:18:36.716015 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 15:18:38.716003725 +0000 UTC m=+24.225715010 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.739684 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:36Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.775471 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:36Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.817125 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:18:36 crc kubenswrapper[4719]: I1009 15:18:36.817172 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:18:36 crc kubenswrapper[4719]: E1009 15:18:36.817288 4719 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 15:18:36 crc kubenswrapper[4719]: E1009 15:18:36.817304 4719 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 15:18:36 crc kubenswrapper[4719]: E1009 15:18:36.817314 4719 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 15:18:36 crc kubenswrapper[4719]: E1009 15:18:36.817316 4719 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 15:18:36 crc kubenswrapper[4719]: E1009 15:18:36.817345 4719 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" 
not registered Oct 09 15:18:36 crc kubenswrapper[4719]: E1009 15:18:36.817367 4719 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 15:18:36 crc kubenswrapper[4719]: E1009 15:18:36.817368 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-09 15:18:38.817340513 +0000 UTC m=+24.327051798 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 15:18:36 crc kubenswrapper[4719]: E1009 15:18:36.817417 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-09 15:18:38.817402235 +0000 UTC m=+24.327113520 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.160962 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.160991 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.161005 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:18:37 crc kubenswrapper[4719]: E1009 15:18:37.161094 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:18:37 crc kubenswrapper[4719]: E1009 15:18:37.161475 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:18:37 crc kubenswrapper[4719]: E1009 15:18:37.161557 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.165189 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.165976 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.166779 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.167532 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.168260 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.168906 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.169646 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.170314 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.171077 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.171686 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.172325 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.175271 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.175884 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.176933 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.177517 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.183114 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.184133 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.184755 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.185915 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.186629 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.187746 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.188422 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.188916 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.190165 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.190767 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.191993 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.192784 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.193801 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.194555 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.195587 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.196227 4719 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.196343 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.198210 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.199381 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.199913 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.201821 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.202981 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.203616 4719 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.205834 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.206649 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.207701 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.208409 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.209535 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.210615 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.213976 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.214649 4719 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.215940 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.216840 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.217963 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.218565 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.219658 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.220220 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.220778 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.221613 4719 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.295703 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-mtpbz"] Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.295983 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-mtpbz" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.297301 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.301759 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.301798 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.301800 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.314110 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab52b5e80f5f2de90ce76b34b21de83b3880ed13436c566f2c460bed1908576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:37Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.332488 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" event={"ID":"fea6a48c-769c-41bf-95ce-649cc31eb4e5","Type":"ContainerStarted","Data":"f2246a5642d4fa1b9e182af8a19980e6a76aea32cc9669e7d30185d6672435b0"} Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.332578 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" event={"ID":"fea6a48c-769c-41bf-95ce-649cc31eb4e5","Type":"ContainerStarted","Data":"4c0cb44eacc810e970c6b32e259ae1841fb312f20576d34ac183089a91000337"} Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.332591 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" event={"ID":"fea6a48c-769c-41bf-95ce-649cc31eb4e5","Type":"ContainerStarted","Data":"80fe00a302db3a637794464b7cccf806ad3fa8efbdaea15f965ea41276188d1e"} Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.332602 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" event={"ID":"fea6a48c-769c-41bf-95ce-649cc31eb4e5","Type":"ContainerStarted","Data":"e5228008f4bbd33c0b6ea86640368c02b6cdf301b43494a232b37fa73ea72e47"} Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.332874 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:37Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.340480 4719 generic.go:334] "Generic (PLEG): container finished" podID="09d0ca53-1333-4d50-948a-81d97d3182f6" containerID="c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521" exitCode=0 Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.340544 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" event={"ID":"09d0ca53-1333-4d50-948a-81d97d3182f6","Type":"ContainerDied","Data":"c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521"} Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.340941 4719 scope.go:117] "RemoveContainer" containerID="5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30" Oct 09 15:18:37 crc kubenswrapper[4719]: E1009 15:18:37.341062 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.347417 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcba5218f1503f2b3776c66a92350381ee11aee043429d72c70b7ae63d7bb29f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63d557f6902338a7aa577f2bbee6a159369d62be9724425a6e6a355f08586601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:37Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.374530 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:37Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.391566 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mtpbz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb84e765-e2c6-410b-9681-7c14d88a2537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfpkf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mtpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:37Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.421692 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:37Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.424699 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfpkf\" (UniqueName: \"kubernetes.io/projected/bb84e765-e2c6-410b-9681-7c14d88a2537-kube-api-access-sfpkf\") pod \"node-ca-mtpbz\" (UID: \"bb84e765-e2c6-410b-9681-7c14d88a2537\") " pod="openshift-image-registry/node-ca-mtpbz" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.424780 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bb84e765-e2c6-410b-9681-7c14d88a2537-serviceca\") pod \"node-ca-mtpbz\" (UID: \"bb84e765-e2c6-410b-9681-7c14d88a2537\") " pod="openshift-image-registry/node-ca-mtpbz" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.424822 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bb84e765-e2c6-410b-9681-7c14d88a2537-host\") pod \"node-ca-mtpbz\" (UID: \"bb84e765-e2c6-410b-9681-7c14d88a2537\") " pod="openshift-image-registry/node-ca-mtpbz" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.439294 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:37Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.453497 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:37Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.465681 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:37Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.476108 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a62b8142b6b6fd0cf9028590f2abce788d8e381c2303d7a824dd055ab02b94db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:37Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.489413 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40166218-2855-45ef-b0e1-0fed4e3e2fde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01dc78fd80a15fa8151128108a351c6af42928695fdd745dea50e08fae6570ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc62bf1b49b2a4b402b2fcca31f9fe1663b36f463a0722a5876b2ca2a8e023ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://51b618cce898bc89b4b07b6f7fd73567d719ad9c9dc3a2a3959074bc2c2fe11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 15:18:34.791014 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 15:18:34.791138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:34.792247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2043225324/tls.crt::/tmp/serving-cert-2043225324/tls.key\\\\\\\"\\\\nI1009 15:18:35.029901 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 15:18:35.033427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 15:18:35.033448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 15:18:35.033473 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 15:18:35.033481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 15:18:35.045206 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 15:18:35.045257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045266 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045273 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 15:18:35.045277 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 15:18:35.045280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 15:18:35.045285 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1009 15:18:35.045414 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1009 15:18:35.048459 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8544f7060b0b2c2885dcbdffbd744be5f028d8df543732ba79eb7cd3911afca6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:37Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.505548 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ae19d921bad282d96efffc7f2f7cfdc4b70f95932e69b9955ad1439a936d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf
09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b3908283c24f180df8f6a04d52c46e7252cdfd4f0587f7cccf3e9a0f37127a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:37Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.522263 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:37Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.525549 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfpkf\" (UniqueName: \"kubernetes.io/projected/bb84e765-e2c6-410b-9681-7c14d88a2537-kube-api-access-sfpkf\") pod \"node-ca-mtpbz\" (UID: \"bb84e765-e2c6-410b-9681-7c14d88a2537\") " pod="openshift-image-registry/node-ca-mtpbz" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.525584 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bb84e765-e2c6-410b-9681-7c14d88a2537-serviceca\") pod \"node-ca-mtpbz\" (UID: \"bb84e765-e2c6-410b-9681-7c14d88a2537\") " pod="openshift-image-registry/node-ca-mtpbz" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.525616 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" 
(UniqueName: \"kubernetes.io/host-path/bb84e765-e2c6-410b-9681-7c14d88a2537-host\") pod \"node-ca-mtpbz\" (UID: \"bb84e765-e2c6-410b-9681-7c14d88a2537\") " pod="openshift-image-registry/node-ca-mtpbz" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.525670 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bb84e765-e2c6-410b-9681-7c14d88a2537-host\") pod \"node-ca-mtpbz\" (UID: \"bb84e765-e2c6-410b-9681-7c14d88a2537\") " pod="openshift-image-registry/node-ca-mtpbz" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.526786 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bb84e765-e2c6-410b-9681-7c14d88a2537-serviceca\") pod \"node-ca-mtpbz\" (UID: \"bb84e765-e2c6-410b-9681-7c14d88a2537\") " pod="openshift-image-registry/node-ca-mtpbz" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.536449 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:37Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.546345 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfpkf\" (UniqueName: 
\"kubernetes.io/projected/bb84e765-e2c6-410b-9681-7c14d88a2537-kube-api-access-sfpkf\") pod \"node-ca-mtpbz\" (UID: \"bb84e765-e2c6-410b-9681-7c14d88a2537\") " pod="openshift-image-registry/node-ca-mtpbz" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.548387 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\
\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:37Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.564577 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40166218-2855-45ef-b0e1-0fed4e3e2fde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01dc78fd80a15fa8151128108a351c6af42928695fdd745dea50e08fae6570ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc62bf1b49b2a4b402b2fcca31f9fe1663b36f463a0722a5876b2ca2a8e023ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://51b618cce898bc89b4b07b6f7fd73567d719ad9c9dc3a2a3959074bc2c2fe11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 15:18:34.791014 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 15:18:34.791138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:34.792247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2043225324/tls.crt::/tmp/serving-cert-2043225324/tls.key\\\\\\\"\\\\nI1009 15:18:35.029901 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 15:18:35.033427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 15:18:35.033448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 15:18:35.033473 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 15:18:35.033481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 15:18:35.045206 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 15:18:35.045257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045266 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045273 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 15:18:35.045277 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 15:18:35.045280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 15:18:35.045285 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1009 15:18:35.045414 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1009 15:18:35.048459 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8544f7060b0b2c2885dcbdffbd744be5f028d8df543732ba79eb7cd3911afca6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:37Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.581680 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ae19d921bad282d96efffc7f2f7cfdc4b70f95932e69b9955ad1439a936d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf
09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b3908283c24f180df8f6a04d52c46e7252cdfd4f0587f7cccf3e9a0f37127a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:37Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.597169 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:37Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.635103 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab52b5e80f5f2de90ce76b34b21de83b3880ed13436c566f2c460bed1908576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:37Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.655638 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-mtpbz" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.672935 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:37Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:37 crc kubenswrapper[4719]: W1009 15:18:37.703035 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb84e765_e2c6_410b_9681_7c14d88a2537.slice/crio-265e018d9e5dd80f19aa658512cbf47f801f125f3f701a1320d1bd5dfb1e507b WatchSource:0}: Error finding container 265e018d9e5dd80f19aa658512cbf47f801f125f3f701a1320d1bd5dfb1e507b: Status 404 returned error can't find the container with id 265e018d9e5dd80f19aa658512cbf47f801f125f3f701a1320d1bd5dfb1e507b Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.718792 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:37Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.758147 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mtpbz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb84e765-e2c6-410b-9681-7c14d88a2537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfpkf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mtpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:37Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.796631 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcba5218f1503f2b3776c66a92350381ee11aee043429d72c70b7ae63d7bb29f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63d557f6902338a7aa577f2bbee6a159369d62be9724425a6e6a355f08586601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:37Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.833417 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:37Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.873972 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a62b8142b6b6fd0cf9028590f2abce788d8e381c2303d7a824dd055ab02b94db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:37Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.919944 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:37Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.956300 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:37Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:37 crc kubenswrapper[4719]: I1009 15:18:37.993861 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:37Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:38 crc kubenswrapper[4719]: I1009 15:18:38.344551 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mtpbz" event={"ID":"bb84e765-e2c6-410b-9681-7c14d88a2537","Type":"ContainerStarted","Data":"2be972d47f7ee97f2f54daa73198a83327281f9e9b2b1500205a17cf11518989"} Oct 09 15:18:38 crc kubenswrapper[4719]: I1009 15:18:38.344591 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mtpbz" 
event={"ID":"bb84e765-e2c6-410b-9681-7c14d88a2537","Type":"ContainerStarted","Data":"265e018d9e5dd80f19aa658512cbf47f801f125f3f701a1320d1bd5dfb1e507b"} Oct 09 15:18:38 crc kubenswrapper[4719]: I1009 15:18:38.347324 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" event={"ID":"fea6a48c-769c-41bf-95ce-649cc31eb4e5","Type":"ContainerStarted","Data":"59a6c607affaa28a2c8af16a995f53baf008a1efd42061bb5e3c01b5acac636a"} Oct 09 15:18:38 crc kubenswrapper[4719]: I1009 15:18:38.347371 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" event={"ID":"fea6a48c-769c-41bf-95ce-649cc31eb4e5","Type":"ContainerStarted","Data":"d1a911f9dd87ad57268bacc90fd4b3821f54d4ad91fcdde7066d3706aa8feb4b"} Oct 09 15:18:38 crc kubenswrapper[4719]: I1009 15:18:38.349447 4719 generic.go:334] "Generic (PLEG): container finished" podID="09d0ca53-1333-4d50-948a-81d97d3182f6" containerID="54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a" exitCode=0 Oct 09 15:18:38 crc kubenswrapper[4719]: I1009 15:18:38.349516 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" event={"ID":"09d0ca53-1333-4d50-948a-81d97d3182f6","Type":"ContainerDied","Data":"54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a"} Oct 09 15:18:38 crc kubenswrapper[4719]: I1009 15:18:38.351026 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"2199f3e31d7adde6f0b1aaf29a7f3da80a45d8a1f11908fd93b47d737b00872f"} Oct 09 15:18:38 crc kubenswrapper[4719]: I1009 15:18:38.362516 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab52b5e80f5f2de90ce76b34b21de83b3880ed13436c566f2c460bed1908576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:38Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:38 crc kubenswrapper[4719]: I1009 15:18:38.374774 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:38Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:38 crc kubenswrapper[4719]: I1009 15:18:38.386445 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcba5218f1503f2b3776c66a92350381ee11aee043429d72c70b7ae63d7bb29f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63d557f6902338a7aa577f2bbee6a159369d62be9724425a6e6a355f08586601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:38Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:38 crc kubenswrapper[4719]: I1009 15:18:38.407219 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:38Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:38 crc kubenswrapper[4719]: I1009 15:18:38.416462 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mtpbz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb84e765-e2c6-410b-9681-7c14d88a2537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be972d47f7ee97f2f54daa73198a83327281f9e9b2b1500205a17cf11518989\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfpkf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mtpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:38Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:38 crc kubenswrapper[4719]: I1009 15:18:38.467653 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:38Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:38 crc kubenswrapper[4719]: I1009 15:18:38.480415 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:38Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:38 crc kubenswrapper[4719]: I1009 15:18:38.494227 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:38Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:38 crc kubenswrapper[4719]: I1009 15:18:38.506574 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:38Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:38 crc kubenswrapper[4719]: I1009 15:18:38.516823 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a62b8142b6b6fd0cf9028590f2abce788d8e381c2303d7a824dd055ab02b94db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:38Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:38 crc kubenswrapper[4719]: I1009 15:18:38.529863 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40166218-2855-45ef-b0e1-0fed4e3e2fde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01dc78fd80a15fa8151128108a351c6af42928695fdd745dea50e08fae6570ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc62bf1b49b2a4b402b2fcca31f9fe1663b36f463a0722a5876b2ca2a8e023ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://51b618cce898bc89b4b07b6f7fd73567d719ad9c9dc3a2a3959074bc2c2fe11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 15:18:34.791014 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 15:18:34.791138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:34.792247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2043225324/tls.crt::/tmp/serving-cert-2043225324/tls.key\\\\\\\"\\\\nI1009 15:18:35.029901 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 15:18:35.033427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 15:18:35.033448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 15:18:35.033473 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 15:18:35.033481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 15:18:35.045206 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 15:18:35.045257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045266 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045273 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 15:18:35.045277 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 15:18:35.045280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 15:18:35.045285 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1009 15:18:35.045414 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1009 15:18:35.048459 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8544f7060b0b2c2885dcbdffbd744be5f028d8df543732ba79eb7cd3911afca6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:38Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:38 crc kubenswrapper[4719]: I1009 15:18:38.544691 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ae19d921bad282d96efffc7f2f7cfdc4b70f95932e69b9955ad1439a936d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf
09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b3908283c24f180df8f6a04d52c46e7252cdfd4f0587f7cccf3e9a0f37127a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:38Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:38 crc kubenswrapper[4719]: I1009 15:18:38.559889 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:38Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:38 crc kubenswrapper[4719]: I1009 15:18:38.574341 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:38Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:38 crc kubenswrapper[4719]: I1009 15:18:38.596580 4719 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:38Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:38 crc kubenswrapper[4719]: I1009 15:18:38.637680 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:38Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:38 crc kubenswrapper[4719]: I1009 15:18:38.676086 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:38Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:38 crc kubenswrapper[4719]: I1009 15:18:38.712379 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a62b8142b6b6fd0cf9028590f2abce788d8e381c2303d7a824dd055ab02b94db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:38Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:38 crc kubenswrapper[4719]: I1009 15:18:38.736727 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:18:38 crc kubenswrapper[4719]: E1009 15:18:38.736914 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:18:42.736890052 +0000 UTC m=+28.246601337 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:18:38 crc kubenswrapper[4719]: I1009 15:18:38.736979 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:18:38 crc kubenswrapper[4719]: I1009 15:18:38.737020 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:18:38 crc kubenswrapper[4719]: E1009 15:18:38.737100 4719 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 15:18:38 crc kubenswrapper[4719]: E1009 15:18:38.737147 4719 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 15:18:38 crc kubenswrapper[4719]: E1009 15:18:38.737164 4719 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 15:18:42.73714784 +0000 UTC m=+28.246859125 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 15:18:38 crc kubenswrapper[4719]: E1009 15:18:38.737180 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 15:18:42.737173731 +0000 UTC m=+28.246885006 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 15:18:38 crc kubenswrapper[4719]: I1009 15:18:38.760831 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:38Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:38 crc kubenswrapper[4719]: I1009 15:18:38.793525 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ae19d921bad282d96efffc7f2f7cfdc4b70f95932e69b9955ad1439a936d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b3908283c24f180df8f6a04d52c46e7252cdfd4f0587f7cccf3e9a0f37127a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:38Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:38 crc kubenswrapper[4719]: I1009 15:18:38.838035 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:18:38 crc kubenswrapper[4719]: I1009 15:18:38.838092 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:18:38 crc kubenswrapper[4719]: E1009 15:18:38.838194 4719 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 15:18:38 crc kubenswrapper[4719]: E1009 15:18:38.838195 4719 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 15:18:38 crc kubenswrapper[4719]: E1009 15:18:38.838222 4719 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 15:18:38 crc kubenswrapper[4719]: E1009 15:18:38.838235 4719 projected.go:194] Error preparing data for projected volume 
kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 15:18:38 crc kubenswrapper[4719]: E1009 15:18:38.838285 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-09 15:18:42.838269321 +0000 UTC m=+28.347980606 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 15:18:38 crc kubenswrapper[4719]: E1009 15:18:38.838208 4719 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 15:18:38 crc kubenswrapper[4719]: E1009 15:18:38.838301 4719 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 15:18:38 crc kubenswrapper[4719]: E1009 15:18:38.838327 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-10-09 15:18:42.838319073 +0000 UTC m=+28.348030358 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 15:18:38 crc kubenswrapper[4719]: I1009 15:18:38.838813 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:38Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:38 crc kubenswrapper[4719]: I1009 
15:18:38.874672 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/n
et.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:38Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:38 crc kubenswrapper[4719]: I1009 
15:18:38.916329 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40166218-2855-45ef-b0e1-0fed4e3e2fde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01dc78fd80a15fa8151128108a351c6af42928695fdd745dea50e08fae6570ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"reso
urce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc62bf1b49b2a4b402b2fcca31f9fe1663b36f463a0722a5876b2ca2a8e023ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51b618cce898bc89b4b07b6f7fd73567d719ad9c9dc3a2a3959074bc2c2fe11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2775
3fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 15:18:34.791014 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 15:18:34.791138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:34.792247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2043225324/tls.crt::/tmp/serving-cert-2043225324/tls.key\\\\\\\"\\\\nI1009 15:18:35.029901 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 15:18:35.033427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 15:18:35.033448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 15:18:35.033473 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 15:18:35.033481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 15:18:35.045206 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 15:18:35.045257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045266 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045273 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 
15:18:35.045277 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 15:18:35.045280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 15:18:35.045285 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1009 15:18:35.045414 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1009 15:18:35.048459 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8544f7060b0b2c2885dcbdffbd744be5f028d8df543732ba79eb7cd3911afca6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"
containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:38Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:38 crc kubenswrapper[4719]: I1009 15:18:38.956456 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab52b5e80f5f2de90ce76b34b21de83b3880ed13436c566f2c460bed1908576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:38Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:38 crc kubenswrapper[4719]: I1009 15:18:38.998626 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2199f3e31d7adde6f0b1aaf29a7f3da80a45d8a1f11908fd93b47d737b00872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:38Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:39 crc kubenswrapper[4719]: I1009 15:18:39.034372 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcba5218f1503f2b3776c66a92350381ee11aee043429d72c70b7ae63d7bb29f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\
\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63d557f6902338a7aa577f2bbee6a159369d62be9724425a6e6a355f08586601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:39Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:39 crc kubenswrapper[4719]: I1009 15:18:39.081274 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:39Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:39 crc kubenswrapper[4719]: I1009 15:18:39.111674 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mtpbz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb84e765-e2c6-410b-9681-7c14d88a2537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be972d47f7ee97f2f54daa73198a83327281f9e9b2b1500205a17cf11518989\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfpkf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mtpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:39Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:39 crc kubenswrapper[4719]: I1009 15:18:39.160303 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:18:39 crc kubenswrapper[4719]: I1009 15:18:39.160346 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:18:39 crc kubenswrapper[4719]: E1009 15:18:39.160453 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:18:39 crc kubenswrapper[4719]: I1009 15:18:39.160483 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:18:39 crc kubenswrapper[4719]: E1009 15:18:39.160552 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:18:39 crc kubenswrapper[4719]: E1009 15:18:39.160610 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:18:39 crc kubenswrapper[4719]: I1009 15:18:39.355230 4719 generic.go:334] "Generic (PLEG): container finished" podID="09d0ca53-1333-4d50-948a-81d97d3182f6" containerID="275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623" exitCode=0 Oct 09 15:18:39 crc kubenswrapper[4719]: I1009 15:18:39.355299 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" event={"ID":"09d0ca53-1333-4d50-948a-81d97d3182f6","Type":"ContainerDied","Data":"275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623"} Oct 09 15:18:39 crc kubenswrapper[4719]: I1009 15:18:39.368285 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcba5218f1503f2b3776c66a92350381ee11aee043429d72c70b7ae63d7bb29f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63d557f6902338a7aa577f2bbee6a159369d62be9724425a6e6a355f08586601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:39Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:39 
crc kubenswrapper[4719]: I1009 15:18:39.394576 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:39Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:39 crc kubenswrapper[4719]: I1009 15:18:39.404719 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mtpbz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb84e765-e2c6-410b-9681-7c14d88a2537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be972d47f7ee97f2f54daa73198a83327281f9e9b2b1500205a17cf11518989\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfpkf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mtpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:39Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:39 crc kubenswrapper[4719]: I1009 15:18:39.416944 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:39Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:39 crc kubenswrapper[4719]: I1009 15:18:39.429678 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:39Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:39 crc kubenswrapper[4719]: I1009 15:18:39.443033 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:39Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:39 crc kubenswrapper[4719]: I1009 15:18:39.454230 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a62b8142b6b6fd0cf9028590f2abce788d8e381c2303d7a824dd055ab02b94db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:39Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:39 crc kubenswrapper[4719]: I1009 15:18:39.472496 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:39Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:39 crc kubenswrapper[4719]: I1009 15:18:39.481595 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ae19d921bad282d96efffc7f2f7cfdc4b70f95932e69b9955ad1439a936d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b3908283c24f180df8f6a04d52c46e7252cdfd
4f0587f7cccf3e9a0f37127a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:39Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:39 crc kubenswrapper[4719]: I1009 15:18:39.516180 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:39Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:39 crc kubenswrapper[4719]: I1009 15:18:39.554638 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\
\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:39Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:39 crc kubenswrapper[4719]: I1009 15:18:39.595553 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40166218-2855-45ef-b0e1-0fed4e3e2fde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01dc78fd80a15fa8151128108a351c6af42928695fdd745dea50e08fae6570ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc62bf1b49b2a4b402b2fcca31f9fe1663b36f463a0722a5876b2ca2a8e023ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://51b618cce898bc89b4b07b6f7fd73567d719ad9c9dc3a2a3959074bc2c2fe11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 15:18:34.791014 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 15:18:34.791138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:34.792247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2043225324/tls.crt::/tmp/serving-cert-2043225324/tls.key\\\\\\\"\\\\nI1009 15:18:35.029901 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 15:18:35.033427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 15:18:35.033448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 15:18:35.033473 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 15:18:35.033481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 15:18:35.045206 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 15:18:35.045257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045266 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045273 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 15:18:35.045277 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 15:18:35.045280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 15:18:35.045285 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1009 15:18:35.045414 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1009 15:18:35.048459 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8544f7060b0b2c2885dcbdffbd744be5f028d8df543732ba79eb7cd3911afca6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:39Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:39 crc kubenswrapper[4719]: I1009 15:18:39.638743 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab52b5e80f5f2de90ce76b34b21de83b3880ed13436c566f2c460bed1908576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:39Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:39 crc kubenswrapper[4719]: I1009 15:18:39.675462 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2199f3e31d7adde6f0b1aaf29a7f3da80a45d8a1f11908fd93b47d737b00872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T15:18:39Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:39 crc kubenswrapper[4719]: I1009 15:18:39.803272 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 15:18:39 crc kubenswrapper[4719]: I1009 15:18:39.807272 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 15:18:39 crc kubenswrapper[4719]: I1009 15:18:39.813110 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 09 15:18:39 crc kubenswrapper[4719]: I1009 15:18:39.821024 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2199f3e31d7adde6f0b1aaf29a7f3da80a45d8a1f11908fd93b47d737b00872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:39Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:39 crc kubenswrapper[4719]: I1009 15:18:39.834976 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab52b5e80f5f2de90ce76b34b21de83b3880ed13436c566f2c460bed1908576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:39Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:39 crc kubenswrapper[4719]: I1009 15:18:39.845675 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcba5218f1503f2b3776c66a92350381ee11aee043429d72c70b7ae63d7bb29f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://63d557f6902338a7aa577f2bbee6a159369d62be9724425a6e6a355f08586601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:39Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:39 crc kubenswrapper[4719]: I1009 15:18:39.863001 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:39Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:39 crc kubenswrapper[4719]: I1009 15:18:39.892473 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mtpbz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb84e765-e2c6-410b-9681-7c14d88a2537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be972d47f7ee97f2f54daa73198a83327281f9e9b2b1500205a17cf11518989\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfpkf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mtpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:39Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:39 crc kubenswrapper[4719]: I1009 15:18:39.934601 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:39Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:39 crc kubenswrapper[4719]: I1009 15:18:39.977150 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:39Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.013815 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a62b8142b6b6fd0cf9028590f2abce788d8e381c2303d7a824dd055ab02b94db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:40Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.060282 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:40Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.094334 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:40Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.135857 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:40Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.175294 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:40Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.215846 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40166218-2855-45ef-b0e1-0fed4e3e2fde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01dc78fd80a15fa8151128108a351c6af42928695fdd745dea50e08fae6570ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc62bf1b49b2a4b402b2fcca31f9fe1663b36f463a0722a5876b2ca2a8e023ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51b618cce898bc89b4b07b6f7fd73567d719ad9c9dc3a2a3959074bc2c2fe11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 15:18:34.791014 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 15:18:34.791138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:34.792247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2043225324/tls.crt::/tmp/serving-cert-2043225324/tls.key\\\\\\\"\\\\nI1009 15:18:35.029901 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 15:18:35.033427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 15:18:35.033448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 15:18:35.033473 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 15:18:35.033481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 15:18:35.045206 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 15:18:35.045257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045266 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045273 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 15:18:35.045277 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 15:18:35.045280 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 15:18:35.045285 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1009 15:18:35.045414 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1009 15:18:35.048459 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8544f7060b0b2c2885dcbdffbd744be5f028d8df543732ba79eb7cd3911afca6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:40Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.255937 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ae19d921bad282d96efffc7f2f7cfdc4b70f95932e69b9955ad1439a936d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b3908283c24f180df8f6a04d52c46e7252cdfd
4f0587f7cccf3e9a0f37127a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:40Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.298925 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40166218-2855-45ef-b0e1-0fed4e3e2fde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01dc78fd80a15fa8151128108a351c6af42928695fdd745dea50e08fae6570ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc62bf1b49b2a4b402b2fcca31f9fe1663b36f463a0722a5876b2ca2a8e023ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51b618cce898bc89b4b07b6f7fd73567d719ad9c9dc3a2a3959074bc2c2fe11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 15:18:34.791014 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 15:18:34.791138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:34.792247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2043225324/tls.crt::/tmp/serving-cert-2043225324/tls.key\\\\\\\"\\\\nI1009 15:18:35.029901 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 15:18:35.033427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 15:18:35.033448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 15:18:35.033473 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 15:18:35.033481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 15:18:35.045206 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 15:18:35.045257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045266 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045273 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 15:18:35.045277 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 15:18:35.045280 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 15:18:35.045285 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1009 15:18:35.045414 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1009 15:18:35.048459 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8544f7060b0b2c2885dcbdffbd744be5f028d8df543732ba79eb7cd3911afca6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:40Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.337648 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ae19d921bad282d96efffc7f2f7cfdc4b70f95932e69b9955ad1439a936d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b3908283c24f180df8f6a04d52c46e7252cdfd
4f0587f7cccf3e9a0f37127a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:40Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.365320 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" event={"ID":"fea6a48c-769c-41bf-95ce-649cc31eb4e5","Type":"ContainerStarted","Data":"65b32ef1116f7849b70aa3607bb4fc7b4bff9f58843c24742fc94aed9bb9a68e"} Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.368053 4719 generic.go:334] "Generic (PLEG): container finished" podID="09d0ca53-1333-4d50-948a-81d97d3182f6" 
containerID="73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32" exitCode=0 Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.368127 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" event={"ID":"09d0ca53-1333-4d50-948a-81d97d3182f6","Type":"ContainerDied","Data":"73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32"} Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.381219 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:40Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.416651 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\
\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:40Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.466665 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab52b5e80f5f2de90ce76b34b21de83b3880ed13436c566f2c460bed1908576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:40Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.500793 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2199f3e31d7adde6f0b1aaf29a7f3da80a45d8a1f11908fd93b47d737b00872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:40Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.536187 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"834f7996-d1ce-470d-a1a5-0de5da2460d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a1b9cee40ae4a30df34bde2f4dd9436cf3ff915293ea1e1431e8abd581423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37949ed51a379d34fab6bf766fd7e35d376af137b55b6f12e8bef8495ab5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d165d88c0d88fb4b080bf594e5258fb74f33c521332c85bb9f5ef5b5d9fdab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed935aaa4f5122234731f8c22ec3d4ffeba8b500bfb51bf97414f39438da2f68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:40Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.575432 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcba5218f1503f2b3776c66a92350381ee11aee043429d72c70b7ae63d7bb29f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63d557f6902338a7aa577f2bbee6a159369d62be9724425a6e6a355f08586601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:40Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.619800 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:40Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.652955 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mtpbz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb84e765-e2c6-410b-9681-7c14d88a2537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be972d47f7ee97f2f54daa73198a83327281f9e9b2b1500205a17cf11518989\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfpkf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mtpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:40Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.702414 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:40Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.736934 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:40Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.775598 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:40Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.816712 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:40Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.854256 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a62b8142b6b6fd0cf9028590f2abce788d8e381c2303d7a824dd055ab02b94db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:40Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.865598 4719 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.867846 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.867881 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.867893 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.868030 4719 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.923801 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:40Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.928125 4719 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.928503 4719 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.929579 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.929702 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.929796 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.929883 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.929973 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:40Z","lastTransitionTime":"2025-10-09T15:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:40 crc kubenswrapper[4719]: E1009 15:18:40.945307 4719 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d273987-9d8a-4a77-9956-ccb64e9e22c3\\\",\\\"systemUUID\\\":\\\"d18dc188-15d4-4547-94df-d9149082a3a0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:40Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.950729 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.950784 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.950801 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.950825 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.950840 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:40Z","lastTransitionTime":"2025-10-09T15:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:40 crc kubenswrapper[4719]: E1009 15:18:40.970141 4719 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d273987-9d8a-4a77-9956-ccb64e9e22c3\\\",\\\"systemUUID\\\":\\\"d18dc188-15d4-4547-94df-d9149082a3a0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:40Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.973774 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.973811 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.973821 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.973835 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.973844 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:40Z","lastTransitionTime":"2025-10-09T15:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.980995 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:40Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:40 crc kubenswrapper[4719]: E1009 15:18:40.986566 4719 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d273987-9d8a-4a77-9956-ccb64e9e22c3\\\",\\\"systemUUID\\\":\\\"d18dc188-15d4-4547-94df-d9149082a3a0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:40Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.989875 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.989913 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.989926 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.989943 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:40 crc kubenswrapper[4719]: I1009 15:18:40.989954 4719 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:40Z","lastTransitionTime":"2025-10-09T15:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:41 crc kubenswrapper[4719]: E1009 15:18:41.003629 4719 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d273987-9d8a-4a77-9956-ccb64e9e22c3\\\",\\\"systemUUID\\\":\\\"d18dc188-15d4-4547-94df-d9149082a3a0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:41Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.007376 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.007398 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.007407 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.007420 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.007429 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:41Z","lastTransitionTime":"2025-10-09T15:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.016801 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:41Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:41 crc kubenswrapper[4719]: E1009 15:18:41.020435 4719 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:18:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:18:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:18:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:18:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d273987-9d8a-4a77-9956-ccb64e9e22c3\\\",\\\"systemUUID\\\":\\\"d18dc188-15d4-4547-94df-d9149082a3a0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:41Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:41 crc kubenswrapper[4719]: E1009 15:18:41.020592 4719 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.022311 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.022369 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.022384 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.022400 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.022410 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:41Z","lastTransitionTime":"2025-10-09T15:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.055438 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:41Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.096065 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a62b8142b6b6fd0cf9028590f2abce788d8e381c2303d7a824dd055ab02b94db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:41Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.124325 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.124370 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.124397 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.124411 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.124437 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:41Z","lastTransitionTime":"2025-10-09T15:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.143185 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40166218-2855-45ef-b0e1-0fed4e3e2fde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01dc78fd80a15fa8151128108a351c6af42928695fdd745dea50e08fae6570ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc62bf1b49b2a4b402b2fcca31f9fe1663b36f463a0722a5876b2ca2a8e023ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://51b618cce898bc89b4b07b6f7fd73567d719ad9c9dc3a2a3959074bc2c2fe11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 15:18:34.791014 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 15:18:34.791138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:34.792247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2043225324/tls.crt::/tmp/serving-cert-2043225324/tls.key\\\\\\\"\\\\nI1009 15:18:35.029901 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 15:18:35.033427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 15:18:35.033448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 15:18:35.033473 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 15:18:35.033481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 15:18:35.045206 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 15:18:35.045257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045266 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045273 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 15:18:35.045277 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 15:18:35.045280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 15:18:35.045285 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1009 15:18:35.045414 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1009 15:18:35.048459 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8544f7060b0b2c2885dcbdffbd744be5f028d8df543732ba79eb7cd3911afca6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:41Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.160111 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.160190 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.160276 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:18:41 crc kubenswrapper[4719]: E1009 15:18:41.160209 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:18:41 crc kubenswrapper[4719]: E1009 15:18:41.160454 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:18:41 crc kubenswrapper[4719]: E1009 15:18:41.160480 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.175727 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ae19d921bad282d96efffc7f2f7cfdc4b70f95932e69b9955ad1439a936d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b3908283c24f180df8f6a04d52c46e7252cdfd
4f0587f7cccf3e9a0f37127a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:41Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.216839 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:41Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.226435 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.226478 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.226494 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.226517 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.226533 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:41Z","lastTransitionTime":"2025-10-09T15:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.260754 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:41Z 
is after 2025-08-24T17:21:41Z" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.306159 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab52b5e80f5f2de90ce76b34b21de83b3880ed13436c566f2c460bed1908576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:41Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.329302 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.329622 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.329717 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.329799 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.329888 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:41Z","lastTransitionTime":"2025-10-09T15:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.337970 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2199f3e31d7adde6f0b1aaf29a7f3da80a45d8a1f11908fd93b47d737b00872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:41Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.376038 4719 generic.go:334] "Generic (PLEG): container finished" podID="09d0ca53-1333-4d50-948a-81d97d3182f6" containerID="f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc" exitCode=0 Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.376082 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" event={"ID":"09d0ca53-1333-4d50-948a-81d97d3182f6","Type":"ContainerDied","Data":"f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc"} Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.382308 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"834f7996-d1ce-470d-a1a5-0de5da2460d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a1b9cee40ae4a30df34bde2f4dd9436cf3ff915293ea1e1431e8abd581423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37949ed51a379d34fab6bf766fd7e35d376af137b55b6f12e8bef8495ab5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d165d88c0d88fb4b080bf594e5258fb74f33c521332c85bb9f5ef5b5d9fdab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed935aaa4f5122234731f8c22ec3d4ffeba8b500bfb51bf97414f39438da2f68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:41Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.421863 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcba5218f1503f2b3776c66a92350381ee11aee043429d72c70b7ae63d7bb29f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63d557f6902338a7aa577f2bbee6a159369d62be9724425a6e6a355f08586601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:41Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.437434 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.437488 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.437500 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.437518 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.437531 4719 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:41Z","lastTransitionTime":"2025-10-09T15:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.465267 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:41Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.493098 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mtpbz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb84e765-e2c6-410b-9681-7c14d88a2537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be972d47f7ee97f2f54daa73198a83327281f9e9b2b1500205a17cf11518989\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfpkf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mtpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:41Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.538744 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40166218-2855-45ef-b0e1-0fed4e3e2fde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01dc78fd80a15fa8151128108a351c6af42928695fdd745dea50e08fae6570ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc62bf1b49b2a4b402b2fcca31f9fe1663b36f463a0722a5876b2ca2a8e023ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51b618cce898bc89b4b07b6f7fd73567d719ad9c9dc3a2a3959074bc2c2fe11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 15:18:34.791014 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 15:18:34.791138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:34.792247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2043225324/tls.crt::/tmp/serving-cert-2043225324/tls.key\\\\\\\"\\\\nI1009 15:18:35.029901 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 15:18:35.033427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 15:18:35.033448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 15:18:35.033473 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 15:18:35.033481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 15:18:35.045206 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 15:18:35.045257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045266 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045273 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 15:18:35.045277 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 15:18:35.045280 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 15:18:35.045285 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1009 15:18:35.045414 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1009 15:18:35.048459 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8544f7060b0b2c2885dcbdffbd744be5f028d8df543732ba79eb7cd3911afca6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:41Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.541617 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.541655 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.541666 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.541683 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:41 crc kubenswrapper[4719]: 
I1009 15:18:41.541694 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:41Z","lastTransitionTime":"2025-10-09T15:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.574166 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ae19d921bad282d96efffc7f2f7cfdc4b70f95932e69b9955ad1439a936d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b3908283c24f180df8f6a04d52c46e7252cdfd4f0587f7cccf3e9a0f37127a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:41Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.617003 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:41Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.647192 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.647241 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.647251 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.647266 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.647275 4719 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:41Z","lastTransitionTime":"2025-10-09T15:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.654935 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.16
8.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:41Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.695066 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab52b5e80f5f2de90ce76b34b21de83b3880ed13436c566f2c460bed1908576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\
\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:41Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.735144 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2199f3e31d7adde6f0b1aaf29a7f3da80a45d8a1f11908fd93b47d737b00872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f
6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:41Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.749774 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.749811 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.749819 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.749849 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.749858 4719 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:41Z","lastTransitionTime":"2025-10-09T15:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.778518 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834f7996-d1ce-470d-a1a5-0de5da2460d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a1b9cee40ae4a30df34bde2f4dd9436cf3ff915293ea1e1431e8abd581423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37949ed51a379d34fab6bf766fd7e35d376af137b55b6f12e8bef8495ab5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d165d88c0d88fb4b080bf594e5258fb74f33c521332c85bb9f5ef5b5d9fdab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://ed935aaa4f5122234731f8c22ec3d4ffeba8b500bfb51bf97414f39438da2f68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:41Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.816580 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcba5218f1503f2b3776c66a92350381ee11aee043429d72c70b7ae63d7bb29f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63d557f6902338a7aa577f2bbee6a159369d62be9724425a6e6a355f08586601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:41Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.852417 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.852446 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.852455 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.852468 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.852478 4719 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:41Z","lastTransitionTime":"2025-10-09T15:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.859873 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:41Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.894728 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mtpbz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb84e765-e2c6-410b-9681-7c14d88a2537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be972d47f7ee97f2f54daa73198a83327281f9e9b2b1500205a17cf11518989\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfpkf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mtpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:41Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.940155 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:41Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.954576 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.954608 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.954616 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.954633 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.954641 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:41Z","lastTransitionTime":"2025-10-09T15:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:41 crc kubenswrapper[4719]: I1009 15:18:41.978860 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:41Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.016394 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:42Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.053863 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:42Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.056949 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.056982 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.056992 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.057007 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.057016 4719 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:42Z","lastTransitionTime":"2025-10-09T15:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.093209 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a62b8142b6b6fd0cf9028590f2abce788d8e381c2303d7a824dd055ab02b94db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:42Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.159536 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.159573 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.159584 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.159599 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.159607 4719 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:42Z","lastTransitionTime":"2025-10-09T15:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.262019 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.262063 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.262075 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.262091 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.262103 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:42Z","lastTransitionTime":"2025-10-09T15:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.365050 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.365099 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.365110 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.365132 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.365143 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:42Z","lastTransitionTime":"2025-10-09T15:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.385761 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" event={"ID":"fea6a48c-769c-41bf-95ce-649cc31eb4e5","Type":"ContainerStarted","Data":"19670744f4667f0104e6d2d4688e32b9acb7cbdc2946819ca192f259d6a1d8a8"} Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.386067 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.392831 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" event={"ID":"09d0ca53-1333-4d50-948a-81d97d3182f6","Type":"ContainerStarted","Data":"dcd9a40e12b42902018ecf483e6b42dfa415e4d6e282fc57eacbf507922dbd45"} Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.411517 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:42Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.422121 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:42Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.422582 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.435391 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:42Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.446778 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:42Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.460706 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a62b8142b6b6fd0cf9028590f2abce788d8e381c2303d7a824dd055ab02b94db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:42Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.467197 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.467249 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.467265 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.467286 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.467302 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:42Z","lastTransitionTime":"2025-10-09T15:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.476311 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40166218-2855-45ef-b0e1-0fed4e3e2fde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01dc78fd80a15fa8151128108a351c6af42928695fdd745dea50e08fae6570ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc62bf1b49b2a4b402b2fcca31f9fe1663b36f463a0722a5876b2ca2a8e023ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://51b618cce898bc89b4b07b6f7fd73567d719ad9c9dc3a2a3959074bc2c2fe11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 15:18:34.791014 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 15:18:34.791138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:34.792247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2043225324/tls.crt::/tmp/serving-cert-2043225324/tls.key\\\\\\\"\\\\nI1009 15:18:35.029901 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 15:18:35.033427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 15:18:35.033448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 15:18:35.033473 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 15:18:35.033481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 15:18:35.045206 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 15:18:35.045257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045266 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045273 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 15:18:35.045277 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 15:18:35.045280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 15:18:35.045285 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1009 15:18:35.045414 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1009 15:18:35.048459 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8544f7060b0b2c2885dcbdffbd744be5f028d8df543732ba79eb7cd3911afca6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:42Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.491310 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ae19d921bad282d96efffc7f2f7cfdc4b70f95932e69b9955ad1439a936d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf
09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b3908283c24f180df8f6a04d52c46e7252cdfd4f0587f7cccf3e9a0f37127a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:42Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.504936 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:42Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.517145 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:42Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.528734 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab52b5e80f5f2de90ce76b34b21de83b3880ed13436c566f2c460bed1908576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:42Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.539537 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2199f3e31d7adde6f0b1aaf29a7f3da80a45d8a1f11908fd93b47d737b00872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:42Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.570098 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.570142 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.570152 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.570169 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.570181 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:42Z","lastTransitionTime":"2025-10-09T15:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.575127 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834f7996-d1ce-470d-a1a5-0de5da2460d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a1b9cee40ae4a30df34bde2f4dd9436cf3ff915293ea1e1431e8abd581423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37949ed51a
379d34fab6bf766fd7e35d376af137b55b6f12e8bef8495ab5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d165d88c0d88fb4b080bf594e5258fb74f33c521332c85bb9f5ef5b5d9fdab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed935aaa4f5122234731f8c22ec3d4ffeba8b500bfb51bf97414f39438da2f68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:42Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.614491 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcba5218f1503f2b3776c66a92350381ee11aee043429d72c70b7ae63d7bb29f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63d557f6902338a7aa577f2bbee6a159369d62be9724425a6e6a355f08586601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:42Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.660415 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c0cb44eacc810e970c6b32e259ae1841fb312f20576d34ac183089a91000337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2246a5642d4fa1b9e182af8a19980e6a76aea32cc9669e7d30185d6672435b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59a6c607affaa28a2c8af16a995f53baf008a1efd42061bb5e3c01b5acac636a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a911f9dd87ad57268bacc90fd4b3821f54d4ad91fcdde7066d3706aa8feb4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80fe00a302db3a637794464b7cccf806ad3fa8efbdaea15f965ea41276188d1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5228008f4bbd33c0b6ea86640368c02b6cdf301b43494a232b37fa73ea72e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19670744f4667f0104e6d2d4688e32b9acb7cbdc2946819ca192f259d6a1d8a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b32ef1116f7849b70aa3607bb4fc7b4bff9f58843c24742fc94aed9bb9a68e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:42Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.672637 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.672670 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.672696 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.672719 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.672734 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:42Z","lastTransitionTime":"2025-10-09T15:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.696620 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mtpbz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb84e765-e2c6-410b-9681-7c14d88a2537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be972d47f7ee97f2f54daa73198a83327281f9e9b2b1500205a17cf11518989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfpkf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mtpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:42Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.735506 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab52b5e80f5f2de90ce76b34b21de83b3880ed13436c566f2c460bed1908576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c
04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:42Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.773270 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2199f3e31d7adde6f0b1aaf29a7f3da80a45d8a1f11908fd93b47d737b00872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T15:18:42Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.774695 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.774739 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.774748 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.774763 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.774772 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:42Z","lastTransitionTime":"2025-10-09T15:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.784044 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:18:42 crc kubenswrapper[4719]: E1009 15:18:42.784165 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-09 15:18:50.784145254 +0000 UTC m=+36.293856539 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.784212 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.784305 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:18:42 crc kubenswrapper[4719]: E1009 15:18:42.784325 4719 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 15:18:42 crc kubenswrapper[4719]: E1009 15:18:42.784384 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-09 15:18:50.784369261 +0000 UTC m=+36.294080546 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 15:18:42 crc kubenswrapper[4719]: E1009 15:18:42.784485 4719 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 15:18:42 crc kubenswrapper[4719]: E1009 15:18:42.784550 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 15:18:50.784534667 +0000 UTC m=+36.294245952 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.815779 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834f7996-d1ce-470d-a1a5-0de5da2460d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a1b9cee40ae4a30df34bde2f4dd9436cf3ff915293ea1e1431e8abd581423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37949ed51a379d34fab6bf766fd7e35d376af137b55b6f12e8bef8495ab5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d165d88c0d88fb4b080bf594e5258fb74f33c521332c85bb9f5ef5b5d9fdab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed935aaa4
f5122234731f8c22ec3d4ffeba8b500bfb51bf97414f39438da2f68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:42Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.854306 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcba5218f1503f2b3776c66a92350381ee11aee043429d72c70b7ae63d7bb29f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63d557f6902338a7aa577f2bbee6a159369d62be9724425a6e6a355f08586601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:42Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.876320 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.876373 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.876385 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.876399 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.876417 4719 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:42Z","lastTransitionTime":"2025-10-09T15:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.884854 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.884899 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:18:42 crc kubenswrapper[4719]: E1009 15:18:42.884963 4719 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 15:18:42 crc kubenswrapper[4719]: E1009 15:18:42.884979 4719 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 15:18:42 crc kubenswrapper[4719]: E1009 15:18:42.884982 4719 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 15:18:42 crc kubenswrapper[4719]: E1009 
15:18:42.884994 4719 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 15:18:42 crc kubenswrapper[4719]: E1009 15:18:42.884997 4719 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 15:18:42 crc kubenswrapper[4719]: E1009 15:18:42.885004 4719 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 15:18:42 crc kubenswrapper[4719]: E1009 15:18:42.885039 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-09 15:18:50.885026887 +0000 UTC m=+36.394738172 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 15:18:42 crc kubenswrapper[4719]: E1009 15:18:42.885054 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-09 15:18:50.885049117 +0000 UTC m=+36.394760402 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.898382 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c0cb44eacc810e970c6b32e259ae1841fb312f20576d34ac183089a91000337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2246a5642d4fa1b9e182af8a19980e6a76aea32cc9669e7d30185d6672435b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59a6c607affaa28a2c8af16a995f53baf008a1efd42061bb5e3c01b5acac636a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a911f9dd87ad57268bacc90fd4b3821f54d4ad91fcdde7066d3706aa8feb4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80fe00a302db3a637794464b7cccf806ad3fa8efbdaea15f965ea41276188d1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5228008f4bbd33c0b6ea86640368c02b6cdf301b43494a232b37fa73ea72e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19670744f4667f0104e6d2d4688e32b9acb7cbdc2946819ca192f259d6a1d8a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b32ef1116f7849b70aa3607bb4fc7b4bff9f58843c24742fc94aed9bb9a68e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:42Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.933975 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mtpbz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb84e765-e2c6-410b-9681-7c14d88a2537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be972d47f7ee97f2f54daa73198a83327281f9e9b2b1500205a17cf11518989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfpkf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mtpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:42Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.978510 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.978550 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.978561 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.978577 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 
15:18:42.978587 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:42Z","lastTransitionTime":"2025-10-09T15:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:42 crc kubenswrapper[4719]: I1009 15:18:42.983642 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0
,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",
\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri
-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b
90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:42Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.014154 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:43Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.056013 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:43Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.081168 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.081197 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.081205 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:43 crc 
kubenswrapper[4719]: I1009 15:18:43.081218 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.081226 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:43Z","lastTransitionTime":"2025-10-09T15:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.096874 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:43Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.132748 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a62b8142b6b6fd0cf9028590f2abce788d8e381c2303d7a824dd055ab02b94db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:43Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.161136 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:18:43 crc kubenswrapper[4719]: E1009 15:18:43.161297 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.161184 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:18:43 crc kubenswrapper[4719]: E1009 15:18:43.161434 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.161147 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:18:43 crc kubenswrapper[4719]: E1009 15:18:43.161504 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.175878 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40166218-2855-45ef-b0e1-0fed4e3e2fde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01dc78fd80a15fa8151128108a351c6af42928695fdd745dea50e08fae6570ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc62bf1b49b2a4b402b2fcca31f9fe1663b36f463a0722a5876b2ca2a8e023ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://51b618cce898bc89b4b07b6f7fd73567d719ad9c9dc3a2a3959074bc2c2fe11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 15:18:34.791014 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 15:18:34.791138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:34.792247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2043225324/tls.crt::/tmp/serving-cert-2043225324/tls.key\\\\\\\"\\\\nI1009 15:18:35.029901 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 15:18:35.033427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 15:18:35.033448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 15:18:35.033473 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 15:18:35.033481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 15:18:35.045206 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 15:18:35.045257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045266 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045273 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 15:18:35.045277 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 15:18:35.045280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 15:18:35.045285 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1009 15:18:35.045414 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1009 15:18:35.048459 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8544f7060b0b2c2885dcbdffbd744be5f028d8df543732ba79eb7cd3911afca6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:43Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.183444 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.183469 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.183477 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.183489 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.183498 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:43Z","lastTransitionTime":"2025-10-09T15:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.213017 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ae19d921bad282d96efffc7f2f7cfdc4b70f95932e69b9955ad1439a936d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b3908283c24f180df8f6a04d52c46e7252cdfd4f0587f7cccf3e9a0f37127a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:43Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.255874 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcd9a40e12b42902018ecf483e6b42dfa415e4d6e282fc57eacbf507922dbd45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://275fc
3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:43Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.286048 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.286091 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.286103 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.286121 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.286133 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:43Z","lastTransitionTime":"2025-10-09T15:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.296945 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:43Z 
is after 2025-08-24T17:21:41Z" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.388880 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.388933 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.388945 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.388960 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.389209 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:43Z","lastTransitionTime":"2025-10-09T15:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.394983 4719 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.395642 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.416075 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.429448 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:43Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.441654 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:43Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.453624 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:43Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.463295 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a62b8142b6b6fd0cf9028590f2abce788d8e381c2303d7a824dd055ab02b94db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:43Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.490852 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.490890 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.490900 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.490915 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.490925 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:43Z","lastTransitionTime":"2025-10-09T15:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.502650 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:43Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.536027 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ae19d921bad282d96efffc7f2f7cfdc4b70f95932e69b9955ad1439a936d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b3908283c24f180df8f6a04d52c46e7252cdfd
4f0587f7cccf3e9a0f37127a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:43Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.574948 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcd9a40e12b42902018ecf483e6b42dfa415e4d6e282fc57eacbf507922dbd45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://275fc
3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:43Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.593424 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.593468 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.593478 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.593496 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.593514 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:43Z","lastTransitionTime":"2025-10-09T15:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.615693 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:43Z 
is after 2025-08-24T17:21:41Z" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.654675 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40166218-2855-45ef-b0e1-0fed4e3e2fde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01dc78fd80a15fa8151128108a351c6af42928695fdd745dea50e08fae6570ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc62bf1b49b2a4b402b2fcca31f9fe1663b36f463a0722a5876b2ca2a8e023ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://51b618cce898bc89b4b07b6f7fd73567d719ad9c9dc3a2a3959074bc2c2fe11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 15:18:34.791014 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 15:18:34.791138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:34.792247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2043225324/tls.crt::/tmp/serving-cert-2043225324/tls.key\\\\\\\"\\\\nI1009 15:18:35.029901 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 15:18:35.033427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 15:18:35.033448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 15:18:35.033473 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 15:18:35.033481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 15:18:35.045206 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 15:18:35.045257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045266 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045273 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 15:18:35.045277 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 15:18:35.045280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 15:18:35.045285 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1009 15:18:35.045414 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1009 15:18:35.048459 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8544f7060b0b2c2885dcbdffbd744be5f028d8df543732ba79eb7cd3911afca6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:43Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.694217 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab52b5e80f5f2de90ce76b34b21de83b3880ed13436c566f2c460bed1908576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:43Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.695947 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.696054 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.696083 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.696121 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.696143 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:43Z","lastTransitionTime":"2025-10-09T15:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.733048 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2199f3e31d7adde6f0b1aaf29a7f3da80a45d8a1f11908fd93b47d737b00872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]
}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:43Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.772877 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcba5218f1503f2b3776c66a92350381ee11aee043429d72c70b7ae63d7bb29f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkub
e-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63d557f6902338a7aa577f2bbee6a159369d62be9724425a6e6a355f08586601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:43Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.798979 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.799023 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.799034 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.799050 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.799062 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:43Z","lastTransitionTime":"2025-10-09T15:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.819708 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c0cb44eacc810e970c6b32e259ae1841fb312f20576d34ac183089a91000337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2246a5642d4fa1b9e182af8a19980e6a76aea32cc9669e7d30185d6672435b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59a6c607affaa28a2c8af16a995f53baf008a1efd42061bb5e3c01b5acac636a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a911f9dd87ad57268bacc90fd4b3821f54d4ad91fcdde7066d3706aa8feb4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80fe00a302db3a637794464b7cccf806ad3fa8efbdaea15f965ea41276188d1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5228008f4bbd33c0b6ea86640368c02b6cdf301b43494a232b37fa73ea72e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19670744f4667f0104e6d2d4688e32b9acb7cbdc2946819ca192f259d6a1d8a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b32ef1116f7849b70aa3607bb4fc7b4bff9f58843c24742fc94aed9bb9a68e\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:43Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.852723 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mtpbz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb84e765-e2c6-410b-9681-7c14d88a2537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be972d47f7ee97f2
f54daa73198a83327281f9e9b2b1500205a17cf11518989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfpkf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mtpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:43Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.895597 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"834f7996-d1ce-470d-a1a5-0de5da2460d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a1b9cee40ae4a30df34bde2f4dd9436cf3ff915293ea1e1431e8abd581423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37949ed51a379d34fab6bf766fd7e35d376af137b55b6f12e8bef8495ab5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d165d88c0d88fb4b080bf594e5258fb74f33c521332c85bb9f5ef5b5d9fdab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed935aaa4f5122234731f8c22ec3d4ffeba8b500bfb51bf97414f39438da2f68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:43Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.903145 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.903175 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.903184 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.903198 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:43 crc kubenswrapper[4719]: I1009 15:18:43.903212 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:43Z","lastTransitionTime":"2025-10-09T15:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.004939 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.004992 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.005005 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.005020 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.005030 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:44Z","lastTransitionTime":"2025-10-09T15:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.107725 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.107770 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.107781 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.107799 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.107812 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:44Z","lastTransitionTime":"2025-10-09T15:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.210447 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.210492 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.210503 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.210518 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.210527 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:44Z","lastTransitionTime":"2025-10-09T15:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.312588 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.312632 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.312643 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.312663 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.312674 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:44Z","lastTransitionTime":"2025-10-09T15:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.398037 4719 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.414841 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.414871 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.414885 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.414919 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.414932 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:44Z","lastTransitionTime":"2025-10-09T15:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.516982 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.517150 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.517165 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.517182 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.517193 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:44Z","lastTransitionTime":"2025-10-09T15:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.619425 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.619462 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.619471 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.619486 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.619495 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:44Z","lastTransitionTime":"2025-10-09T15:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.721785 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.721826 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.721835 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.721849 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.721858 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:44Z","lastTransitionTime":"2025-10-09T15:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.823918 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.823951 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.823960 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.823974 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.823983 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:44Z","lastTransitionTime":"2025-10-09T15:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.926392 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.926431 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.926443 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.926460 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:44 crc kubenswrapper[4719]: I1009 15:18:44.926472 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:44Z","lastTransitionTime":"2025-10-09T15:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.028837 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.028875 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.028886 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.028900 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.028910 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:45Z","lastTransitionTime":"2025-10-09T15:18:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.131643 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.131684 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.131693 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.131707 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.131717 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:45Z","lastTransitionTime":"2025-10-09T15:18:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.160940 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.161010 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.161048 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:18:45 crc kubenswrapper[4719]: E1009 15:18:45.161176 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:18:45 crc kubenswrapper[4719]: E1009 15:18:45.161485 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:18:45 crc kubenswrapper[4719]: E1009 15:18:45.161564 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.175326 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.189704 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.200847 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a62b8142b6b6fd0cf9028590f2abce788d8e381c2303d7a824dd055ab02b94db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.249619 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.252887 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.253411 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.253443 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.255689 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:45Z","lastTransitionTime":"2025-10-09T15:18:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.276962 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.287787 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.305804 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcd9a40e12b42902018ecf483e6b42dfa415e4d6e282fc57eacbf507922dbd45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://275fc
3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.317132 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.330467 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40166218-2855-45ef-b0e1-0fed4e3e2fde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01dc78fd80a15fa8151128108a351c6af42928695fdd745dea50e08fae6570ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc62bf1b49b2a4b402b2fcca31f9fe1663b36f463a0722a5876b2ca2a8e023ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://51b618cce898bc89b4b07b6f7fd73567d719ad9c9dc3a2a3959074bc2c2fe11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 15:18:34.791014 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 15:18:34.791138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:34.792247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2043225324/tls.crt::/tmp/serving-cert-2043225324/tls.key\\\\\\\"\\\\nI1009 15:18:35.029901 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 15:18:35.033427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 15:18:35.033448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 15:18:35.033473 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 15:18:35.033481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 15:18:35.045206 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 15:18:35.045257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045266 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045273 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 15:18:35.045277 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 15:18:35.045280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 15:18:35.045285 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1009 15:18:35.045414 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1009 15:18:35.048459 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8544f7060b0b2c2885dcbdffbd744be5f028d8df543732ba79eb7cd3911afca6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.342033 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ae19d921bad282d96efffc7f2f7cfdc4b70f95932e69b9955ad1439a936d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf
09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b3908283c24f180df8f6a04d52c46e7252cdfd4f0587f7cccf3e9a0f37127a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.352953 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2199f3e31d7adde6f0b1aaf29a7f3da80a45d8a1f11908fd93b47d737b00872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.359463 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.359560 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.359575 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.360611 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.360645 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:45Z","lastTransitionTime":"2025-10-09T15:18:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.365214 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab52b5e80f5f2de90ce76b34b21de83b3880ed13436c566f2c460bed1908576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.376602 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcba5218f1503f2b3776c66a92350381ee11aee043429d72c70b7ae63d7bb29f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63d557f6902338a7aa577f2bbee6a159369d62be9724425a6e6a355f08586601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.398073 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c0cb44eacc810e970c6b32e259ae1841fb312f20576d34ac183089a91000337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2246a5642d4fa1b9e182af8a19980e6a76aea32cc9669e7d30185d6672435b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59a6c607affaa28a2c8af16a995f53baf008a1efd42061bb5e3c01b5acac636a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a911f9dd87ad57268bacc90fd4b3821f54d4ad91fcdde7066d3706aa8feb4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80fe00a302db3a637794464b7cccf806ad3fa8efbdaea15f965ea41276188d1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5228008f4bbd33c0b6ea86640368c02b6cdf301b43494a232b37fa73ea72e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19670744f4667f0104e6d2d4688e32b9acb7cbdc2946819ca192f259d6a1d8a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b32ef1116f7849b70aa3607bb4fc7b4bff9f58843c24742fc94aed9bb9a68e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.402724 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zv8jk_fea6a48c-769c-41bf-95ce-649cc31eb4e5/ovnkube-controller/0.log" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.404768 4719 generic.go:334] "Generic (PLEG): container finished" podID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerID="19670744f4667f0104e6d2d4688e32b9acb7cbdc2946819ca192f259d6a1d8a8" exitCode=1 Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.404798 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" event={"ID":"fea6a48c-769c-41bf-95ce-649cc31eb4e5","Type":"ContainerDied","Data":"19670744f4667f0104e6d2d4688e32b9acb7cbdc2946819ca192f259d6a1d8a8"} Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.405374 4719 scope.go:117] "RemoveContainer" containerID="19670744f4667f0104e6d2d4688e32b9acb7cbdc2946819ca192f259d6a1d8a8" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.409773 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mtpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb84e765-e2c6-410b-9681-7c14d88a2537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be972d47f7ee97f2f54daa73198a83327281f9e9b2b1500205a17cf11518989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfpkf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mtpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.430418 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834f7996-d1ce-470d-a1a5-0de5da2460d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a1b9cee40ae4a30df34bde2f4dd9436cf3ff915293ea1e1431e8abd581423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37949ed51a379d34fab6bf766fd7e35d376af137b55b6f12e8bef8495ab5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d165d88c0d88fb4b080bf594e5258fb74f33c521332c85bb9f5ef5b5d9fdab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed935aaa4f5122234731f8c22ec3d4ffeba8b500bfb51bf97414f39438da2f68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.441270 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab52b5e80f5f2de90ce76b34b21de83b3880ed13436c566f2c460bed1908576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.451427 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2199f3e31d7adde6f0b1aaf29a7f3da80a45d8a1f11908fd93b47d737b00872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.463140 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.463178 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.463189 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.463204 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.463217 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:45Z","lastTransitionTime":"2025-10-09T15:18:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.463889 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834f7996-d1ce-470d-a1a5-0de5da2460d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a1b9cee40ae4a30df34bde2f4dd9436cf3ff915293ea1e1431e8abd581423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37949ed51a
379d34fab6bf766fd7e35d376af137b55b6f12e8bef8495ab5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d165d88c0d88fb4b080bf594e5258fb74f33c521332c85bb9f5ef5b5d9fdab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed935aaa4f5122234731f8c22ec3d4ffeba8b500bfb51bf97414f39438da2f68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.475784 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcba5218f1503f2b3776c66a92350381ee11aee043429d72c70b7ae63d7bb29f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63d557f6902338a7aa577f2bbee6a159369d62be9724425a6e6a355f08586601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.499908 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c0cb44eacc810e970c6b32e259ae1841fb312f20576d34ac183089a91000337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2246a5642d4fa1b9e182af8a19980e6a76aea32cc9669e7d30185d6672435b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59a6c607affaa28a2c8af16a995f53baf008a1efd42061bb5e3c01b5acac636a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a911f9dd87ad57268bacc90fd4b3821f54d4ad91fcdde7066d3706aa8feb4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80fe00a302db3a637794464b7cccf806ad3fa8efbdaea15f965ea41276188d1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5228008f4bbd33c0b6ea86640368c02b6cdf301b43494a232b37fa73ea72e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19670744f4667f0104e6d2d4688e32b9acb7cbdc2946819ca192f259d6a1d8a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19670744f4667f0104e6d2d4688e32b9acb7cbdc2946819ca192f259d6a1d8a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T15:18:45Z\\\",\\\"message\\\":\\\"1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1009 15:18:44.902141 6028 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1009 15:18:44.902167 6028 handler.go:190] Sending *v1.Namespace 
event handler 5 for removal\\\\nI1009 15:18:44.902191 6028 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1009 15:18:44.902199 6028 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1009 15:18:44.929737 6028 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1009 15:18:44.929804 6028 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1009 15:18:44.929829 6028 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1009 15:18:44.929846 6028 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1009 15:18:44.929854 6028 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1009 15:18:44.929869 6028 factory.go:656] Stopping watch factory\\\\nI1009 15:18:44.929882 6028 ovnkube.go:599] Stopped ovnkube\\\\nI1009 15:18:44.929910 6028 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1009 15:18:44.929921 6028 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1009 15:18:44.929927 6028 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1009 15:18:44.929932 6028 handler.go:208] Removed *v1.Pod event handler 
3\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b32ef1116f7849b70aa3607bb4fc7b4bff9f58843c24742fc94aed9bb9a68e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a08
4094f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.511318 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mtpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb84e765-e2c6-410b-9681-7c14d88a2537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be972d47f7ee97f2f54daa73198a83327281f9e9b2b1500205a17cf11518989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfpkf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mtpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.532530 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15
:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.544181 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.555951 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.566743 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.566805 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.566824 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:45 crc 
kubenswrapper[4719]: I1009 15:18:45.566842 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.566860 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:45Z","lastTransitionTime":"2025-10-09T15:18:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.569377 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.580658 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a62b8142b6b6fd0cf9028590f2abce788d8e381c2303d7a824dd055ab02b94db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.598831 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40166218-2855-45ef-b0e1-0fed4e3e2fde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01dc78fd80a15fa8151128108a351c6af42928695fdd745dea50e08fae6570ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc62bf1b49b2a4b402b2fcca31f9fe1663b36f463a0722a5876b2ca2a8e023ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://51b618cce898bc89b4b07b6f7fd73567d719ad9c9dc3a2a3959074bc2c2fe11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 15:18:34.791014 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 15:18:34.791138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:34.792247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2043225324/tls.crt::/tmp/serving-cert-2043225324/tls.key\\\\\\\"\\\\nI1009 15:18:35.029901 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 15:18:35.033427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 15:18:35.033448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 15:18:35.033473 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 15:18:35.033481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 15:18:35.045206 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 15:18:35.045257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045266 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045273 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 15:18:35.045277 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 15:18:35.045280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 15:18:35.045285 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1009 15:18:35.045414 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1009 15:18:35.048459 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8544f7060b0b2c2885dcbdffbd744be5f028d8df543732ba79eb7cd3911afca6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.608813 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ae19d921bad282d96efffc7f2f7cfdc4b70f95932e69b9955ad1439a936d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf
09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b3908283c24f180df8f6a04d52c46e7252cdfd4f0587f7cccf3e9a0f37127a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.624092 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcd9a40e12b42902018ecf483e6b42dfa415e4d6e282fc57eacbf507922dbd45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2025-10-09T15:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"sys
tem-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.637900 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe574
2b517a7783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.668702 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.668740 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.668749 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.668769 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.668779 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:45Z","lastTransitionTime":"2025-10-09T15:18:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.771379 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.771431 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.771474 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.771493 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.771505 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:45Z","lastTransitionTime":"2025-10-09T15:18:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.873663 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.873906 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.874022 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.874105 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.874181 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:45Z","lastTransitionTime":"2025-10-09T15:18:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.976742 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.976960 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.977023 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.977089 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:45 crc kubenswrapper[4719]: I1009 15:18:45.977178 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:45Z","lastTransitionTime":"2025-10-09T15:18:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.079584 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.079867 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.079877 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.079891 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.079900 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:46Z","lastTransitionTime":"2025-10-09T15:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.182441 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.182476 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.182485 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.182500 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.182509 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:46Z","lastTransitionTime":"2025-10-09T15:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.284749 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.284785 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.284794 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.284807 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.284818 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:46Z","lastTransitionTime":"2025-10-09T15:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.387340 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.387445 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.387458 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.387474 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.387483 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:46Z","lastTransitionTime":"2025-10-09T15:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.408922 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zv8jk_fea6a48c-769c-41bf-95ce-649cc31eb4e5/ovnkube-controller/1.log" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.409490 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zv8jk_fea6a48c-769c-41bf-95ce-649cc31eb4e5/ovnkube-controller/0.log" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.412116 4719 generic.go:334] "Generic (PLEG): container finished" podID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerID="0b0968bec2451c23d394ca65074ad152d78a54bd1dda603a35fb14f5e67af7ec" exitCode=1 Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.412199 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" event={"ID":"fea6a48c-769c-41bf-95ce-649cc31eb4e5","Type":"ContainerDied","Data":"0b0968bec2451c23d394ca65074ad152d78a54bd1dda603a35fb14f5e67af7ec"} Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.412246 4719 scope.go:117] "RemoveContainer" containerID="19670744f4667f0104e6d2d4688e32b9acb7cbdc2946819ca192f259d6a1d8a8" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.412909 4719 scope.go:117] "RemoveContainer" containerID="0b0968bec2451c23d394ca65074ad152d78a54bd1dda603a35fb14f5e67af7ec" Oct 09 15:18:46 crc kubenswrapper[4719]: E1009 15:18:46.413190 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-zv8jk_openshift-ovn-kubernetes(fea6a48c-769c-41bf-95ce-649cc31eb4e5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.433494 4719 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:46Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.444738 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a62b8142b6b6fd0cf9028590f2abce788d8e381c2303d7a824dd055ab02b94db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:46Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.464332 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:46Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.475535 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:46Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.486059 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:46Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.489513 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.489560 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.489573 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:46 crc 
kubenswrapper[4719]: I1009 15:18:46.489591 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.489603 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:46Z","lastTransitionTime":"2025-10-09T15:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.499963 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:46Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.517001 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40166218-2855-45ef-b0e1-0fed4e3e2fde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01dc78fd80a15fa8151128108a351c6af42928695fdd745dea50e08fae6570ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc62bf1b49b2a4b402b2fcca31f9fe1663b36f463a0722a5876b2ca2a8e023ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://51b618cce898bc89b4b07b6f7fd73567d719ad9c9dc3a2a3959074bc2c2fe11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 15:18:34.791014 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 15:18:34.791138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:34.792247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2043225324/tls.crt::/tmp/serving-cert-2043225324/tls.key\\\\\\\"\\\\nI1009 15:18:35.029901 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 15:18:35.033427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 15:18:35.033448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 15:18:35.033473 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 15:18:35.033481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 15:18:35.045206 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 15:18:35.045257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045266 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045273 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 15:18:35.045277 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 15:18:35.045280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 15:18:35.045285 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1009 15:18:35.045414 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1009 15:18:35.048459 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8544f7060b0b2c2885dcbdffbd744be5f028d8df543732ba79eb7cd3911afca6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:46Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.529949 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ae19d921bad282d96efffc7f2f7cfdc4b70f95932e69b9955ad1439a936d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf
09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b3908283c24f180df8f6a04d52c46e7252cdfd4f0587f7cccf3e9a0f37127a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:46Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.542724 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcd9a40e12b42902018ecf483e6b42dfa415e4d6e282fc57eacbf507922dbd45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2025-10-09T15:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"sys
tem-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:46Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.554537 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab52b5e80f5f2de90ce76b34b21de83b3880ed13436c566f2c460bed1908576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01
a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:46Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.566693 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2199f3e31d7adde6f0b1aaf29a7f3da80a45d8a1f11908fd93b47d737b00872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T15:18:46Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.585923 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c0cb44eacc810e970c6b32e259ae1841fb312f20576d34ac183089a91000337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2246a5642d4fa1b9e182af8a19980e6a76aea32cc9669e7d30185d6672435b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59a6c607affaa28a2c8af16a995f53baf008a1efd42061bb5e3c01b5acac636a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a911f9dd87ad57268bacc90fd4b3821f54d4ad91fcdde7066d3706aa8feb4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80fe00a302db3a637794464b7cccf806ad3fa8efbdaea15f965ea41276188d1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5228008f4bbd33c0b6ea86640368c02b6cdf301b43494a232b37fa73ea72e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b0968bec2451c23d394ca65074ad152d78a54bd1dda603a35fb14f5e67af7ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19670744f4667f0104e6d2d4688e32b9acb7cbdc2946819ca192f259d6a1d8a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T15:18:45Z\\\",\\\"message\\\":\\\"1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1009 15:18:44.902141 6028 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1009 15:18:44.902167 6028 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1009 15:18:44.902191 6028 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1009 
15:18:44.902199 6028 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1009 15:18:44.929737 6028 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1009 15:18:44.929804 6028 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1009 15:18:44.929829 6028 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1009 15:18:44.929846 6028 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1009 15:18:44.929854 6028 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1009 15:18:44.929869 6028 factory.go:656] Stopping watch factory\\\\nI1009 15:18:44.929882 6028 ovnkube.go:599] Stopped ovnkube\\\\nI1009 15:18:44.929910 6028 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1009 15:18:44.929921 6028 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1009 15:18:44.929927 6028 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1009 15:18:44.929932 6028 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b0968bec2451c23d394ca65074ad152d78a54bd1dda603a35fb14f5e67af7ec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T15:18:46Z\\\",\\\"message\\\":\\\" 8bc1afc2-8724-4135-84df-aee09f23af4c 4514 0 2025-02-23 05:12:24 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-operator] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mco-proxy-tls 
service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0077a47eb \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: machine-config-operator,},ClusterIP:10.217.4.183,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.183],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngre\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b32ef1116f7849b70aa3607bb4fc7b4bff9f58843c24742fc94aed9bb9a68e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:46Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.592048 4719 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.592085 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.592095 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.592110 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.592121 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:46Z","lastTransitionTime":"2025-10-09T15:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.596931 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mtpbz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb84e765-e2c6-410b-9681-7c14d88a2537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be972d47f7ee97f2f54daa73198a83327281f9e9b2b1500205a17cf11518989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfpkf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mtpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:46Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.608491 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834f7996-d1ce-470d-a1a5-0de5da2460d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a1b9cee40ae4a30df34bde2f4dd9436cf3ff915293ea1e1431e8abd581423\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37949ed51a379d34fab6bf766fd7e35d376af137b55b6f12e8bef8495ab5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d165d88c0d88fb4b080bf594e5258fb74f33c521332c85bb9f5ef5b5d9fdab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed935aaa4f5122234731f8c22ec3d4ffeba8b500bfb51bf97414f39438da2f68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:46Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.619288 4719 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcba5218f1503f2b3776c66a92350381ee11aee043429d72c70b7ae63d7bb29f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63d557f6902338a7aa577f2bbee6a159369d62be9724425a6e6a355f08586601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa
41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:46Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.693951 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.693985 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.693993 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.694006 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.694014 4719 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:46Z","lastTransitionTime":"2025-10-09T15:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.795833 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.795862 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.795870 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.795881 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.795890 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:46Z","lastTransitionTime":"2025-10-09T15:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.897683 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.897710 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.897717 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.897729 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:46 crc kubenswrapper[4719]: I1009 15:18:46.897736 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:46Z","lastTransitionTime":"2025-10-09T15:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.000782 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.000949 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.000967 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.000990 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.001009 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:47Z","lastTransitionTime":"2025-10-09T15:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.104179 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.104255 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.104277 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.104304 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.104327 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:47Z","lastTransitionTime":"2025-10-09T15:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.136224 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vdgtp"] Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.137451 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vdgtp" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.141447 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.144172 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.151688 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834f7996-d1ce-470d-a1a5-0de5da2460d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a1b9cee40ae4a30df34bde2f4dd9436cf3ff915293ea1e1431e8abd581423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37949ed51a379d34fab6bf766fd7e35d376af137b55b6f12e8bef8495ab5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d165d88c0d88fb4b080bf594e5258fb74f33c521332c85bb9f5ef5b5d9fdab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed935aaa4f5122234731f8c22ec3d4ffeba8b500bfb51bf97414f39438da2f68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:47Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.160538 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.160580 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:18:47 crc kubenswrapper[4719]: E1009 15:18:47.160623 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.160660 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:18:47 crc kubenswrapper[4719]: E1009 15:18:47.160683 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:18:47 crc kubenswrapper[4719]: E1009 15:18:47.160807 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.161916 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kddxt\" (UniqueName: \"kubernetes.io/projected/12b565dc-6ccc-4404-95f7-c8cf09f91802-kube-api-access-kddxt\") pod \"ovnkube-control-plane-749d76644c-vdgtp\" (UID: \"12b565dc-6ccc-4404-95f7-c8cf09f91802\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vdgtp" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.161970 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/12b565dc-6ccc-4404-95f7-c8cf09f91802-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-vdgtp\" (UID: \"12b565dc-6ccc-4404-95f7-c8cf09f91802\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vdgtp" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.161986 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/12b565dc-6ccc-4404-95f7-c8cf09f91802-env-overrides\") pod \"ovnkube-control-plane-749d76644c-vdgtp\" (UID: \"12b565dc-6ccc-4404-95f7-c8cf09f91802\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vdgtp" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.162009 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/12b565dc-6ccc-4404-95f7-c8cf09f91802-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-vdgtp\" (UID: \"12b565dc-6ccc-4404-95f7-c8cf09f91802\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vdgtp" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.165833 
4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcba5218f1503f2b3776c66a92350381ee11aee043429d72c70b7ae63d7bb29f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63d557f6902338a7aa577f2bbee6a159369d62be9724425a6e6a355f08586601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773
257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:47Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.182368 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c0cb44eacc810e970c6b32e259ae1841fb312f20576d34ac183089a91000337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2246a5642d4fa1b9e182af8a19980e6a76aea32cc9669e7d30185d6672435b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59a6c607affaa28a2c8af16a995f53baf008a1efd42061bb5e3c01b5acac636a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a911f9dd87ad57268bacc90fd4b3821f54d4ad91fcdde7066d3706aa8feb4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80fe00a302db3a637794464b7cccf806ad3fa8efbdaea15f965ea41276188d1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5228008f4bbd33c0b6ea86640368c02b6cdf301b43494a232b37fa73ea72e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b0968bec2451c23d394ca65074ad152d78a54bd1dda603a35fb14f5e67af7ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19670744f4667f0104e6d2d4688e32b9acb7cbdc2946819ca192f259d6a1d8a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T15:18:45Z\\\",\\\"message\\\":\\\"1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1009 15:18:44.902141 6028 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1009 15:18:44.902167 6028 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1009 15:18:44.902191 6028 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1009 
15:18:44.902199 6028 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1009 15:18:44.929737 6028 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1009 15:18:44.929804 6028 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1009 15:18:44.929829 6028 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1009 15:18:44.929846 6028 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1009 15:18:44.929854 6028 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1009 15:18:44.929869 6028 factory.go:656] Stopping watch factory\\\\nI1009 15:18:44.929882 6028 ovnkube.go:599] Stopped ovnkube\\\\nI1009 15:18:44.929910 6028 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1009 15:18:44.929921 6028 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1009 15:18:44.929927 6028 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1009 15:18:44.929932 6028 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b0968bec2451c23d394ca65074ad152d78a54bd1dda603a35fb14f5e67af7ec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T15:18:46Z\\\",\\\"message\\\":\\\" 8bc1afc2-8724-4135-84df-aee09f23af4c 4514 0 2025-02-23 05:12:24 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-operator] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mco-proxy-tls 
service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0077a47eb \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: machine-config-operator,},ClusterIP:10.217.4.183,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.183],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngre\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b32ef1116f7849b70aa3607bb4fc7b4bff9f58843c24742fc94aed9bb9a68e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:47Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.191186 4719 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-image-registry/node-ca-mtpbz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb84e765-e2c6-410b-9681-7c14d88a2537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be972d47f7ee97f2f54daa73198a83327281f9e9b2b1500205a17cf11518989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfpkf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mtpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:47Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.206182 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.206244 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.206257 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.206276 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.206288 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:47Z","lastTransitionTime":"2025-10-09T15:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.208496 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:47Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.222036 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:47Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.232856 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:47Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.243639 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:47Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.253281 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a62b8142b6b6fd0cf9028590f2abce788d8e381c2303d7a824dd055ab02b94db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:47Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.263217 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/12b565dc-6ccc-4404-95f7-c8cf09f91802-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-vdgtp\" (UID: \"12b565dc-6ccc-4404-95f7-c8cf09f91802\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vdgtp" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.263271 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/12b565dc-6ccc-4404-95f7-c8cf09f91802-env-overrides\") pod \"ovnkube-control-plane-749d76644c-vdgtp\" (UID: \"12b565dc-6ccc-4404-95f7-c8cf09f91802\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vdgtp" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.263298 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/12b565dc-6ccc-4404-95f7-c8cf09f91802-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-vdgtp\" (UID: \"12b565dc-6ccc-4404-95f7-c8cf09f91802\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vdgtp" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.263324 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kddxt\" (UniqueName: 
\"kubernetes.io/projected/12b565dc-6ccc-4404-95f7-c8cf09f91802-kube-api-access-kddxt\") pod \"ovnkube-control-plane-749d76644c-vdgtp\" (UID: \"12b565dc-6ccc-4404-95f7-c8cf09f91802\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vdgtp" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.263850 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/12b565dc-6ccc-4404-95f7-c8cf09f91802-env-overrides\") pod \"ovnkube-control-plane-749d76644c-vdgtp\" (UID: \"12b565dc-6ccc-4404-95f7-c8cf09f91802\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vdgtp" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.263941 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/12b565dc-6ccc-4404-95f7-c8cf09f91802-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-vdgtp\" (UID: \"12b565dc-6ccc-4404-95f7-c8cf09f91802\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vdgtp" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.268328 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40166218-2855-45ef-b0e1-0fed4e3e2fde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01dc78fd80a15fa8151128108a351c6af42928695fdd745dea50e08fae6570ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc62bf1b49b2a4b402b2fcca31f9fe1663b36f463a0722a5876b2ca2a8e023ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51b618cce898bc89b4b07b6f7fd73567d719ad9c9dc3a2a3959074bc2c2fe11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 15:18:34.791014 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 15:18:34.791138 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:34.792247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2043225324/tls.crt::/tmp/serving-cert-2043225324/tls.key\\\\\\\"\\\\nI1009 15:18:35.029901 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 15:18:35.033427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 15:18:35.033448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 15:18:35.033473 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 15:18:35.033481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 15:18:35.045206 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 15:18:35.045257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045266 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045273 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 15:18:35.045277 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 15:18:35.045280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 15:18:35.045285 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1009 15:18:35.045414 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1009 15:18:35.048459 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8544f7060b0b2c2885dcbdffbd744be5f028d8df543732ba79eb7cd3911afca6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e
7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:47Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.270805 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/12b565dc-6ccc-4404-95f7-c8cf09f91802-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-vdgtp\" (UID: \"12b565dc-6ccc-4404-95f7-c8cf09f91802\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vdgtp" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.281912 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kddxt\" (UniqueName: \"kubernetes.io/projected/12b565dc-6ccc-4404-95f7-c8cf09f91802-kube-api-access-kddxt\") pod \"ovnkube-control-plane-749d76644c-vdgtp\" (UID: \"12b565dc-6ccc-4404-95f7-c8cf09f91802\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vdgtp" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.282926 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ae19d921bad282d96efffc7f2f7cfdc4b70f95932e69b9955ad1439a936d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b3908283c24f180df8f6a04d52c46e7252cdfd
4f0587f7cccf3e9a0f37127a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:47Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.296937 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcd9a40e12b42902018ecf483e6b42dfa415e4d6e282fc57eacbf507922dbd45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://275fc
3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:47Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.308055 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.308089 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.308097 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.308112 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.308124 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:47Z","lastTransitionTime":"2025-10-09T15:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.309785 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:47Z 
is after 2025-08-24T17:21:41Z" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.320790 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vdgtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b565dc-6ccc-4404-95f7-c8cf09f91802\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vdgtp\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:47Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.331949 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab52b5e80f5f2de90ce76b34b21de83b3880ed13436c566f2c460bed1908576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:47Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.342468 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2199f3e31d7adde6f0b1aaf29a7f3da80a45d8a1f11908fd93b47d737b00872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:47Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.410655 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.410693 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.410702 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.410715 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.410747 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:47Z","lastTransitionTime":"2025-10-09T15:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.416398 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zv8jk_fea6a48c-769c-41bf-95ce-649cc31eb4e5/ovnkube-controller/1.log" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.421192 4719 scope.go:117] "RemoveContainer" containerID="0b0968bec2451c23d394ca65074ad152d78a54bd1dda603a35fb14f5e67af7ec" Oct 09 15:18:47 crc kubenswrapper[4719]: E1009 15:18:47.421493 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-zv8jk_openshift-ovn-kubernetes(fea6a48c-769c-41bf-95ce-649cc31eb4e5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.430242 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vdgtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b565dc-6ccc-4404-95f7-c8cf09f91802\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vdgtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:47Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.441574 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40166218-2855-45ef-b0e1-0fed4e3e2fde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01dc78fd80a15fa8151128108a351c6af42928695fdd745dea50e08fae6570ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc62bf1b49b2a4b402b2fcca31f9fe1663b36f463a0722a5876b2ca2a8e023ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://51b618cce898bc89b4b07b6f7fd73567d719ad9c9dc3a2a3959074bc2c2fe11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 15:18:34.791014 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 15:18:34.791138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:34.792247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2043225324/tls.crt::/tmp/serving-cert-2043225324/tls.key\\\\\\\"\\\\nI1009 15:18:35.029901 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 15:18:35.033427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 15:18:35.033448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 15:18:35.033473 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 15:18:35.033481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 15:18:35.045206 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 15:18:35.045257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045266 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045273 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 15:18:35.045277 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 15:18:35.045280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 15:18:35.045285 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1009 15:18:35.045414 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1009 15:18:35.048459 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8544f7060b0b2c2885dcbdffbd744be5f028d8df543732ba79eb7cd3911afca6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:47Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.450946 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ae19d921bad282d96efffc7f2f7cfdc4b70f95932e69b9955ad1439a936d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf
09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b3908283c24f180df8f6a04d52c46e7252cdfd4f0587f7cccf3e9a0f37127a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:47Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.455819 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vdgtp" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.463828 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcd9a40e12b42902018ecf483e6b42dfa415e4d6e282fc57eacbf507922dbd45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcab
b9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227
033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\
"startedAt\\\":\\\"2025-10-09T15:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:47Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.475508 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:47Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.487238 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab52b5e80f5f2de90ce76b34b21de83b3880ed13436c566f2c460bed1908576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:47Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.496877 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2199f3e31d7adde6f0b1aaf29a7f3da80a45d8a1f11908fd93b47d737b00872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:47Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.505951 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mtpbz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb84e765-e2c6-410b-9681-7c14d88a2537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be972d47f7ee97f2f54daa73198a83327281f9e9b2b1500205a17cf11518989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfpkf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mtpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:47Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.513144 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.513176 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.513186 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.513201 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.513211 4719 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:47Z","lastTransitionTime":"2025-10-09T15:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.518916 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834f7996-d1ce-470d-a1a5-0de5da2460d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a1b9cee40ae4a30df34bde2f4dd9436cf3ff915293ea1e1431e8abd581423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37949ed51a379d34fab6bf766fd7e35d376af137b55b6f12e8bef8495ab5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d165d88c0d88fb4b080bf594e5258fb74f33c521332c85bb9f5ef5b5d9fdab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://ed935aaa4f5122234731f8c22ec3d4ffeba8b500bfb51bf97414f39438da2f68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:47Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.529060 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcba5218f1503f2b3776c66a92350381ee11aee043429d72c70b7ae63d7bb29f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63d557f6902338a7aa577f2bbee6a159369d62be9724425a6e6a355f08586601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:47Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.547310 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c0cb44eacc810e970c6b32e259ae1841fb312f20576d34ac183089a91000337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2246a5642d4fa1b9e182af8a19980e6a76aea32cc9669e7d30185d6672435b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59a6c607affaa28a2c8af16a995f53baf008a1efd42061bb5e3c01b5acac636a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a911f9dd87ad57268bacc90fd4b3821f54d4ad91fcdde7066d3706aa8feb4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80fe00a302db3a637794464b7cccf806ad3fa8efbdaea15f965ea41276188d1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5228008f4bbd33c0b6ea86640368c02b6cdf301b43494a232b37fa73ea72e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b0968bec2451c23d394ca65074ad152d78a54bd1dda603a35fb14f5e67af7ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b0968bec2451c23d394ca65074ad152d78a54bd1dda603a35fb14f5e67af7ec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T15:18:46Z\\\",\\\"message\\\":\\\" 8bc1afc2-8724-4135-84df-aee09f23af4c 4514 0 2025-02-23 05:12:24 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-operator] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true 
service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mco-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0077a47eb \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: machine-config-operator,},ClusterIP:10.217.4.183,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.183],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngre\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zv8jk_openshift-ovn-kubernetes(fea6a48c-769c-41bf-95ce-649cc31eb4e5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b32ef1116f7849b70aa3607bb4fc7b4bff9f58843c24742fc94aed9bb9a68e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9
692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:47Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.558231 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a62b8142b6b6fd0cf9028590f2abce788d8e381c2303d7a824dd055ab02b94db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:47Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.578768 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:47Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.590510 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:47Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.601274 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:47Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.613723 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:47Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.614973 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.614997 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.615006 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.615020 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.615029 4719 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:47Z","lastTransitionTime":"2025-10-09T15:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.716493 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.716697 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.716847 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.716927 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.716987 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:47Z","lastTransitionTime":"2025-10-09T15:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.819801 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.819837 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.819848 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.819863 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.819871 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:47Z","lastTransitionTime":"2025-10-09T15:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.921897 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.921941 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.921954 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.921971 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:47 crc kubenswrapper[4719]: I1009 15:18:47.921982 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:47Z","lastTransitionTime":"2025-10-09T15:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.023996 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.024031 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.024040 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.024054 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.024063 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:48Z","lastTransitionTime":"2025-10-09T15:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.126845 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.126896 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.126909 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.126931 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.126943 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:48Z","lastTransitionTime":"2025-10-09T15:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.229244 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.229280 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.229288 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.229301 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.229310 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:48Z","lastTransitionTime":"2025-10-09T15:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.332314 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.332401 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.332419 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.332440 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.332455 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:48Z","lastTransitionTime":"2025-10-09T15:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.424011 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vdgtp" event={"ID":"12b565dc-6ccc-4404-95f7-c8cf09f91802","Type":"ContainerStarted","Data":"6df96c88745808317300d950f2d991691695576773b7de02958ec718445cc3f6"} Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.424057 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vdgtp" event={"ID":"12b565dc-6ccc-4404-95f7-c8cf09f91802","Type":"ContainerStarted","Data":"1c126340dff33c7a571fc152c4c8ed154e104aaab937ba7f68070763d79825b5"} Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.424068 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vdgtp" event={"ID":"12b565dc-6ccc-4404-95f7-c8cf09f91802","Type":"ContainerStarted","Data":"1408128b8cb514d68a75aa8afb83065f75772d979a8d64d2417b6b29e694881e"} Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.435166 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.435211 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.435220 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.435235 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.435245 4719 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:48Z","lastTransitionTime":"2025-10-09T15:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.441749 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab52b5e80f5f2de90ce76b34b21de83b3880ed13436c566f2c460bed1908576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:48Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.451153 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2199f3e31d7adde6f0b1aaf29a7f3da80a45d8a1f11908fd93b47d737b00872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:48Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.462607 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcba5218f1503f2b3776c66a92350381ee11aee043429d72c70b7ae63d7bb29f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63d557f6902338a7aa577f2bbee6a159369d62be9724425a6e6a355f08586601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:48Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.479857 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c0cb44eacc810e970c6b32e259ae1841fb312f20576d34ac183089a91000337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2246a5642d4fa1b9e182af8a19980e6a76aea32cc9669e7d30185d6672435b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59a6c607affaa28a2c8af16a995f53baf008a1efd42061bb5e3c01b5acac636a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a911f9dd87ad57268bacc90fd4b3821f54d4ad91fcdde7066d3706aa8feb4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80fe00a302db3a637794464b7cccf806ad3fa8efbdaea15f965ea41276188d1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5228008f4bbd33c0b6ea86640368c02b6cdf301b43494a232b37fa73ea72e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b0968bec2451c23d394ca65074ad152d78a54bd1dda603a35fb14f5e67af7ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b0968bec2451c23d394ca65074ad152d78a54bd1dda603a35fb14f5e67af7ec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T15:18:46Z\\\",\\\"message\\\":\\\" 8bc1afc2-8724-4135-84df-aee09f23af4c 4514 0 2025-02-23 05:12:24 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-operator] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true 
service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mco-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0077a47eb \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: machine-config-operator,},ClusterIP:10.217.4.183,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.183],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngre\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zv8jk_openshift-ovn-kubernetes(fea6a48c-769c-41bf-95ce-649cc31eb4e5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b32ef1116f7849b70aa3607bb4fc7b4bff9f58843c24742fc94aed9bb9a68e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9
692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:48Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.487974 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mtpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb84e765-e2c6-410b-9681-7c14d88a2537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be972d47f7ee97f2f54daa73198a83327281f9e9b2b1500205a17cf11518989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfpkf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mtpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:48Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.489040 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.489729 4719 scope.go:117] "RemoveContainer" containerID="0b0968bec2451c23d394ca65074ad152d78a54bd1dda603a35fb14f5e67af7ec" Oct 09 15:18:48 crc kubenswrapper[4719]: E1009 15:18:48.489866 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-zv8jk_openshift-ovn-kubernetes(fea6a48c-769c-41bf-95ce-649cc31eb4e5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.498930 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"834f7996-d1ce-470d-a1a5-0de5da2460d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a1b9cee40ae4a30df34bde2f4dd9436cf3ff915293ea1e1431e8abd581423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37949ed51a379d34fab6bf766fd7e35d376af137b55b6f12e8bef8495ab5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d165d88c0d88fb4b080bf594e5258fb74f33c521332c85bb9f5ef5b5d9fdab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed935aaa4f5122234731f8c22ec3d4ffeba8b500bfb51bf97414f39438da2f68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:48Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.508905 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:48Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.518034 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:48Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.529107 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:48Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.537465 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.537506 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.537520 4719 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.537536 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.537545 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:48Z","lastTransitionTime":"2025-10-09T15:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.538569 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a62b8142b6b6fd0cf9028590f2abce788d8e381
c2303d7a824dd055ab02b94db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:48Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.555231 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:48Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.566490 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ae19d921bad282d96efffc7f2f7cfdc4b70f95932e69b9955ad1439a936d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b3908283c24f180df8f6a04d52c46e7252cdfd4f0587f7cccf3e9a0f37127a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:48Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.578246 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcd9a40e12b42902018ecf483e6b42dfa415e4d6e282fc57eacbf507922dbd45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a24
73a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/c
ni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:48Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.588061 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-58bdp"] Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.588702 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:18:48 crc kubenswrapper[4719]: E1009 15:18:48.588784 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.590844 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"
os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T15:18:48Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.599651 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vdgtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b565dc-6ccc-4404-95f7-c8cf09f91802\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c126340dff33c7a571fc152c4c8ed154e104aaab937ba7f68070763d79825b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6df96c88745808317300d950f2d991691695576773b7de02958ec718445cc3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vdgtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:48Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.613008 4719 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40166218-2855-45ef-b0e1-0fed4e3e2fde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01dc78fd80a15fa8151128108a351c6af42928695fdd745dea50e08fae6570ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc62bf1b49b2a4b402b2fcca31f9fe1663b36f463a0722a5876b2ca2a8e023ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51b618cce898bc89b4b07b6f7fd73567d719ad9c9dc3a2a3959074bc2c2fe11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 15:18:34.791014 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 15:18:34.791138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:34.792247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2043225324/tls.crt::/tmp/serving-cert-2043225324/tls.key\\\\\\\"\\\\nI1009 15:18:35.029901 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 15:18:35.033427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 15:18:35.033448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 15:18:35.033473 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 15:18:35.033481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 15:18:35.045206 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 15:18:35.045257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045266 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045273 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 15:18:35.045277 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 15:18:35.045280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 15:18:35.045285 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1009 15:18:35.045414 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1009 15:18:35.048459 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8544f7060b0b2c2885dcbdffbd744be5f028d8df543732ba79eb7cd3911afca6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6
ccf838dc9453d8854f1521622537d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:48Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.623168 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ae19d921bad282d96efffc7f2f7cfdc4b70f95932e69b9955ad1439a936d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b3908283c24f180df8f6a04d52c46e7252cdfd
4f0587f7cccf3e9a0f37127a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:48Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.636702 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcd9a40e12b42902018ecf483e6b42dfa415e4d6e282fc57eacbf507922dbd45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://275fc
3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:48Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.640062 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.640094 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.640105 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.640121 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.640130 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:48Z","lastTransitionTime":"2025-10-09T15:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.649777 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:48Z 
is after 2025-08-24T17:21:41Z" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.659812 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vdgtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b565dc-6ccc-4404-95f7-c8cf09f91802\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c126340dff33c7a571fc152c4c8ed154e104aaab937ba7f68070763d79825b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6df96c88745808317300d950f2d991691695576773b7de02958ec718445cc3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vdgtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:48Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.671827 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40166218-2855-45ef-b0e1-0fed4e3e2fde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01dc78fd80a15fa8151128108a351c6af42928695fdd745dea50e08fae6570ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc62bf1b49b2a4b402b2fcca31f9fe1663b36f463a0722a5876b2ca2a8e023ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51b618cce898bc89b4b07b6f7fd73567d719ad9c9dc3a2a3959074bc2c2fe11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 15:18:34.791014 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 15:18:34.791138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:34.792247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2043225324/tls.crt::/tmp/serving-cert-2043225324/tls.key\\\\\\\"\\\\nI1009 15:18:35.029901 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 15:18:35.033427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 15:18:35.033448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 15:18:35.033473 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 15:18:35.033481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 15:18:35.045206 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 15:18:35.045257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045266 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045273 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 15:18:35.045277 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 15:18:35.045280 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 15:18:35.045285 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1009 15:18:35.045414 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1009 15:18:35.048459 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8544f7060b0b2c2885dcbdffbd744be5f028d8df543732ba79eb7cd3911afca6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:48Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.682244 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab52b5e80f5f2de90ce76b34b21de83b3880ed13436c566f2c460bed1908576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:48Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.691914 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2199f3e31d7adde6f0b1aaf29a7f3da80a45d8a1f11908fd93b47d737b00872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:48Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.695248 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d00237ae-ca20-4202-8e24-e4988fbf5269-metrics-certs\") pod \"network-metrics-daemon-58bdp\" (UID: \"d00237ae-ca20-4202-8e24-e4988fbf5269\") " pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.695299 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hwpx\" (UniqueName: \"kubernetes.io/projected/d00237ae-ca20-4202-8e24-e4988fbf5269-kube-api-access-5hwpx\") pod \"network-metrics-daemon-58bdp\" (UID: \"d00237ae-ca20-4202-8e24-e4988fbf5269\") " pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.703295 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcba5218f1503f2b3776c66a92350381ee11aee043429d72c70b7ae63d7bb29f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63d557f6902338a7aa577f2bbee6a159369d62be9724425a6e6a355f08586601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:48Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.721056 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c0cb44eacc810e970c6b32e259ae1841fb312f20576d34ac183089a91000337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2246a5642d4fa1b9e182af8a19980e6a76aea32cc9669e7d30185d6672435b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59a6c607affaa28a2c8af16a995f53baf008a1efd42061bb5e3c01b5acac636a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a911f9dd87ad57268bacc90fd4b3821f54d4ad91fcdde7066d3706aa8feb4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80fe00a302db3a637794464b7cccf806ad3fa8efbdaea15f965ea41276188d1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5228008f4bbd33c0b6ea86640368c02b6cdf301b43494a232b37fa73ea72e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b0968bec2451c23d394ca65074ad152d78a54bd1dda603a35fb14f5e67af7ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b0968bec2451c23d394ca65074ad152d78a54bd1dda603a35fb14f5e67af7ec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T15:18:46Z\\\",\\\"message\\\":\\\" 8bc1afc2-8724-4135-84df-aee09f23af4c 4514 0 2025-02-23 05:12:24 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-operator] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true 
service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mco-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0077a47eb \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: machine-config-operator,},ClusterIP:10.217.4.183,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.183],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngre\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zv8jk_openshift-ovn-kubernetes(fea6a48c-769c-41bf-95ce-649cc31eb4e5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b32ef1116f7849b70aa3607bb4fc7b4bff9f58843c24742fc94aed9bb9a68e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9
692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:48Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.730449 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mtpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb84e765-e2c6-410b-9681-7c14d88a2537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be972d47f7ee97f2f54daa73198a83327281f9e9b2b1500205a17cf11518989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfpkf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mtpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:48Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.740886 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834f7996-d1ce-470d-a1a5-0de5da2460d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a1b9cee40ae4a30df34bde2f4dd9436cf3ff915293ea1e1431e8abd581423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37949ed51a379d34fab6bf766fd7e35d376af137b55b6f12e8bef8495ab5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d165d88c0d88fb4b080bf594e5258fb74f33c521332c85bb9f5ef5b5d9fdab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed935aaa4f5122234731f8c22ec3d4ffeba8b500bfb51bf97414f39438da2f68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:48Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.742118 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.742178 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.742188 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.742221 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.742231 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:48Z","lastTransitionTime":"2025-10-09T15:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.752643 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:48Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.764162 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:48Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.774572 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:48Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.784530 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a62b8142b6b6fd0cf9028590f2abce788d8e381c2303d7a824dd055ab02b94db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:48Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.793313 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-58bdp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d00237ae-ca20-4202-8e24-e4988fbf5269\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-58bdp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:48Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:48 crc 
kubenswrapper[4719]: I1009 15:18:48.795651 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d00237ae-ca20-4202-8e24-e4988fbf5269-metrics-certs\") pod \"network-metrics-daemon-58bdp\" (UID: \"d00237ae-ca20-4202-8e24-e4988fbf5269\") " pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.795755 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hwpx\" (UniqueName: \"kubernetes.io/projected/d00237ae-ca20-4202-8e24-e4988fbf5269-kube-api-access-5hwpx\") pod \"network-metrics-daemon-58bdp\" (UID: \"d00237ae-ca20-4202-8e24-e4988fbf5269\") " pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:18:48 crc kubenswrapper[4719]: E1009 15:18:48.795821 4719 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 15:18:48 crc kubenswrapper[4719]: E1009 15:18:48.795921 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d00237ae-ca20-4202-8e24-e4988fbf5269-metrics-certs podName:d00237ae-ca20-4202-8e24-e4988fbf5269 nodeName:}" failed. No retries permitted until 2025-10-09 15:18:49.295901813 +0000 UTC m=+34.805613098 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d00237ae-ca20-4202-8e24-e4988fbf5269-metrics-certs") pod "network-metrics-daemon-58bdp" (UID: "d00237ae-ca20-4202-8e24-e4988fbf5269") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.810201 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hwpx\" (UniqueName: \"kubernetes.io/projected/d00237ae-ca20-4202-8e24-e4988fbf5269-kube-api-access-5hwpx\") pod \"network-metrics-daemon-58bdp\" (UID: \"d00237ae-ca20-4202-8e24-e4988fbf5269\") " pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.812444 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a673147
31ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:48Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.845065 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.845096 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.845104 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.845116 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.845127 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:48Z","lastTransitionTime":"2025-10-09T15:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.947640 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.947677 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.947687 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.947704 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:48 crc kubenswrapper[4719]: I1009 15:18:48.947714 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:48Z","lastTransitionTime":"2025-10-09T15:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.050215 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.050266 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.050276 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.050291 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.050302 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:49Z","lastTransitionTime":"2025-10-09T15:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.152961 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.153001 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.153025 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.153043 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.153054 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:49Z","lastTransitionTime":"2025-10-09T15:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.160118 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.160167 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.160127 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:18:49 crc kubenswrapper[4719]: E1009 15:18:49.160255 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:18:49 crc kubenswrapper[4719]: E1009 15:18:49.160308 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:18:49 crc kubenswrapper[4719]: E1009 15:18:49.160446 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.255948 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.256204 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.256310 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.256423 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.256553 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:49Z","lastTransitionTime":"2025-10-09T15:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.300771 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d00237ae-ca20-4202-8e24-e4988fbf5269-metrics-certs\") pod \"network-metrics-daemon-58bdp\" (UID: \"d00237ae-ca20-4202-8e24-e4988fbf5269\") " pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:18:49 crc kubenswrapper[4719]: E1009 15:18:49.301198 4719 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 15:18:49 crc kubenswrapper[4719]: E1009 15:18:49.301385 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d00237ae-ca20-4202-8e24-e4988fbf5269-metrics-certs podName:d00237ae-ca20-4202-8e24-e4988fbf5269 nodeName:}" failed. No retries permitted until 2025-10-09 15:18:50.30134263 +0000 UTC m=+35.811053985 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d00237ae-ca20-4202-8e24-e4988fbf5269-metrics-certs") pod "network-metrics-daemon-58bdp" (UID: "d00237ae-ca20-4202-8e24-e4988fbf5269") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.358546 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.358577 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.358585 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.358598 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.358606 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:49Z","lastTransitionTime":"2025-10-09T15:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.460856 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.460906 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.460918 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.460933 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.460944 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:49Z","lastTransitionTime":"2025-10-09T15:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.563675 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.563967 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.563977 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.563991 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.564001 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:49Z","lastTransitionTime":"2025-10-09T15:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.665824 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.665861 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.665872 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.665888 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.665899 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:49Z","lastTransitionTime":"2025-10-09T15:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.769585 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.769650 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.769668 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.769698 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.769719 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:49Z","lastTransitionTime":"2025-10-09T15:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.871873 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.871919 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.871931 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.871948 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.871960 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:49Z","lastTransitionTime":"2025-10-09T15:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.974130 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.974183 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.974197 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.974213 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:49 crc kubenswrapper[4719]: I1009 15:18:49.974225 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:49Z","lastTransitionTime":"2025-10-09T15:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.076915 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.076969 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.076981 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.076998 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.077009 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:50Z","lastTransitionTime":"2025-10-09T15:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.160166 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:18:50 crc kubenswrapper[4719]: E1009 15:18:50.160450 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.160736 4719 scope.go:117] "RemoveContainer" containerID="5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.180462 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.180496 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.180507 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.180522 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.180533 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:50Z","lastTransitionTime":"2025-10-09T15:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.282394 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.282460 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.282470 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.282484 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.282493 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:50Z","lastTransitionTime":"2025-10-09T15:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.310104 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d00237ae-ca20-4202-8e24-e4988fbf5269-metrics-certs\") pod \"network-metrics-daemon-58bdp\" (UID: \"d00237ae-ca20-4202-8e24-e4988fbf5269\") " pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:18:50 crc kubenswrapper[4719]: E1009 15:18:50.310389 4719 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 15:18:50 crc kubenswrapper[4719]: E1009 15:18:50.310507 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d00237ae-ca20-4202-8e24-e4988fbf5269-metrics-certs podName:d00237ae-ca20-4202-8e24-e4988fbf5269 nodeName:}" failed. No retries permitted until 2025-10-09 15:18:52.310484475 +0000 UTC m=+37.820195780 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d00237ae-ca20-4202-8e24-e4988fbf5269-metrics-certs") pod "network-metrics-daemon-58bdp" (UID: "d00237ae-ca20-4202-8e24-e4988fbf5269") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.384910 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.384953 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.384965 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.384981 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.384991 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:50Z","lastTransitionTime":"2025-10-09T15:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.434212 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.436655 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ac6f2af57f612bf33446a88a0a093adb3b64f562412d9a0bd03f3964c281ba4a"} Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.436948 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.467463 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:50Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.487880 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.487952 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.488032 4719 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.488073 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.488118 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:50Z","lastTransitionTime":"2025-10-09T15:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.501490 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a62b8142b6b6fd0cf9028590f2abce788d8e381
c2303d7a824dd055ab02b94db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:50Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.522825 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-58bdp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d00237ae-ca20-4202-8e24-e4988fbf5269\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-58bdp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:50Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:50 crc 
kubenswrapper[4719]: I1009 15:18:50.542245 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:50Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.555894 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:50Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.568547 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:50Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.579563 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:50Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.587750 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vdgtp" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b565dc-6ccc-4404-95f7-c8cf09f91802\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c126340dff33c7a571fc152c4c8ed154e104aaab937ba7f68070763d79825b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6df96c887
45808317300d950f2d991691695576773b7de02958ec718445cc3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vdgtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:50Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.593476 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.593529 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.593541 4719 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.593558 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.593569 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:50Z","lastTransitionTime":"2025-10-09T15:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.603458 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40166218-2855-45ef-b0e1-0fed4e3e2fde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01dc78fd80a15fa8151128108a351c6af42928695fdd745dea50e08fae6570ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc62bf1b49b2a4b402b2fcca31f9fe1663b36f463a0722a5876b2ca2a8e023ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://51b618cce898bc89b4b07b6f7fd73567d719ad9c9dc3a2a3959074bc2c2fe11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6f2af57f612bf33446a88a0a093adb3b64f562412d9a0bd03f3964c281ba4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 15:18:34.791014 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 15:18:34.791138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:34.792247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2043225324/tls.crt::/tmp/serving-cert-2043225324/tls.key\\\\\\\"\\\\nI1009 15:18:35.029901 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 15:18:35.033427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 15:18:35.033448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 15:18:35.033473 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 15:18:35.033481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 15:18:35.045206 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 15:18:35.045257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045266 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045273 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 15:18:35.045277 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 15:18:35.045280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 15:18:35.045285 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1009 15:18:35.045414 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1009 15:18:35.048459 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8544f7060b0b2c2885dcbdffbd744be5f028d8df543732ba79eb7cd3911afca6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:50Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.614283 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ae19d921bad282d96efffc7f2f7cfdc4b70f95932e69b9955ad1439a936d5c\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b3908283c24f180df8f6a04d52c46e7252cdfd4f0587f7cccf3e9a0f37127a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:50Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.628327 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcd9a40e12b42902018ecf483e6b42dfa415e4d6e282fc57eacbf507922dbd45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-
additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df
312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:50Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.640879 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab52b5e80f5f2de90ce76b34b21de83b3880ed13436c566f2c460bed1908576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:50Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.650810 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2199f3e31d7adde6f0b1aaf29a7f3da80a45d8a1f11908fd93b47d737b00872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:50Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.665535 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c0cb44eacc810e970c6b32e259ae1841fb312f20576d34ac183089a91000337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2246a5642d4fa1b9e182af8a19980e6a76aea32cc9669e7d30185d6672435b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59a6c607affaa28a2c8af16a995f53baf008a1efd42061bb5e3c01b5acac636a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a911f9dd87ad57268bacc90fd4b3821f54d4ad91fcdde7066d3706aa8feb4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80fe00a302db3a637794464b7cccf806ad3fa8efbdaea15f965ea41276188d1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5228008f4bbd33c0b6ea86640368c02b6cdf301b43494a232b37fa73ea72e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b0968bec2451c23d394ca65074ad152d78a54bd1dda603a35fb14f5e67af7ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b0968bec2451c23d394ca65074ad152d78a54bd1dda603a35fb14f5e67af7ec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T15:18:46Z\\\",\\\"message\\\":\\\" 8bc1afc2-8724-4135-84df-aee09f23af4c 4514 0 2025-02-23 05:12:24 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-operator] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true 
service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mco-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0077a47eb \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: machine-config-operator,},ClusterIP:10.217.4.183,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.183],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngre\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zv8jk_openshift-ovn-kubernetes(fea6a48c-769c-41bf-95ce-649cc31eb4e5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b32ef1116f7849b70aa3607bb4fc7b4bff9f58843c24742fc94aed9bb9a68e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9
692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:50Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.675187 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mtpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb84e765-e2c6-410b-9681-7c14d88a2537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be972d47f7ee97f2f54daa73198a83327281f9e9b2b1500205a17cf11518989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfpkf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mtpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:50Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.686439 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834f7996-d1ce-470d-a1a5-0de5da2460d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a1b9cee40ae4a30df34bde2f4dd9436cf3ff915293ea1e1431e8abd581423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37949ed51a379d34fab6bf766fd7e35d376af137b55b6f12e8bef8495ab5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d165d88c0d88fb4b080bf594e5258fb74f33c521332c85bb9f5ef5b5d9fdab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed935aaa4f5122234731f8c22ec3d4ffeba8b500bfb51bf97414f39438da2f68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:50Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.695673 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.695729 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.695739 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.695753 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.695762 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:50Z","lastTransitionTime":"2025-10-09T15:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.697758 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcba5218f1503f2b3776c66a92350381ee11aee043429d72c70b7ae63d7bb29f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63d557f6902338a7aa577f2bbee6a159369d62be9724425a6e6a355f08586601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:50Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.797973 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.798021 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.798031 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.798047 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.798061 4719 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:50Z","lastTransitionTime":"2025-10-09T15:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.816442 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.816586 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:18:50 crc kubenswrapper[4719]: E1009 15:18:50.816653 4719 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 15:18:50 crc kubenswrapper[4719]: E1009 15:18:50.816659 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:19:06.816630415 +0000 UTC m=+52.326341740 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:18:50 crc kubenswrapper[4719]: E1009 15:18:50.816733 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 15:19:06.816720437 +0000 UTC m=+52.326431732 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.816775 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:18:50 crc kubenswrapper[4719]: E1009 15:18:50.816977 4719 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 15:18:50 crc kubenswrapper[4719]: E1009 15:18:50.817028 4719 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 15:19:06.817012647 +0000 UTC m=+52.326723932 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.900469 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.900501 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.900510 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.900524 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.900533 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:50Z","lastTransitionTime":"2025-10-09T15:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.918310 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:18:50 crc kubenswrapper[4719]: I1009 15:18:50.918434 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:18:50 crc kubenswrapper[4719]: E1009 15:18:50.918559 4719 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 15:18:50 crc kubenswrapper[4719]: E1009 15:18:50.918576 4719 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 15:18:50 crc kubenswrapper[4719]: E1009 15:18:50.918586 4719 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 15:18:50 crc kubenswrapper[4719]: E1009 15:18:50.918627 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-09 15:19:06.918614583 +0000 UTC m=+52.428325868 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 15:18:50 crc kubenswrapper[4719]: E1009 15:18:50.919006 4719 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 15:18:50 crc kubenswrapper[4719]: E1009 15:18:50.919040 4719 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 15:18:50 crc kubenswrapper[4719]: E1009 15:18:50.919051 4719 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 15:18:50 crc kubenswrapper[4719]: E1009 15:18:50.919107 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-09 15:19:06.919090838 +0000 UTC m=+52.428802123 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.002607 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.002637 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.002659 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.002674 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.002682 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:51Z","lastTransitionTime":"2025-10-09T15:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.105091 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.105136 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.105148 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.105167 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.105178 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:51Z","lastTransitionTime":"2025-10-09T15:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.160898 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.160982 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:18:51 crc kubenswrapper[4719]: E1009 15:18:51.161064 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.161063 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:18:51 crc kubenswrapper[4719]: E1009 15:18:51.161153 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:18:51 crc kubenswrapper[4719]: E1009 15:18:51.161250 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.207154 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.207191 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.207200 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.207216 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.207226 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:51Z","lastTransitionTime":"2025-10-09T15:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.235011 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.235250 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.235324 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.235416 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.235477 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:51Z","lastTransitionTime":"2025-10-09T15:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:51 crc kubenswrapper[4719]: E1009 15:18:51.247314 4719 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:18:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:18:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:18:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:18:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d273987-9d8a-4a77-9956-ccb64e9e22c3\\\",\\\"systemUUID\\\":\\\"d18dc188-15d4-4547-94df-d9149082a3a0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:51Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.250663 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.250695 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.250705 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.250735 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.250744 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:51Z","lastTransitionTime":"2025-10-09T15:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:51 crc kubenswrapper[4719]: E1009 15:18:51.291609 4719 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:18:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:18:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:18:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:18:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d273987-9d8a-4a77-9956-ccb64e9e22c3\\\",\\\"systemUUID\\\":\\\"d18dc188-15d4-4547-94df-d9149082a3a0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:51Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.295599 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.295671 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.295685 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.295702 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.295713 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:51Z","lastTransitionTime":"2025-10-09T15:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:51 crc kubenswrapper[4719]: E1009 15:18:51.307191 4719 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:18:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:18:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:18:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:18:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d273987-9d8a-4a77-9956-ccb64e9e22c3\\\",\\\"systemUUID\\\":\\\"d18dc188-15d4-4547-94df-d9149082a3a0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:51Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:51 crc kubenswrapper[4719]: E1009 15:18:51.307390 4719 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.309041 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.309078 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.309087 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.309102 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.309111 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:51Z","lastTransitionTime":"2025-10-09T15:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.410872 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.411134 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.411215 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.411302 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.411405 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:51Z","lastTransitionTime":"2025-10-09T15:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.514121 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.514168 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.514179 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.514196 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.514207 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:51Z","lastTransitionTime":"2025-10-09T15:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.616241 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.616321 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.616338 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.616405 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.616443 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:51Z","lastTransitionTime":"2025-10-09T15:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.719995 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.720056 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.720108 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.720142 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.720163 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:51Z","lastTransitionTime":"2025-10-09T15:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.824240 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.824309 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.824330 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.824386 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.824404 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:51Z","lastTransitionTime":"2025-10-09T15:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.927216 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.927258 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.927273 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.927295 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:51 crc kubenswrapper[4719]: I1009 15:18:51.927311 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:51Z","lastTransitionTime":"2025-10-09T15:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.030257 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.030316 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.030332 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.030379 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.030411 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:52Z","lastTransitionTime":"2025-10-09T15:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.132087 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.132117 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.132126 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.132203 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.132213 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:52Z","lastTransitionTime":"2025-10-09T15:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.160651 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:18:52 crc kubenswrapper[4719]: E1009 15:18:52.160804 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.234114 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.234153 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.234165 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.234181 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.234193 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:52Z","lastTransitionTime":"2025-10-09T15:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.332415 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d00237ae-ca20-4202-8e24-e4988fbf5269-metrics-certs\") pod \"network-metrics-daemon-58bdp\" (UID: \"d00237ae-ca20-4202-8e24-e4988fbf5269\") " pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:18:52 crc kubenswrapper[4719]: E1009 15:18:52.332550 4719 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 15:18:52 crc kubenswrapper[4719]: E1009 15:18:52.332615 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d00237ae-ca20-4202-8e24-e4988fbf5269-metrics-certs podName:d00237ae-ca20-4202-8e24-e4988fbf5269 nodeName:}" failed. No retries permitted until 2025-10-09 15:18:56.332596402 +0000 UTC m=+41.842307687 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d00237ae-ca20-4202-8e24-e4988fbf5269-metrics-certs") pod "network-metrics-daemon-58bdp" (UID: "d00237ae-ca20-4202-8e24-e4988fbf5269") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.337755 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.337805 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.337821 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.337842 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.337857 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:52Z","lastTransitionTime":"2025-10-09T15:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.440330 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.440380 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.440392 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.440407 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.440415 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:52Z","lastTransitionTime":"2025-10-09T15:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.542713 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.542759 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.542768 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.542784 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.542793 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:52Z","lastTransitionTime":"2025-10-09T15:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.645134 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.645167 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.645176 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.645190 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.645201 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:52Z","lastTransitionTime":"2025-10-09T15:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.748281 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.748321 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.748331 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.748359 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.748368 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:52Z","lastTransitionTime":"2025-10-09T15:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.851134 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.851183 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.851198 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.851215 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.851226 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:52Z","lastTransitionTime":"2025-10-09T15:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.953636 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.953677 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.953687 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.953703 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:52 crc kubenswrapper[4719]: I1009 15:18:52.953714 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:52Z","lastTransitionTime":"2025-10-09T15:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.055919 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.055951 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.055960 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.055974 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.055983 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:53Z","lastTransitionTime":"2025-10-09T15:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.158093 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.158124 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.158134 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.158147 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.158157 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:53Z","lastTransitionTime":"2025-10-09T15:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.160383 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.160430 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.160383 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:18:53 crc kubenswrapper[4719]: E1009 15:18:53.160485 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:18:53 crc kubenswrapper[4719]: E1009 15:18:53.160545 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:18:53 crc kubenswrapper[4719]: E1009 15:18:53.160647 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.259787 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.259846 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.259855 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.259871 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.259881 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:53Z","lastTransitionTime":"2025-10-09T15:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.365596 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.365631 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.365641 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.365656 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.365667 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:53Z","lastTransitionTime":"2025-10-09T15:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.468501 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.468547 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.468558 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.468574 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.468586 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:53Z","lastTransitionTime":"2025-10-09T15:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.570854 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.570890 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.570917 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.570930 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.570938 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:53Z","lastTransitionTime":"2025-10-09T15:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.673360 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.673418 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.673431 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.673448 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.673461 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:53Z","lastTransitionTime":"2025-10-09T15:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.775895 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.775923 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.775931 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.775945 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.775955 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:53Z","lastTransitionTime":"2025-10-09T15:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.878895 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.878967 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.878989 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.879025 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.879048 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:53Z","lastTransitionTime":"2025-10-09T15:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.982155 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.982237 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.982256 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.982289 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:53 crc kubenswrapper[4719]: I1009 15:18:53.982322 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:53Z","lastTransitionTime":"2025-10-09T15:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.086553 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.086614 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.086626 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.086643 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.086656 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:54Z","lastTransitionTime":"2025-10-09T15:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.161295 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:18:54 crc kubenswrapper[4719]: E1009 15:18:54.161659 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.189685 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.189745 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.189755 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.189771 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.189782 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:54Z","lastTransitionTime":"2025-10-09T15:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.292816 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.293377 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.293481 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.293588 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.293690 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:54Z","lastTransitionTime":"2025-10-09T15:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.396333 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.396649 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.396735 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.396827 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.396919 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:54Z","lastTransitionTime":"2025-10-09T15:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.500593 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.500667 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.500685 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.500715 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.500738 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:54Z","lastTransitionTime":"2025-10-09T15:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.603540 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.603625 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.603645 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.603675 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.603698 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:54Z","lastTransitionTime":"2025-10-09T15:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.706142 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.706181 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.706191 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.706205 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.706213 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:54Z","lastTransitionTime":"2025-10-09T15:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.808310 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.808408 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.808422 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.808442 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.808454 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:54Z","lastTransitionTime":"2025-10-09T15:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.911185 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.911571 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.911664 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.911755 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:54 crc kubenswrapper[4719]: I1009 15:18:54.911827 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:54Z","lastTransitionTime":"2025-10-09T15:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.014465 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.014506 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.014515 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.014529 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.014538 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:55Z","lastTransitionTime":"2025-10-09T15:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.116455 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.116517 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.116534 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.116559 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.116576 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:55Z","lastTransitionTime":"2025-10-09T15:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.161381 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.161431 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:18:55 crc kubenswrapper[4719]: E1009 15:18:55.161515 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:18:55 crc kubenswrapper[4719]: E1009 15:18:55.161572 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.161402 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:18:55 crc kubenswrapper[4719]: E1009 15:18:55.162254 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.175979 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40166218-2855-45ef-b0e1-0fed4e3e2fde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01dc78fd80a15fa8151128108a351c6af42928695fdd745dea50e08fae6570ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc62bf1b49b2a4b402b2fcca31f9fe1663b36f463a0722a5876b2ca2a8e023ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://51b618cce898bc89b4b07b6f7fd73567d719ad9c9dc3a2a3959074bc2c2fe11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6f2af57f612bf33446a88a0a093adb3b64f562412d9a0bd03f3964c281ba4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 15:18:34.791014 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 15:18:34.791138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:34.792247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2043225324/tls.crt::/tmp/serving-cert-2043225324/tls.key\\\\\\\"\\\\nI1009 15:18:35.029901 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 15:18:35.033427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 15:18:35.033448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 15:18:35.033473 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 15:18:35.033481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 15:18:35.045206 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 15:18:35.045257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045266 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045273 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 15:18:35.045277 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 15:18:35.045280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 15:18:35.045285 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1009 15:18:35.045414 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1009 15:18:35.048459 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8544f7060b0b2c2885dcbdffbd744be5f028d8df543732ba79eb7cd3911afca6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:55Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.188547 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ae19d921bad282d96efffc7f2f7cfdc4b70f95932e69b9955ad1439a936d5c\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b3908283c24f180df8f6a04d52c46e7252cdfd4f0587f7cccf3e9a0f37127a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:55Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.204373 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcd9a40e12b42902018ecf483e6b42dfa415e4d6e282fc57eacbf507922dbd45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-
additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df
312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:55Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.218327 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.218397 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.218408 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.218423 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.218433 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:55Z","lastTransitionTime":"2025-10-09T15:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.218422 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\
":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-09T15:18:55Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.229672 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vdgtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b565dc-6ccc-4404-95f7-c8cf09f91802\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c126340dff33c7a571fc152c4c8ed154e104aaab937ba7f68070763d79825b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6df96c88745808317300d950f2d991691695576773b7de02958ec718445cc3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vdgtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:55Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.242927 4719 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab52b5e80f5f2de90ce76b34b21de83b3880ed13436c566f2c460bed1908576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:55Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.256289 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2199f3e31d7adde6f0b1aaf29a7f3da80a45d8a1f11908fd93b47d737b00872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:55Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.269375 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834f7996-d1ce-470d-a1a5-0de5da2460d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a1b9cee40ae4a30df34bde2f4dd9436cf3ff915293ea1e1431e8abd581423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\
\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37949ed51a379d34fab6bf766fd7e35d376af137b55b6f12e8bef8495ab5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d165d88c0d88fb4b080bf594e5258fb74f33c521332c85bb9f5ef5b5d9fdab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-
dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed935aaa4f5122234731f8c22ec3d4ffeba8b500bfb51bf97414f39438da2f68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:55Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.282462 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcba5218f1503f2b3776c66a92350381ee11aee043429d72c70b7ae63d7bb29f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63d557f6902338a7aa577f2bbee6a159369d62be9724425a6e6a355f08586601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:55Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.302809 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c0cb44eacc810e970c6b32e259ae1841fb312f20576d34ac183089a91000337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2246a5642d4fa1b9e182af8a19980e6a76aea32cc9669e7d30185d6672435b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59a6c607affaa28a2c8af16a995f53baf008a1efd42061bb5e3c01b5acac636a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a911f9dd87ad57268bacc90fd4b3821f54d4ad91fcdde7066d3706aa8feb4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80fe00a302db3a637794464b7cccf806ad3fa8efbdaea15f965ea41276188d1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5228008f4bbd33c0b6ea86640368c02b6cdf301b43494a232b37fa73ea72e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b0968bec2451c23d394ca65074ad152d78a54bd1dda603a35fb14f5e67af7ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b0968bec2451c23d394ca65074ad152d78a54bd1dda603a35fb14f5e67af7ec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T15:18:46Z\\\",\\\"message\\\":\\\" 8bc1afc2-8724-4135-84df-aee09f23af4c 4514 0 2025-02-23 05:12:24 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-operator] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true 
service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mco-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0077a47eb \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: machine-config-operator,},ClusterIP:10.217.4.183,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.183],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngre\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zv8jk_openshift-ovn-kubernetes(fea6a48c-769c-41bf-95ce-649cc31eb4e5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b32ef1116f7849b70aa3607bb4fc7b4bff9f58843c24742fc94aed9bb9a68e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9
692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:55Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.314796 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mtpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb84e765-e2c6-410b-9681-7c14d88a2537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be972d47f7ee97f2f54daa73198a83327281f9e9b2b1500205a17cf11518989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfpkf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mtpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:55Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.325856 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-58bdp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d00237ae-ca20-4202-8e24-e4988fbf5269\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-58bdp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:55Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:55 crc 
kubenswrapper[4719]: I1009 15:18:55.326291 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.326397 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.326412 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.326438 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.326453 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:55Z","lastTransitionTime":"2025-10-09T15:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.345046 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:55Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.357657 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:55Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.372164 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:55Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.386878 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:55Z is after 2025-08-24T17:21:41Z" Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.399072 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a62b8142b6b6fd0cf9028590f2abce788d8e381c2303d7a824dd055ab02b94db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:18:55Z is after 2025-08-24T17:21:41Z"
Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.429524 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.429573 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.429586 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.429610 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.429624 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:55Z","lastTransitionTime":"2025-10-09T15:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.532690 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.532768 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.532789 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.532825 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.532848 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:55Z","lastTransitionTime":"2025-10-09T15:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.635065 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.635106 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.635124 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.635140 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.635150 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:55Z","lastTransitionTime":"2025-10-09T15:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.738450 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.738520 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.738533 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.738558 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.738578 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:55Z","lastTransitionTime":"2025-10-09T15:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.841672 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.841727 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.841741 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.841760 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.841774 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:55Z","lastTransitionTime":"2025-10-09T15:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.944981 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.945063 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.945086 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.945137 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 15:18:55 crc kubenswrapper[4719]: I1009 15:18:55.945166 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:55Z","lastTransitionTime":"2025-10-09T15:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.049166 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.049233 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.049257 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.049290 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.049310 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:56Z","lastTransitionTime":"2025-10-09T15:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.151518 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.151559 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.151568 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.151589 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.151598 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:56Z","lastTransitionTime":"2025-10-09T15:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.161016 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp"
Oct 09 15:18:56 crc kubenswrapper[4719]: E1009 15:18:56.161143 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.255144 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.255190 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.255198 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.255213 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.255229 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:56Z","lastTransitionTime":"2025-10-09T15:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.358133 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.358168 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.358181 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.358195 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.358206 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:56Z","lastTransitionTime":"2025-10-09T15:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.375938 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d00237ae-ca20-4202-8e24-e4988fbf5269-metrics-certs\") pod \"network-metrics-daemon-58bdp\" (UID: \"d00237ae-ca20-4202-8e24-e4988fbf5269\") " pod="openshift-multus/network-metrics-daemon-58bdp"
Oct 09 15:18:56 crc kubenswrapper[4719]: E1009 15:18:56.376094 4719 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 09 15:18:56 crc kubenswrapper[4719]: E1009 15:18:56.376144 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d00237ae-ca20-4202-8e24-e4988fbf5269-metrics-certs podName:d00237ae-ca20-4202-8e24-e4988fbf5269 nodeName:}" failed. No retries permitted until 2025-10-09 15:19:04.376127632 +0000 UTC m=+49.885838917 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d00237ae-ca20-4202-8e24-e4988fbf5269-metrics-certs") pod "network-metrics-daemon-58bdp" (UID: "d00237ae-ca20-4202-8e24-e4988fbf5269") : object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.460129 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.460168 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.460178 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.460195 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.460208 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:56Z","lastTransitionTime":"2025-10-09T15:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.562859 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.562927 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.562949 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.562979 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.563000 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:56Z","lastTransitionTime":"2025-10-09T15:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.665206 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.665287 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.665314 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.665343 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.665414 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:56Z","lastTransitionTime":"2025-10-09T15:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.768620 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.768657 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.768668 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.768683 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.768694 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:56Z","lastTransitionTime":"2025-10-09T15:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.871052 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.871105 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.871115 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.871129 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.871138 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:56Z","lastTransitionTime":"2025-10-09T15:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.973391 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.973443 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.973455 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.973472 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 15:18:56 crc kubenswrapper[4719]: I1009 15:18:56.973483 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:56Z","lastTransitionTime":"2025-10-09T15:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.075989 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.076039 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.076053 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.076070 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.076083 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:57Z","lastTransitionTime":"2025-10-09T15:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.160487 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.160555 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.160487 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 09 15:18:57 crc kubenswrapper[4719]: E1009 15:18:57.160642 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 09 15:18:57 crc kubenswrapper[4719]: E1009 15:18:57.160680 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 09 15:18:57 crc kubenswrapper[4719]: E1009 15:18:57.160741 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.177742 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.177786 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.177802 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.177823 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.177838 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:57Z","lastTransitionTime":"2025-10-09T15:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.280318 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.280392 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.280410 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.280426 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.280436 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:57Z","lastTransitionTime":"2025-10-09T15:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.382733 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.382779 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.382789 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.382803 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.382812 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:57Z","lastTransitionTime":"2025-10-09T15:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.485454 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.485497 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.485507 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.485525 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.485537 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:57Z","lastTransitionTime":"2025-10-09T15:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.587632 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.587685 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.587697 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.587715 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.588071 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:57Z","lastTransitionTime":"2025-10-09T15:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.689820 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.689852 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.689860 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.689872 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.689881 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:57Z","lastTransitionTime":"2025-10-09T15:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.792470 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.792527 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.792549 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.792579 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.792603 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:57Z","lastTransitionTime":"2025-10-09T15:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.895261 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.895426 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.895470 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.895556 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.895927 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:57Z","lastTransitionTime":"2025-10-09T15:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.997439 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.997536 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.997548 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.997563 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 15:18:57 crc kubenswrapper[4719]: I1009 15:18:57.997571 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:57Z","lastTransitionTime":"2025-10-09T15:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.100802 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.100840 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.100854 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.100875 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.100889 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:58Z","lastTransitionTime":"2025-10-09T15:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.160808 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:18:58 crc kubenswrapper[4719]: E1009 15:18:58.160930 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.203369 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.203441 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.203452 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.203466 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.203477 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:58Z","lastTransitionTime":"2025-10-09T15:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.305850 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.305888 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.305899 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.305914 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.305925 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:58Z","lastTransitionTime":"2025-10-09T15:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.407933 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.407976 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.407986 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.408001 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.408009 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:58Z","lastTransitionTime":"2025-10-09T15:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.510270 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.510308 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.510320 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.510339 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.510378 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:58Z","lastTransitionTime":"2025-10-09T15:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.612252 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.612294 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.612304 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.612320 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.612331 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:58Z","lastTransitionTime":"2025-10-09T15:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.714474 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.714512 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.714521 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.714535 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.714544 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:58Z","lastTransitionTime":"2025-10-09T15:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.817549 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.817588 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.817601 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.817615 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.817627 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:58Z","lastTransitionTime":"2025-10-09T15:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.920659 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.920738 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.920766 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.920794 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:58 crc kubenswrapper[4719]: I1009 15:18:58.920813 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:58Z","lastTransitionTime":"2025-10-09T15:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.023651 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.023718 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.023741 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.023769 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.023789 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:59Z","lastTransitionTime":"2025-10-09T15:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.125832 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.125899 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.125917 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.126235 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.126279 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:59Z","lastTransitionTime":"2025-10-09T15:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.160741 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:18:59 crc kubenswrapper[4719]: E1009 15:18:59.160888 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.160908 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.160741 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:18:59 crc kubenswrapper[4719]: E1009 15:18:59.161043 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:18:59 crc kubenswrapper[4719]: E1009 15:18:59.161224 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.228510 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.228565 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.228581 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.228601 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.228618 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:59Z","lastTransitionTime":"2025-10-09T15:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.331122 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.331179 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.331197 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.331231 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.331250 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:59Z","lastTransitionTime":"2025-10-09T15:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.433880 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.433932 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.433941 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.433957 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.433967 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:59Z","lastTransitionTime":"2025-10-09T15:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.536341 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.536416 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.536433 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.536454 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.536516 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:59Z","lastTransitionTime":"2025-10-09T15:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.638855 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.638898 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.638912 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.638932 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.638951 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:59Z","lastTransitionTime":"2025-10-09T15:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.741711 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.741780 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.741802 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.741828 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.741849 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:59Z","lastTransitionTime":"2025-10-09T15:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.844142 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.844224 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.844259 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.844288 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.844309 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:59Z","lastTransitionTime":"2025-10-09T15:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.947045 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.947111 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.947129 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.947151 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:18:59 crc kubenswrapper[4719]: I1009 15:18:59.947168 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:18:59Z","lastTransitionTime":"2025-10-09T15:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.050389 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.050436 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.050445 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.050459 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.050467 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:00Z","lastTransitionTime":"2025-10-09T15:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.153414 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.153453 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.153462 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.153477 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.153487 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:00Z","lastTransitionTime":"2025-10-09T15:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.160887 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:19:00 crc kubenswrapper[4719]: E1009 15:19:00.161016 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.255269 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.255305 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.255313 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.255330 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.255339 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:00Z","lastTransitionTime":"2025-10-09T15:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.357731 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.357983 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.358045 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.358121 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.358182 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:00Z","lastTransitionTime":"2025-10-09T15:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.460636 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.460676 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.460688 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.460704 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.460715 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:00Z","lastTransitionTime":"2025-10-09T15:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.562282 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.562546 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.562646 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.562735 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.562815 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:00Z","lastTransitionTime":"2025-10-09T15:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.664921 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.665230 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.665311 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.665440 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.665527 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:00Z","lastTransitionTime":"2025-10-09T15:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.767315 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.767391 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.767403 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.767423 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.767434 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:00Z","lastTransitionTime":"2025-10-09T15:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.869931 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.869972 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.869986 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.870005 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.870017 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:00Z","lastTransitionTime":"2025-10-09T15:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.972673 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.972710 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.972719 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.972732 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:00 crc kubenswrapper[4719]: I1009 15:19:00.972741 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:00Z","lastTransitionTime":"2025-10-09T15:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.075587 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.075935 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.076074 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.076201 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.076324 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:01Z","lastTransitionTime":"2025-10-09T15:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.160726 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.160866 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:19:01 crc kubenswrapper[4719]: E1009 15:19:01.161110 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.160769 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:19:01 crc kubenswrapper[4719]: E1009 15:19:01.161247 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:19:01 crc kubenswrapper[4719]: E1009 15:19:01.161524 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.179094 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.179151 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.179164 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.179179 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.179189 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:01Z","lastTransitionTime":"2025-10-09T15:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.282014 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.282324 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.282437 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.282513 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.282651 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:01Z","lastTransitionTime":"2025-10-09T15:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.386664 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.387137 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.387204 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.387273 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.387331 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:01Z","lastTransitionTime":"2025-10-09T15:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.490084 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.490123 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.490136 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.490153 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.490168 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:01Z","lastTransitionTime":"2025-10-09T15:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.563908 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.564020 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.564039 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.564064 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.564126 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:01Z","lastTransitionTime":"2025-10-09T15:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:01 crc kubenswrapper[4719]: E1009 15:19:01.584169 4719 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d273987-9d8a-4a77-9956-ccb64e9e22c3\\\",\\\"systemUUID\\\":\\\"d18dc188-15d4-4547-94df-d9149082a3a0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:01Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.589126 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.589203 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.589228 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.589276 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.589299 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:01Z","lastTransitionTime":"2025-10-09T15:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:01 crc kubenswrapper[4719]: E1009 15:19:01.608426 4719 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d273987-9d8a-4a77-9956-ccb64e9e22c3\\\",\\\"systemUUID\\\":\\\"d18dc188-15d4-4547-94df-d9149082a3a0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:01Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.613367 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.613422 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.613436 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.613457 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.613470 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:01Z","lastTransitionTime":"2025-10-09T15:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:01 crc kubenswrapper[4719]: E1009 15:19:01.632453 4719 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d273987-9d8a-4a77-9956-ccb64e9e22c3\\\",\\\"systemUUID\\\":\\\"d18dc188-15d4-4547-94df-d9149082a3a0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:01Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.636300 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.636335 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.636344 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.636375 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.636386 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:01Z","lastTransitionTime":"2025-10-09T15:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:01 crc kubenswrapper[4719]: E1009 15:19:01.655025 4719 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d273987-9d8a-4a77-9956-ccb64e9e22c3\\\",\\\"systemUUID\\\":\\\"d18dc188-15d4-4547-94df-d9149082a3a0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:01Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.658850 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.658877 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.658885 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.658909 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.658918 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:01Z","lastTransitionTime":"2025-10-09T15:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:01 crc kubenswrapper[4719]: E1009 15:19:01.672122 4719 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d273987-9d8a-4a77-9956-ccb64e9e22c3\\\",\\\"systemUUID\\\":\\\"d18dc188-15d4-4547-94df-d9149082a3a0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:01Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:01 crc kubenswrapper[4719]: E1009 15:19:01.672289 4719 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.674003 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.674041 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.674050 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.674064 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.674075 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:01Z","lastTransitionTime":"2025-10-09T15:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.777468 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.777553 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.777568 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.777590 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.777605 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:01Z","lastTransitionTime":"2025-10-09T15:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.880389 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.880456 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.880467 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.880507 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.880519 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:01Z","lastTransitionTime":"2025-10-09T15:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.982786 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.982831 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.982845 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.982864 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:01 crc kubenswrapper[4719]: I1009 15:19:01.982880 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:01Z","lastTransitionTime":"2025-10-09T15:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.086033 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.086086 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.086100 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.086120 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.086163 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:02Z","lastTransitionTime":"2025-10-09T15:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.161128 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:19:02 crc kubenswrapper[4719]: E1009 15:19:02.161281 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.189241 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.189321 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.189337 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.189373 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.189388 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:02Z","lastTransitionTime":"2025-10-09T15:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.292827 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.292886 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.292903 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.292928 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.292945 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:02Z","lastTransitionTime":"2025-10-09T15:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.395956 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.396009 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.396025 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.396044 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.396058 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:02Z","lastTransitionTime":"2025-10-09T15:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.498539 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.498579 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.498590 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.498604 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.498613 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:02Z","lastTransitionTime":"2025-10-09T15:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.600637 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.600663 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.600672 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.600686 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.600695 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:02Z","lastTransitionTime":"2025-10-09T15:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.703708 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.703735 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.703745 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.703756 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.703765 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:02Z","lastTransitionTime":"2025-10-09T15:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.805654 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.805681 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.805689 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.805701 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.805709 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:02Z","lastTransitionTime":"2025-10-09T15:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.908167 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.908203 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.908213 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.908227 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:02 crc kubenswrapper[4719]: I1009 15:19:02.908235 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:02Z","lastTransitionTime":"2025-10-09T15:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.010780 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.010841 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.010852 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.010869 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.010880 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:03Z","lastTransitionTime":"2025-10-09T15:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.112797 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.112837 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.112849 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.112866 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.112877 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:03Z","lastTransitionTime":"2025-10-09T15:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.161265 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.161331 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.161796 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.162221 4719 scope.go:117] "RemoveContainer" containerID="0b0968bec2451c23d394ca65074ad152d78a54bd1dda603a35fb14f5e67af7ec" Oct 09 15:19:03 crc kubenswrapper[4719]: E1009 15:19:03.162217 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:19:03 crc kubenswrapper[4719]: E1009 15:19:03.162391 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:19:03 crc kubenswrapper[4719]: E1009 15:19:03.162306 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.216388 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.216835 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.216847 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.216866 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.216876 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:03Z","lastTransitionTime":"2025-10-09T15:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.318770 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.318799 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.318807 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.318820 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.318831 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:03Z","lastTransitionTime":"2025-10-09T15:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.420496 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.420554 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.420562 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.420578 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.420587 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:03Z","lastTransitionTime":"2025-10-09T15:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.477186 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zv8jk_fea6a48c-769c-41bf-95ce-649cc31eb4e5/ovnkube-controller/1.log" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.479573 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" event={"ID":"fea6a48c-769c-41bf-95ce-649cc31eb4e5","Type":"ContainerStarted","Data":"da91bde07d2b150f89890e9c8e745bac9308b61aa757606dd242a48a1c24fd84"} Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.479969 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.492784 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcba5218f1503f2b3776c66a92350381ee11aee043429d72c70b7ae63d7bb29f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd
47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63d557f6902338a7aa577f2bbee6a159369d62be9724425a6e6a355f08586601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-10-09T15:19:03Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.511367 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c0cb44eacc810e970c6b32e259ae1841fb312f20576d34ac183089a91000337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2246a5642d4fa1b9e182af8a19980e6a76aea32cc9669e7d30185d6672435b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59a6c607affaa28a2c8af16a995f53baf008a1efd42061bb5e3c01b5acac636a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a911f9dd87ad57268bacc90fd4b3821f54d4ad91fcdde7066d3706aa8feb4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80fe00a302db3a637794464b7cccf806ad3fa8efbdaea15f965ea41276188d1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5228008f4bbd33c0b6ea86640368c02b6cdf301b43494a232b37fa73ea72e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da91bde07d2b150f89890e9c8e745bac9308b61aa757606dd242a48a1c24fd84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b0968bec2451c23d394ca65074ad152d78a54bd1dda603a35fb14f5e67af7ec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T15:18:46Z\\\",\\\"message\\\":\\\" 8bc1afc2-8724-4135-84df-aee09f23af4c 4514 0 2025-02-23 05:12:24 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-operator] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true 
service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mco-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0077a47eb \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: machine-config-operator,},ClusterIP:10.217.4.183,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.183],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngre\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netn
s\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b32ef1116f7849b70aa3607bb4fc7b4bff9f58843c24742fc94aed9bb9a68e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"n
ame\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:03Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.521737 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mtpbz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb84e765-e2c6-410b-9681-7c14d88a2537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be972d47f7ee97f2f54daa73198a83327281f9e9b2b1500205a17cf11518989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18
:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfpkf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mtpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:03Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.523404 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.523497 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.523573 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.523686 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.523746 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:03Z","lastTransitionTime":"2025-10-09T15:19:03Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.532677 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834f7996-d1ce-470d-a1a5-0de5da2460d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a1b9cee40ae4a30df34bde2f4dd9436cf3ff915293ea1e1431e8abd581423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resou
rce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37949ed51a379d34fab6bf766fd7e35d376af137b55b6f12e8bef8495ab5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d165d88c0d88fb4b080bf594e5258fb74f33c521332c85bb9f5ef5b5d9fdab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed935aaa4f5122234731f8c22ec3d4ffeba8b500bfb51bf97414f39438da2f68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac1
17eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:03Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.544156 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:03Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.556537 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:03Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.568181 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a62b8142b6b6fd0cf9028590f2abce788d8e381c2303d7a824dd055ab02b94db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:03Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.586371 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-58bdp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d00237ae-ca20-4202-8e24-e4988fbf5269\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-58bdp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:03Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:03 crc 
kubenswrapper[4719]: I1009 15:19:03.616569 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:03Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.626049 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.626087 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.626096 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.626110 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.626119 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:03Z","lastTransitionTime":"2025-10-09T15:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.638538 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:03Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.654572 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcd9a40e12b42902018ecf483e6b42dfa415e4d6e282fc57eacbf507922dbd45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://275fc
3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:03Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.675994 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:03Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.687198 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vdgtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b565dc-6ccc-4404-95f7-c8cf09f91802\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c126340dff33c7a571fc152c4c8ed154e104aaab937ba7f68070763d79825b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6df96c88745808317300d950f2d991691695576773b7de02958ec718445cc3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vdgtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-09T15:19:03Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.699905 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40166218-2855-45ef-b0e1-0fed4e3e2fde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01dc78fd80a15fa8151128108a351c6af42928695fdd745dea50e08fae6570ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc62bf1b49b2a4b402b2fcca31f9fe1663b36f463a0722a5876b2ca2a8e023ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://51b618cce898bc89b4b07b6f7fd73567d719ad9c9dc3a2a3959074bc2c2fe11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6f2af57f612bf33446a88a0a093adb3b64f562412d9a0bd03f3964c281ba4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 15:18:34.791014 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 15:18:34.791138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:34.792247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2043225324/tls.crt::/tmp/serving-cert-2043225324/tls.key\\\\\\\"\\\\nI1009 15:18:35.029901 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 15:18:35.033427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 15:18:35.033448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 15:18:35.033473 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 15:18:35.033481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 15:18:35.045206 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 15:18:35.045257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045266 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045273 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 15:18:35.045277 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 15:18:35.045280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 15:18:35.045285 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1009 15:18:35.045414 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1009 15:18:35.048459 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8544f7060b0b2c2885dcbdffbd744be5f028d8df543732ba79eb7cd3911afca6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:03Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.709497 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ae19d921bad282d96efffc7f2f7cfdc4b70f95932e69b9955ad1439a936d5c\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b3908283c24f180df8f6a04d52c46e7252cdfd4f0587f7cccf3e9a0f37127a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:03Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.719053 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2199f3e31d7adde6f0b1aaf29a7f3da80a45d8a1f11908fd93b47d737b00872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\
\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:03Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.728022 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.728053 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.728063 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.728078 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.728091 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:03Z","lastTransitionTime":"2025-10-09T15:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.729254 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab52b5e80f5f2de90ce76b34b21de83b3880ed13436c566f2c460bed1908576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:03Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.830444 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.830492 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.830500 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.830518 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.830528 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:03Z","lastTransitionTime":"2025-10-09T15:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.932255 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.932305 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.932318 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.932334 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:03 crc kubenswrapper[4719]: I1009 15:19:03.932343 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:03Z","lastTransitionTime":"2025-10-09T15:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.045721 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.045771 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.045794 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.045827 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.045851 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:04Z","lastTransitionTime":"2025-10-09T15:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.148788 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.148845 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.148868 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.148896 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.148916 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:04Z","lastTransitionTime":"2025-10-09T15:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.160340 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:19:04 crc kubenswrapper[4719]: E1009 15:19:04.160608 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.251899 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.252215 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.252245 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.252277 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.252295 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:04Z","lastTransitionTime":"2025-10-09T15:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.355757 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.355845 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.355863 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.355889 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.355906 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:04Z","lastTransitionTime":"2025-10-09T15:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.457681 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d00237ae-ca20-4202-8e24-e4988fbf5269-metrics-certs\") pod \"network-metrics-daemon-58bdp\" (UID: \"d00237ae-ca20-4202-8e24-e4988fbf5269\") " pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:19:04 crc kubenswrapper[4719]: E1009 15:19:04.457803 4719 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 15:19:04 crc kubenswrapper[4719]: E1009 15:19:04.457844 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d00237ae-ca20-4202-8e24-e4988fbf5269-metrics-certs podName:d00237ae-ca20-4202-8e24-e4988fbf5269 nodeName:}" failed. No retries permitted until 2025-10-09 15:19:20.457830389 +0000 UTC m=+65.967541674 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d00237ae-ca20-4202-8e24-e4988fbf5269-metrics-certs") pod "network-metrics-daemon-58bdp" (UID: "d00237ae-ca20-4202-8e24-e4988fbf5269") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.458292 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.458346 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.458427 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.458453 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.458629 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:04Z","lastTransitionTime":"2025-10-09T15:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.485454 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zv8jk_fea6a48c-769c-41bf-95ce-649cc31eb4e5/ovnkube-controller/2.log" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.486852 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zv8jk_fea6a48c-769c-41bf-95ce-649cc31eb4e5/ovnkube-controller/1.log" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.491162 4719 generic.go:334] "Generic (PLEG): container finished" podID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerID="da91bde07d2b150f89890e9c8e745bac9308b61aa757606dd242a48a1c24fd84" exitCode=1 Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.491227 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" event={"ID":"fea6a48c-769c-41bf-95ce-649cc31eb4e5","Type":"ContainerDied","Data":"da91bde07d2b150f89890e9c8e745bac9308b61aa757606dd242a48a1c24fd84"} Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.491285 4719 scope.go:117] "RemoveContainer" containerID="0b0968bec2451c23d394ca65074ad152d78a54bd1dda603a35fb14f5e67af7ec" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.493305 4719 scope.go:117] "RemoveContainer" containerID="da91bde07d2b150f89890e9c8e745bac9308b61aa757606dd242a48a1c24fd84" Oct 09 15:19:04 crc kubenswrapper[4719]: E1009 15:19:04.493647 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-zv8jk_openshift-ovn-kubernetes(fea6a48c-769c-41bf-95ce-649cc31eb4e5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.513765 4719 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:04Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.531731 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:04Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.553074 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:04Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.560791 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.560836 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.560857 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.560882 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.560898 4719 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:04Z","lastTransitionTime":"2025-10-09T15:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.571093 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a62b8142b6b6fd0cf9028590f2abce788d8e381c2303d7a824dd055ab02b94db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:04Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.587798 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-58bdp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d00237ae-ca20-4202-8e24-e4988fbf5269\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-58bdp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:04Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:04 crc 
kubenswrapper[4719]: I1009 15:19:04.624018 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:04Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.638881 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ae19d921bad282d96efffc7f2f7cfdc4b70f95932e69b9955ad1439a936d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b3908283c24f180df8f6a04d52c46e7252cdfd
4f0587f7cccf3e9a0f37127a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:04Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.659193 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcd9a40e12b42902018ecf483e6b42dfa415e4d6e282fc57eacbf507922dbd45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://275fc
3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:04Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.663893 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.663951 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.663967 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.663990 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.664033 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:04Z","lastTransitionTime":"2025-10-09T15:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.673030 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:04Z 
is after 2025-08-24T17:21:41Z" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.695051 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vdgtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b565dc-6ccc-4404-95f7-c8cf09f91802\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c126340dff33c7a571fc152c4c8ed154e104aaab937ba7f68070763d79825b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6df96c88745808317300d950f2d991691695576773b7de02958ec718445cc3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vdgtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:04Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.716751 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40166218-2855-45ef-b0e1-0fed4e3e2fde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01dc78fd80a15fa8151128108a351c6af42928695fdd745dea50e08fae6570ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc62bf1b49b2a4b402b2fcca31f9fe1663b36f463a0722a5876b2ca2a8e023ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51b618cce898bc89b4b07b6f7fd73567d719ad9c9dc3a2a3959074bc2c2fe11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6f2af57f612bf33446a88a0a093adb3b64f562412d9a0bd03f3964c281ba4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 15:18:34.791014 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 15:18:34.791138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:34.792247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2043225324/tls.crt::/tmp/serving-cert-2043225324/tls.key\\\\\\\"\\\\nI1009 15:18:35.029901 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 15:18:35.033427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 15:18:35.033448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 15:18:35.033473 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 15:18:35.033481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 15:18:35.045206 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 15:18:35.045257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045266 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045273 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 15:18:35.045277 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 15:18:35.045280 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 15:18:35.045285 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1009 15:18:35.045414 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1009 15:18:35.048459 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8544f7060b0b2c2885dcbdffbd744be5f028d8df543732ba79eb7cd3911afca6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:04Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.733224 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab52b5e80f5f2de90ce76b34b21de83b3880ed13436c566f2c460bed1908576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:04Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.747039 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2199f3e31d7adde6f0b1aaf29a7f3da80a45d8a1f11908fd93b47d737b00872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:04Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.766177 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.766232 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.766246 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.766268 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.766283 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:04Z","lastTransitionTime":"2025-10-09T15:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.771450 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcba5218f1503f2b3776c66a92350381ee11aee043429d72c70b7ae63d7bb29f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63d557f6902338a7aa577f2bbee6a159369d62be9724425a6e6a355f08586601\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:04Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.792992 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c0cb44eacc810e970c6b32e259ae1841fb312f20576d34ac183089a91000337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2246a5642d4fa1b9e182af8a19980e6a76aea32cc9669e7d30185d6672435b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59a6c607affaa28a2c8af16a995f53baf008a1efd42061bb5e3c01b5acac636a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a911f9dd87ad57268bacc90fd4b3821f54d4ad91fcdde7066d3706aa8feb4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80fe00a302db3a637794464b7cccf806ad3fa8efbdaea15f965ea41276188d1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5228008f4bbd33c0b6ea86640368c02b6cdf301b43494a232b37fa73ea72e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da91bde07d2b150f89890e9c8e745bac9308b61aa757606dd242a48a1c24fd84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b0968bec2451c23d394ca65074ad152d78a54bd1dda603a35fb14f5e67af7ec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T15:18:46Z\\\",\\\"message\\\":\\\" 8bc1afc2-8724-4135-84df-aee09f23af4c 4514 0 2025-02-23 05:12:24 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-operator] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true 
service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mco-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0077a47eb \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: machine-config-operator,},ClusterIP:10.217.4.183,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.183],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngre\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da91bde07d2b150f89890e9c8e745bac9308b61aa757606dd242a48a1c24fd84\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T15:19:04Z\\\",\\\"message\\\":\\\"2025-08-24T17:21:41Z]\\\\nI1009 15:19:04.075995 6407 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/community-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"d389393c-7ba9-422c-b3f5-06e391d537d2\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/community-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.189\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1009 15:19:04.076046 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b32ef1116f7849b70aa3607bb4fc7b4bff9f58843c24742fc94aed9bb9a68e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:04Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.805599 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mtpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb84e765-e2c6-410b-9681-7c14d88a2537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be972d47f7ee97f2f54daa73198a83327281f9e9b2b1500205a17cf11518989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfpkf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mtpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:04Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.819740 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834f7996-d1ce-470d-a1a5-0de5da2460d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a1b9cee40ae4a30df34bde2f4dd9436cf3ff915293ea1e1431e8abd581423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37949ed51a379d34fab6bf766fd7e35d376af137b55b6f12e8bef8495ab5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d165d88c0d88fb4b080bf594e5258fb74f33c521332c85bb9f5ef5b5d9fdab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed935aaa4f5122234731f8c22ec3d4ffeba8b500bfb51bf97414f39438da2f68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:04Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.868861 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.868915 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.868932 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.868952 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.868965 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:04Z","lastTransitionTime":"2025-10-09T15:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.971386 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.971422 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.971433 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.971449 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:04 crc kubenswrapper[4719]: I1009 15:19:04.971460 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:04Z","lastTransitionTime":"2025-10-09T15:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.074377 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.074420 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.074432 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.074451 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.074465 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:05Z","lastTransitionTime":"2025-10-09T15:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.161011 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.161035 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:19:05 crc kubenswrapper[4719]: E1009 15:19:05.161125 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:19:05 crc kubenswrapper[4719]: E1009 15:19:05.161180 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.161246 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:19:05 crc kubenswrapper[4719]: E1009 15:19:05.161408 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.174542 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a62b8142b6b6fd0cf9028590f2abce788d8e381c2303d7a824dd055ab02b94db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets
/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:05Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.176173 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.176276 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.176477 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.176530 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.176548 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:05Z","lastTransitionTime":"2025-10-09T15:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.184867 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-58bdp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d00237ae-ca20-4202-8e24-e4988fbf5269\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-58bdp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:05Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:05 crc 
kubenswrapper[4719]: I1009 15:19:05.201181 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:05Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.225299 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:05Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.251816 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:05Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.274707 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:05Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.278539 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.278573 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.278585 4719 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.278605 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.278622 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:05Z","lastTransitionTime":"2025-10-09T15:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.286505 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vdgtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b565dc-6ccc-4404-95f7-c8cf09f91802\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c126340d
ff33c7a571fc152c4c8ed154e104aaab937ba7f68070763d79825b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6df96c88745808317300d950f2d991691695576773b7de02958ec718445cc3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vdgtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:05Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.299642 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40166218-2855-45ef-b0e1-0fed4e3e2fde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01dc78fd80a15fa8151128108a351c6af42928695fdd745dea50e08fae6570ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc62bf1b49b2a4b402b2fcca31f9fe1663b36f463a0722a5876b2ca2a8e023ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://51b618cce898bc89b4b07b6f7fd73567d719ad9c9dc3a2a3959074bc2c2fe11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6f2af57f612bf33446a88a0a093adb3b64f562412d9a0bd03f3964c281ba4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 15:18:34.791014 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 15:18:34.791138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:34.792247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2043225324/tls.crt::/tmp/serving-cert-2043225324/tls.key\\\\\\\"\\\\nI1009 15:18:35.029901 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 15:18:35.033427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 15:18:35.033448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 15:18:35.033473 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 15:18:35.033481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 15:18:35.045206 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 15:18:35.045257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045266 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045273 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 15:18:35.045277 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 15:18:35.045280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 15:18:35.045285 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1009 15:18:35.045414 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1009 15:18:35.048459 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8544f7060b0b2c2885dcbdffbd744be5f028d8df543732ba79eb7cd3911afca6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:05Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.312914 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ae19d921bad282d96efffc7f2f7cfdc4b70f95932e69b9955ad1439a936d5c\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b3908283c24f180df8f6a04d52c46e7252cdfd4f0587f7cccf3e9a0f37127a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:05Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.325259 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcd9a40e12b42902018ecf483e6b42dfa415e4d6e282fc57eacbf507922dbd45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-
additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df
312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:05Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.336117 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:05Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.348525 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab52b5e80f5f2de90ce76b34b21de83b3880ed13436c566f2c460bed1908576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:05Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.361508 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2199f3e31d7adde6f0b1aaf29a7f3da80a45d8a1f11908fd93b47d737b00872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:05Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.373153 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mtpbz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb84e765-e2c6-410b-9681-7c14d88a2537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be972d47f7ee97f2f54daa73198a83327281f9e9b2b1500205a17cf11518989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfpkf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mtpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:05Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.380050 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.380106 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.380124 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.380146 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.380161 4719 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:05Z","lastTransitionTime":"2025-10-09T15:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.385240 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834f7996-d1ce-470d-a1a5-0de5da2460d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a1b9cee40ae4a30df34bde2f4dd9436cf3ff915293ea1e1431e8abd581423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37949ed51a379d34fab6bf766fd7e35d376af137b55b6f12e8bef8495ab5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d165d88c0d88fb4b080bf594e5258fb74f33c521332c85bb9f5ef5b5d9fdab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://ed935aaa4f5122234731f8c22ec3d4ffeba8b500bfb51bf97414f39438da2f68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:05Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.399937 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcba5218f1503f2b3776c66a92350381ee11aee043429d72c70b7ae63d7bb29f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63d557f6902338a7aa577f2bbee6a159369d62be9724425a6e6a355f08586601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:05Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.418902 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c0cb44eacc810e970c6b32e259ae1841fb312f20576d34ac183089a91000337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2246a5642d4fa1b9e182af8a19980e6a76aea32cc9669e7d30185d6672435b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59a6c607affaa28a2c8af16a995f53baf008a1efd42061bb5e3c01b5acac636a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a911f9dd87ad57268bacc90fd4b3821f54d4ad91fcdde7066d3706aa8feb4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80fe00a302db3a637794464b7cccf806ad3fa8efbdaea15f965ea41276188d1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5228008f4bbd33c0b6ea86640368c02b6cdf301b43494a232b37fa73ea72e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da91bde07d2b150f89890e9c8e745bac9308b61aa757606dd242a48a1c24fd84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b0968bec2451c23d394ca65074ad152d78a54bd1dda603a35fb14f5e67af7ec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T15:18:46Z\\\",\\\"message\\\":\\\" 8bc1afc2-8724-4135-84df-aee09f23af4c 4514 0 2025-02-23 05:12:24 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-operator] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true 
service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mco-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0077a47eb \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: machine-config-operator,},ClusterIP:10.217.4.183,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.183],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngre\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da91bde07d2b150f89890e9c8e745bac9308b61aa757606dd242a48a1c24fd84\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T15:19:04Z\\\",\\\"message\\\":\\\"2025-08-24T17:21:41Z]\\\\nI1009 15:19:04.075995 6407 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/community-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"d389393c-7ba9-422c-b3f5-06e391d537d2\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/community-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.189\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1009 15:19:04.076046 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:19:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b32ef1116f7849b70aa3607bb4fc7b4bff9f58843c24742fc94aed9bb9a68e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:05Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.482936 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.482975 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.482985 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.483001 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.483010 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:05Z","lastTransitionTime":"2025-10-09T15:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.495005 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zv8jk_fea6a48c-769c-41bf-95ce-649cc31eb4e5/ovnkube-controller/2.log" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.498256 4719 scope.go:117] "RemoveContainer" containerID="da91bde07d2b150f89890e9c8e745bac9308b61aa757606dd242a48a1c24fd84" Oct 09 15:19:05 crc kubenswrapper[4719]: E1009 15:19:05.498415 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-zv8jk_openshift-ovn-kubernetes(fea6a48c-769c-41bf-95ce-649cc31eb4e5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.510471 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40166218-2855-45ef-b0e1-0fed4e3e2fde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01dc78fd80a15fa8151128108a351c6af42928695fdd745dea50e08fae6570ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc62bf1b49b2a4b402b2fcca31f9fe1663b36f463a0722a5876b2ca2a8e023ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51b618cce898bc89b4b07b6f7fd73567d719ad9c9dc3a2a3959074bc2c2fe11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6f2af57f612bf33446a88a0a093adb3b64f562412d9a0bd03f3964c281ba4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 15:18:34.791014 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 15:18:34.791138 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:34.792247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2043225324/tls.crt::/tmp/serving-cert-2043225324/tls.key\\\\\\\"\\\\nI1009 15:18:35.029901 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 15:18:35.033427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 15:18:35.033448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 15:18:35.033473 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 15:18:35.033481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 15:18:35.045206 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 15:18:35.045257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045266 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045273 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 15:18:35.045277 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 15:18:35.045280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 15:18:35.045285 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1009 15:18:35.045414 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1009 15:18:35.048459 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8544f7060b0b2c2885dcbdffbd744be5f028d8df543732ba79eb7cd3911afca6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:05Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.519967 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ae19d921bad282d96efffc7f2f7cfdc4b70f95932e69b9955ad1439a936d5c\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b3908283c24f180df8f6a04d52c46e7252cdfd4f0587f7cccf3e9a0f37127a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:05Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.531816 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcd9a40e12b42902018ecf483e6b42dfa415e4d6e282fc57eacbf507922dbd45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-
additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df
312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:05Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.542753 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:05Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.551189 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vdgtp" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b565dc-6ccc-4404-95f7-c8cf09f91802\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c126340dff33c7a571fc152c4c8ed154e104aaab937ba7f68070763d79825b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6df96c887
45808317300d950f2d991691695576773b7de02958ec718445cc3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vdgtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:05Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.562256 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab52b5e80f5f2de90ce76b34b21de83b3880ed13436c566f2c460bed1908576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:05Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.572196 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2199f3e31d7adde6f0b1aaf29a7f3da80a45d8a1f11908fd93b47d737b00872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:05Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.582592 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834f7996-d1ce-470d-a1a5-0de5da2460d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a1b9cee40ae4a30df34bde2f4dd9436cf3ff915293ea1e1431e8abd581423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"rest
artCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37949ed51a379d34fab6bf766fd7e35d376af137b55b6f12e8bef8495ab5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d165d88c0d88fb4b080bf594e5258fb74f33c521332c85bb9f5ef5b5d9fdab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed935aaa4f5122234731f8c22ec3d4ffeba8b500bfb51bf97414f39438da2f68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:05Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.585288 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.585330 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.585341 4719 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.585377 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.585396 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:05Z","lastTransitionTime":"2025-10-09T15:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.592939 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcba5218f1503f2b3776c66a92350381ee11aee043429d72c70b7ae63d7bb29f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b15
4edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63d557f6902338a7aa577f2bbee6a159369d62be9724425a6e6a355f08586601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:05Z is 
after 2025-08-24T17:21:41Z" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.609252 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c0cb44eacc810e970c6b32e259ae1841fb312f20576d34ac183089a91000337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2246a5642d4fa1b9e182af8a19980e6a76aea32cc9669e7d30185d6672435b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59a6c607affaa28a2c8af16a995f53baf008a1efd42061bb5e3c01b5acac636a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a911f9dd87ad57268bacc90fd4b3821f54d4ad91fcdde7066d3706aa8feb4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80fe00a302db3a637794464b7cccf806ad3fa8efbdaea15f965ea41276188d1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5228008f4bbd33c0b6ea86640368c02b6cdf301b43494a232b37fa73ea72e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da91bde07d2b150f89890e9c8e745bac9308b61aa757606dd242a48a1c24fd84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da91bde07d2b150f89890e9c8e745bac9308b61aa757606dd242a48a1c24fd84\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T15:19:04Z\\\",\\\"message\\\":\\\"2025-08-24T17:21:41Z]\\\\nI1009 15:19:04.075995 6407 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/community-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"d389393c-7ba9-422c-b3f5-06e391d537d2\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", 
ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/community-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.189\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1009 15:19:04.076046 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:19:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zv8jk_openshift-ovn-kubernetes(fea6a48c-769c-41bf-95ce-649cc31eb4e5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b32ef1116f7849b70aa3607bb4fc7b4bff9f58843c24742fc94aed9bb9a68e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9
692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:05Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.618104 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mtpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb84e765-e2c6-410b-9681-7c14d88a2537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be972d47f7ee97f2f54daa73198a83327281f9e9b2b1500205a17cf11518989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfpkf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mtpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:05Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.635284 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15
:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:05Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.647503 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:05Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.658603 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:05Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.668914 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:05Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.678005 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a62b8142b6b6fd0cf9028590f2abce788d8e381c2303d7a824dd055ab02b94db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:05Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.687333 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.687400 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.687420 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.687456 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.687468 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:05Z","lastTransitionTime":"2025-10-09T15:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.689333 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-58bdp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d00237ae-ca20-4202-8e24-e4988fbf5269\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-58bdp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:05Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:05 crc 
kubenswrapper[4719]: I1009 15:19:05.789882 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.789921 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.789931 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.789945 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.789953 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:05Z","lastTransitionTime":"2025-10-09T15:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.892857 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.893418 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.893434 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.893455 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.893470 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:05Z","lastTransitionTime":"2025-10-09T15:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.996794 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.996886 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.996926 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.996978 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:05 crc kubenswrapper[4719]: I1009 15:19:05.996996 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:05Z","lastTransitionTime":"2025-10-09T15:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.102115 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.102185 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.102207 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.102234 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.102254 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:06Z","lastTransitionTime":"2025-10-09T15:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.161200 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:19:06 crc kubenswrapper[4719]: E1009 15:19:06.161517 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.205319 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.205412 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.205429 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.205449 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.205464 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:06Z","lastTransitionTime":"2025-10-09T15:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.308020 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.308062 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.308072 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.308087 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.308097 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:06Z","lastTransitionTime":"2025-10-09T15:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.410468 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.410508 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.410519 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.410535 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.410545 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:06Z","lastTransitionTime":"2025-10-09T15:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.513644 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.513695 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.513715 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.513738 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.513753 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:06Z","lastTransitionTime":"2025-10-09T15:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.616275 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.616311 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.616321 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.616335 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.616360 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:06Z","lastTransitionTime":"2025-10-09T15:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.718738 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.718792 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.718806 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.718826 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.718839 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:06Z","lastTransitionTime":"2025-10-09T15:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.821280 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.821330 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.821377 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.821394 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.821403 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:06Z","lastTransitionTime":"2025-10-09T15:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.883706 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:19:06 crc kubenswrapper[4719]: E1009 15:19:06.883849 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-09 15:19:38.883825898 +0000 UTC m=+84.393537183 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.884118 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.884217 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:19:06 crc kubenswrapper[4719]: E1009 15:19:06.884248 4719 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 15:19:06 crc kubenswrapper[4719]: E1009 15:19:06.884405 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-09 15:19:38.884390196 +0000 UTC m=+84.394101481 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 15:19:06 crc kubenswrapper[4719]: E1009 15:19:06.884304 4719 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 15:19:06 crc kubenswrapper[4719]: E1009 15:19:06.884557 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 15:19:38.884549071 +0000 UTC m=+84.394260356 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.923382 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.923429 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.923440 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.923455 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.923464 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:06Z","lastTransitionTime":"2025-10-09T15:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.985085 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:19:06 crc kubenswrapper[4719]: I1009 15:19:06.985414 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:19:06 crc kubenswrapper[4719]: E1009 15:19:06.985279 4719 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 15:19:06 crc kubenswrapper[4719]: E1009 15:19:06.985723 4719 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 15:19:06 crc kubenswrapper[4719]: E1009 15:19:06.985831 4719 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 15:19:06 crc kubenswrapper[4719]: E1009 15:19:06.985549 4719 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 15:19:06 crc 
kubenswrapper[4719]: E1009 15:19:06.985925 4719 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 15:19:06 crc kubenswrapper[4719]: E1009 15:19:06.985936 4719 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 15:19:06 crc kubenswrapper[4719]: E1009 15:19:06.985983 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-09 15:19:38.985968712 +0000 UTC m=+84.495679997 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 15:19:06 crc kubenswrapper[4719]: E1009 15:19:06.986101 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-09 15:19:38.986091296 +0000 UTC m=+84.495802581 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.025930 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.025962 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.025973 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.025986 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.025996 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:07Z","lastTransitionTime":"2025-10-09T15:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.128300 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.128330 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.128339 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.128373 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.128385 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:07Z","lastTransitionTime":"2025-10-09T15:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.160487 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:19:07 crc kubenswrapper[4719]: E1009 15:19:07.160687 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.160597 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:19:07 crc kubenswrapper[4719]: E1009 15:19:07.161004 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.160510 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:19:07 crc kubenswrapper[4719]: E1009 15:19:07.161626 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.231694 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.231738 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.231750 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.231766 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.231778 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:07Z","lastTransitionTime":"2025-10-09T15:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.334329 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.334400 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.334416 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.334436 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.334450 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:07Z","lastTransitionTime":"2025-10-09T15:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.436871 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.436905 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.436915 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.436930 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.436940 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:07Z","lastTransitionTime":"2025-10-09T15:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.538739 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.538780 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.538792 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.538808 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.538820 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:07Z","lastTransitionTime":"2025-10-09T15:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.641061 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.641095 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.641110 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.641129 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.641141 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:07Z","lastTransitionTime":"2025-10-09T15:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.743612 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.743674 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.743683 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.743699 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.743709 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:07Z","lastTransitionTime":"2025-10-09T15:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.845839 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.845880 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.845889 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.845904 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.845913 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:07Z","lastTransitionTime":"2025-10-09T15:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.948471 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.948512 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.948521 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.948539 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:07 crc kubenswrapper[4719]: I1009 15:19:07.948556 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:07Z","lastTransitionTime":"2025-10-09T15:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.050419 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.050468 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.050483 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.050498 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.050508 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:08Z","lastTransitionTime":"2025-10-09T15:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.153470 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.153683 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.153774 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.153836 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.153890 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:08Z","lastTransitionTime":"2025-10-09T15:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.160714 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:19:08 crc kubenswrapper[4719]: E1009 15:19:08.160829 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.256239 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.256289 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.256304 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.256319 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.256330 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:08Z","lastTransitionTime":"2025-10-09T15:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.358781 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.358837 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.358846 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.358859 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.358873 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:08Z","lastTransitionTime":"2025-10-09T15:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.461407 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.461450 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.461459 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.461474 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.461485 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:08Z","lastTransitionTime":"2025-10-09T15:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.563464 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.563503 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.563513 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.563528 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.563536 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:08Z","lastTransitionTime":"2025-10-09T15:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.665137 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.665179 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.665190 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.665205 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.665216 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:08Z","lastTransitionTime":"2025-10-09T15:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.767420 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.767459 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.767469 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.767485 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.767496 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:08Z","lastTransitionTime":"2025-10-09T15:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.870087 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.870136 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.870147 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.870164 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.870175 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:08Z","lastTransitionTime":"2025-10-09T15:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.972494 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.972533 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.972544 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.972561 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:08 crc kubenswrapper[4719]: I1009 15:19:08.972571 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:08Z","lastTransitionTime":"2025-10-09T15:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.074071 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.074103 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.074119 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.074139 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.074150 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:09Z","lastTransitionTime":"2025-10-09T15:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.160285 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:19:09 crc kubenswrapper[4719]: E1009 15:19:09.160438 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.160502 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:19:09 crc kubenswrapper[4719]: E1009 15:19:09.160575 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.160285 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:19:09 crc kubenswrapper[4719]: E1009 15:19:09.160644 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.175496 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.175536 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.175546 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.175559 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.175568 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:09Z","lastTransitionTime":"2025-10-09T15:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.277412 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.277446 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.277457 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.277473 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.277485 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:09Z","lastTransitionTime":"2025-10-09T15:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.379465 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.379503 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.379522 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.379540 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.379551 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:09Z","lastTransitionTime":"2025-10-09T15:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.481261 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.481301 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.481310 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.481325 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.481336 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:09Z","lastTransitionTime":"2025-10-09T15:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.583810 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.583852 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.583866 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.583889 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.583901 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:09Z","lastTransitionTime":"2025-10-09T15:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.686284 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.686319 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.686327 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.686340 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.686373 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:09Z","lastTransitionTime":"2025-10-09T15:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.788550 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.788594 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.788607 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.788625 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.788635 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:09Z","lastTransitionTime":"2025-10-09T15:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.872196 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.884559 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:09Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.891370 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.891404 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.891414 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.891427 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.891437 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:09Z","lastTransitionTime":"2025-10-09T15:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.897816 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:09Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.906155 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a62b8142b6b6fd0cf9028590f2abce788d8e381c2303d7a824dd055ab02b94db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:09Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.916009 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-58bdp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d00237ae-ca20-4202-8e24-e4988fbf5269\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-58bdp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:09Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:09 crc 
kubenswrapper[4719]: I1009 15:19:09.932022 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:09Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.942526 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:09Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.954313 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcd9a40e12b42902018ecf483e6b42dfa415e4d6e282fc57eacbf507922dbd45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://275fc
3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:09Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.966154 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:09Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.975440 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vdgtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b565dc-6ccc-4404-95f7-c8cf09f91802\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c126340dff33c7a571fc152c4c8ed154e104aaab937ba7f68070763d79825b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6df96c88745808317300d950f2d991691695576773b7de02958ec718445cc3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vdgtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-09T15:19:09Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.985558 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40166218-2855-45ef-b0e1-0fed4e3e2fde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01dc78fd80a15fa8151128108a351c6af42928695fdd745dea50e08fae6570ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"
name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc62bf1b49b2a4b402b2fcca31f9fe1663b36f463a0722a5876b2ca2a8e023ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51b618cce898bc89b4b07b6f7fd73567d719ad9c9dc3a2a3959074bc2c2fe11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6f2af57f612bf33446a88a0a093adb3b64f562412d9a0bd03f3964c281ba4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc
478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 15:18:34.791014 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 15:18:34.791138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:34.792247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2043225324/tls.crt::/tmp/serving-cert-2043225324/tls.key\\\\\\\"\\\\nI1009 15:18:35.029901 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 15:18:35.033427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 15:18:35.033448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 15:18:35.033473 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 15:18:35.033481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 15:18:35.045206 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 15:18:35.045257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045266 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045273 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 15:18:35.045277 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 15:18:35.045280 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 15:18:35.045285 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1009 15:18:35.045414 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1009 15:18:35.048459 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8544f7060b0b2c2885dcbdffbd744be5f028d8df543732ba79eb7cd3911afca6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:09Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.994004 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.994037 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.994047 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.994066 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.994079 4719 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:09Z","lastTransitionTime":"2025-10-09T15:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:09 crc kubenswrapper[4719]: I1009 15:19:09.994065 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ae19d921bad282d96efffc7f2f7cfdc4b70f95932e69b9955ad1439a936d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b3908283c24f180df8f6a04d52c46e7252cdfd4f0587f7cccf3e9a0f37127a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-09T15:19:09Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.002911 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2199f3e31d7adde6f0b1aaf29a7f3da80a45d8a1f11908fd93b47d737b00872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:10Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.012719 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab52b5e80f5f2de90ce76b34b21de83b3880ed13436c566f2c460bed1908576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiv
eReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:10Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.022492 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcba5218f1503f2b3776c66a92350381ee11aee043429d72c70b7ae63d7bb29f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63d557f6902338a7aa577f2bbee6a159369d62be9724425a6e6a355f08586601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:10Z is after 2025-08-24T17:21:41Z" 
Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.037902 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c0cb44eacc810e970c6b32e259ae1841fb312f20576d34ac183089a91000337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2246a5642d4fa1b9e182af8a19980e6a76aea32cc9669e7d30185d6672435b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59a6c607affaa28a2c8af16a995f53baf008a1efd42061bb5e3c01b5acac636a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a911f9dd87ad57268bacc90fd4b3821f54d4ad91fcdde7066d3706aa8feb4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80fe00a302db3a637794464b7cccf806ad3fa8efbdaea15f965ea41276188d1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5228008f4bbd33c0b6ea86640368c02b6cdf301b43494a232b37fa73ea72e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da91bde07d2b150f89890e9c8e745bac9308b61aa757606dd242a48a1c24fd84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da91bde07d2b150f89890e9c8e745bac9308b61aa757606dd242a48a1c24fd84\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T15:19:04Z\\\",\\\"message\\\":\\\"2025-08-24T17:21:41Z]\\\\nI1009 15:19:04.075995 6407 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/community-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"d389393c-7ba9-422c-b3f5-06e391d537d2\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", 
ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/community-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.189\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1009 15:19:04.076046 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:19:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zv8jk_openshift-ovn-kubernetes(fea6a48c-769c-41bf-95ce-649cc31eb4e5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b32ef1116f7849b70aa3607bb4fc7b4bff9f58843c24742fc94aed9bb9a68e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9
692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:10Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.046121 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mtpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb84e765-e2c6-410b-9681-7c14d88a2537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be972d47f7ee97f2f54daa73198a83327281f9e9b2b1500205a17cf11518989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfpkf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mtpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:10Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.056983 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834f7996-d1ce-470d-a1a5-0de5da2460d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a1b9cee40ae4a30df34bde2f4dd9436cf3ff915293ea1e1431e8abd581423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37949ed51a379d34fab6bf766fd7e35d376af137b55b6f12e8bef8495ab5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d165d88c0d88fb4b080bf594e5258fb74f33c521332c85bb9f5ef5b5d9fdab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed935aaa4f5122234731f8c22ec3d4ffeba8b500bfb51bf97414f39438da2f68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:10Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.095656 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.095678 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.095686 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.095718 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.095727 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:10Z","lastTransitionTime":"2025-10-09T15:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.160824 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:19:10 crc kubenswrapper[4719]: E1009 15:19:10.160962 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.197521 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.197573 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.197587 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.197604 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.197615 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:10Z","lastTransitionTime":"2025-10-09T15:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.299281 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.299335 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.299370 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.299388 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.299399 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:10Z","lastTransitionTime":"2025-10-09T15:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.401555 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.401620 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.401648 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.401665 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.401675 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:10Z","lastTransitionTime":"2025-10-09T15:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.504563 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.504623 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.504635 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.504658 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.504670 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:10Z","lastTransitionTime":"2025-10-09T15:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.606474 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.606508 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.606520 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.606535 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.606546 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:10Z","lastTransitionTime":"2025-10-09T15:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.708599 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.708634 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.708647 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.708662 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.708672 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:10Z","lastTransitionTime":"2025-10-09T15:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.810293 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.810329 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.810338 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.810368 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.810386 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:10Z","lastTransitionTime":"2025-10-09T15:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.912776 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.912832 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.912842 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.912859 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:10 crc kubenswrapper[4719]: I1009 15:19:10.912867 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:10Z","lastTransitionTime":"2025-10-09T15:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.015120 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.015174 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.015184 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.015198 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.015208 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:11Z","lastTransitionTime":"2025-10-09T15:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.056937 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.066879 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.082923 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runnin
g\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1
628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:11Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.094764 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:11Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.110750 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:11Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.117388 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.117418 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.117430 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.117445 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.117454 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:11Z","lastTransitionTime":"2025-10-09T15:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.122589 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:11Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.132744 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a62b8142b6b6fd0cf9028590f2abce788d8e381c2303d7a824dd055ab02b94db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:11Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.143273 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-58bdp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d00237ae-ca20-4202-8e24-e4988fbf5269\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-58bdp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:11Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:11 crc 
kubenswrapper[4719]: I1009 15:19:11.157870 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40166218-2855-45ef-b0e1-0fed4e3e2fde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01dc78fd80a15fa8151128108a351c6af42928695fdd745dea50e08fae6570ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc62bf1b49b2a4
b402b2fcca31f9fe1663b36f463a0722a5876b2ca2a8e023ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51b618cce898bc89b4b07b6f7fd73567d719ad9c9dc3a2a3959074bc2c2fe11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6f2af57f612bf33446a88a0a093adb3b64f562412d9a0bd03f3964c281ba4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 15:18:34.791014 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 15:18:34.791138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:34.792247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2043225324/tls.crt::/tmp/serving-cert-2043225324/tls.key\\\\\\\"\\\\nI1009 15:18:35.029901 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 15:18:35.033427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 15:18:35.033448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 15:18:35.033473 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 15:18:35.033481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 15:18:35.045206 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 15:18:35.045257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045266 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045273 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 15:18:35.045277 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 15:18:35.045280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 
15:18:35.045285 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1009 15:18:35.045414 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1009 15:18:35.048459 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8544f7060b0b2c2885dcbdffbd744be5f028d8df543732ba79eb7cd3911afca6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:11Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.160675 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.160728 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:19:11 crc kubenswrapper[4719]: E1009 15:19:11.160770 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.160795 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:19:11 crc kubenswrapper[4719]: E1009 15:19:11.160934 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:19:11 crc kubenswrapper[4719]: E1009 15:19:11.161198 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.170405 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ae19d921bad282d96efffc7f2f7cfdc4b70f95932e69b9955ad1439a936d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":
\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b3908283c24f180df8f6a04d52c46e7252cdfd4f0587f7cccf3e9a0f37127a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:11Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.188047 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcd9a40e12b42902018ecf483e6b42dfa415e4d6e282fc57eacbf507922dbd45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://275fc
3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:11Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.203546 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:11Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.216670 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vdgtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b565dc-6ccc-4404-95f7-c8cf09f91802\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c126340dff33c7a571fc152c4c8ed154e104aaab937ba7f68070763d79825b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6df96c88745808317300d950f2d991691695576773b7de02958ec718445cc3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vdgtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-09T15:19:11Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.219456 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.219486 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.219496 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.219565 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.219581 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:11Z","lastTransitionTime":"2025-10-09T15:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.233133 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab52b5e80f5f2de90ce76b34b21de83b3880ed13436c566f2c460bed1908576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:11Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.245045 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2199f3e31d7adde6f0b1aaf29a7f3da80a45d8a1f11908fd93b47d737b00872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:11Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.260536 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834f7996-d1ce-470d-a1a5-0de5da2460d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a1b9cee40ae4a30df34bde2f4dd9436cf3ff915293ea1e1431e8abd581423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10f
dee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37949ed51a379d34fab6bf766fd7e35d376af137b55b6f12e8bef8495ab5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d165d88c0d88fb4b080bf594e5258fb74f33c521332c85bb9f5ef5b5d9fdab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\
\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed935aaa4f5122234731f8c22ec3d4ffeba8b500bfb51bf97414f39438da2f68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:11Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.279138 4719 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcba5218f1503f2b3776c66a92350381ee11aee043429d72c70b7ae63d7bb29f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63d557f6902338a7aa577f2bbee6a159369d62be9724425a6e6a355f08586601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:11Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.306757 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c0cb44eacc810e970c6b32e259ae1841fb312f20576d34ac183089a91000337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2246a5642d4fa1b9e182af8a19980e6a76aea32cc9669e7d30185d6672435b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59a6c607affaa28a2c8af16a995f53baf008a1efd42061bb5e3c01b5acac636a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a911f9dd87ad57268bacc90fd4b3821f54d4ad91fcdde7066d3706aa8feb4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80fe00a302db3a637794464b7cccf806ad3fa8efbdaea15f965ea41276188d1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5228008f4bbd33c0b6ea86640368c02b6cdf301b43494a232b37fa73ea72e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da91bde07d2b150f89890e9c8e745bac9308b61aa757606dd242a48a1c24fd84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da91bde07d2b150f89890e9c8e745bac9308b61aa757606dd242a48a1c24fd84\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T15:19:04Z\\\",\\\"message\\\":\\\"2025-08-24T17:21:41Z]\\\\nI1009 15:19:04.075995 6407 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/community-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"d389393c-7ba9-422c-b3f5-06e391d537d2\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", 
ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/community-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.189\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1009 15:19:04.076046 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:19:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zv8jk_openshift-ovn-kubernetes(fea6a48c-769c-41bf-95ce-649cc31eb4e5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b32ef1116f7849b70aa3607bb4fc7b4bff9f58843c24742fc94aed9bb9a68e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9
692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:11Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.319300 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mtpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb84e765-e2c6-410b-9681-7c14d88a2537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be972d47f7ee97f2f54daa73198a83327281f9e9b2b1500205a17cf11518989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfpkf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mtpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:11Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.322645 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.322705 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.322729 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.322763 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.322787 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:11Z","lastTransitionTime":"2025-10-09T15:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.425597 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.425660 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.425672 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.425690 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.425700 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:11Z","lastTransitionTime":"2025-10-09T15:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.529454 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.529548 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.529628 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.529660 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.529678 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:11Z","lastTransitionTime":"2025-10-09T15:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.632122 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.632163 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.632172 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.632191 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.632202 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:11Z","lastTransitionTime":"2025-10-09T15:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.735850 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.735896 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.735906 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.735923 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.735933 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:11Z","lastTransitionTime":"2025-10-09T15:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.813317 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.813430 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.813459 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.813495 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.813517 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:11Z","lastTransitionTime":"2025-10-09T15:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:11 crc kubenswrapper[4719]: E1009 15:19:11.834079 4719 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d273987-9d8a-4a77-9956-ccb64e9e22c3\\\",\\\"systemUUID\\\":\\\"d18dc188-15d4-4547-94df-d9149082a3a0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:11Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.838615 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.838646 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.838655 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.838669 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.838681 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:11Z","lastTransitionTime":"2025-10-09T15:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:11 crc kubenswrapper[4719]: E1009 15:19:11.849900 4719 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d273987-9d8a-4a77-9956-ccb64e9e22c3\\\",\\\"systemUUID\\\":\\\"d18dc188-15d4-4547-94df-d9149082a3a0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:11Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.854851 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.854917 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.854935 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.854963 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.854984 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:11Z","lastTransitionTime":"2025-10-09T15:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:11 crc kubenswrapper[4719]: E1009 15:19:11.868179 4719 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d273987-9d8a-4a77-9956-ccb64e9e22c3\\\",\\\"systemUUID\\\":\\\"d18dc188-15d4-4547-94df-d9149082a3a0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:11Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.873492 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.873546 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.873559 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.873587 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.873602 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:11Z","lastTransitionTime":"2025-10-09T15:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:11 crc kubenswrapper[4719]: E1009 15:19:11.887051 4719 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d273987-9d8a-4a77-9956-ccb64e9e22c3\\\",\\\"systemUUID\\\":\\\"d18dc188-15d4-4547-94df-d9149082a3a0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:11Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.892779 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.892827 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.892837 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.892853 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.892924 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:11Z","lastTransitionTime":"2025-10-09T15:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:11 crc kubenswrapper[4719]: E1009 15:19:11.908117 4719 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d273987-9d8a-4a77-9956-ccb64e9e22c3\\\",\\\"systemUUID\\\":\\\"d18dc188-15d4-4547-94df-d9149082a3a0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:11Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:11 crc kubenswrapper[4719]: E1009 15:19:11.908313 4719 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.910220 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.910284 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.910296 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.910318 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:11 crc kubenswrapper[4719]: I1009 15:19:11.910335 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:11Z","lastTransitionTime":"2025-10-09T15:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.014062 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.014148 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.014168 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.014201 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.014224 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:12Z","lastTransitionTime":"2025-10-09T15:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.117107 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.117154 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.117178 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.117205 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.117220 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:12Z","lastTransitionTime":"2025-10-09T15:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.160344 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:19:12 crc kubenswrapper[4719]: E1009 15:19:12.160599 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.221555 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.221614 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.221630 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.221667 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.221685 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:12Z","lastTransitionTime":"2025-10-09T15:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.324442 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.324474 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.324482 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.324495 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.324503 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:12Z","lastTransitionTime":"2025-10-09T15:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.428590 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.428673 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.428691 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.428722 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.428747 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:12Z","lastTransitionTime":"2025-10-09T15:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.532810 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.532884 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.532907 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.532943 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.532966 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:12Z","lastTransitionTime":"2025-10-09T15:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.637164 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.637252 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.637272 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.637300 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.637319 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:12Z","lastTransitionTime":"2025-10-09T15:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.740058 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.740123 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.740136 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.740154 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.740168 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:12Z","lastTransitionTime":"2025-10-09T15:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.843220 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.843271 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.843286 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.843307 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.843320 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:12Z","lastTransitionTime":"2025-10-09T15:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.947250 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.947293 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.947305 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.947337 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:12 crc kubenswrapper[4719]: I1009 15:19:12.947368 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:12Z","lastTransitionTime":"2025-10-09T15:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.050610 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.050657 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.050667 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.050687 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.050697 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:13Z","lastTransitionTime":"2025-10-09T15:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.152932 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.153015 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.153033 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.153060 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.153077 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:13Z","lastTransitionTime":"2025-10-09T15:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.160255 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.160318 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:19:13 crc kubenswrapper[4719]: E1009 15:19:13.160436 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.160321 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:19:13 crc kubenswrapper[4719]: E1009 15:19:13.160531 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:19:13 crc kubenswrapper[4719]: E1009 15:19:13.160618 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.255865 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.255904 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.255914 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.255930 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.255942 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:13Z","lastTransitionTime":"2025-10-09T15:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.358944 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.359015 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.359026 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.359043 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.359056 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:13Z","lastTransitionTime":"2025-10-09T15:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.461582 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.461653 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.461662 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.461678 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.461687 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:13Z","lastTransitionTime":"2025-10-09T15:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.564657 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.564711 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.564724 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.564739 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.564751 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:13Z","lastTransitionTime":"2025-10-09T15:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.667767 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.667807 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.667816 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.667829 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.667837 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:13Z","lastTransitionTime":"2025-10-09T15:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.770787 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.770829 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.770840 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.770855 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.770868 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:13Z","lastTransitionTime":"2025-10-09T15:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.873116 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.873153 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.873166 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.873183 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.873192 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:13Z","lastTransitionTime":"2025-10-09T15:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.976384 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.976447 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.976460 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.976480 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:13 crc kubenswrapper[4719]: I1009 15:19:13.976492 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:13Z","lastTransitionTime":"2025-10-09T15:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.080243 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.080291 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.080303 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.080325 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.080342 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:14Z","lastTransitionTime":"2025-10-09T15:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.160611 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:19:14 crc kubenswrapper[4719]: E1009 15:19:14.160840 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.183868 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.183929 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.183941 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.183960 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.183972 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:14Z","lastTransitionTime":"2025-10-09T15:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.287119 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.287175 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.287192 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.287218 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.287236 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:14Z","lastTransitionTime":"2025-10-09T15:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.391062 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.391163 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.391191 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.391235 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.391262 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:14Z","lastTransitionTime":"2025-10-09T15:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.494924 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.494971 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.494980 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.494995 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.495004 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:14Z","lastTransitionTime":"2025-10-09T15:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.596790 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.596826 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.596857 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.596870 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.596878 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:14Z","lastTransitionTime":"2025-10-09T15:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.699913 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.699970 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.699984 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.700004 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.700015 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:14Z","lastTransitionTime":"2025-10-09T15:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.803497 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.803548 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.803557 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.803574 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.803589 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:14Z","lastTransitionTime":"2025-10-09T15:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.905578 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.905613 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.905621 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.905635 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:14 crc kubenswrapper[4719]: I1009 15:19:14.905650 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:14Z","lastTransitionTime":"2025-10-09T15:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.008050 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.008096 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.008113 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.008134 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.008147 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:15Z","lastTransitionTime":"2025-10-09T15:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.110255 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.110295 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.110311 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.110330 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.110340 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:15Z","lastTransitionTime":"2025-10-09T15:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.160812 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.160958 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.161005 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:19:15 crc kubenswrapper[4719]: E1009 15:19:15.161057 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:19:15 crc kubenswrapper[4719]: E1009 15:19:15.161241 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:19:15 crc kubenswrapper[4719]: E1009 15:19:15.161329 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.172975 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mtpbz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb84e765-e2c6-410b-9681-7c14d88a2537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be972d47f7ee97f2f54daa73198a83327281f9e9b2b1500205a17cf11518989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.
d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfpkf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mtpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:15Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.184529 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"834f7996-d1ce-470d-a1a5-0de5da2460d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a1b9cee40ae4a30df34bde2f4dd9436cf3ff915293ea1e1431e8abd581423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37949ed51a379d34fab6bf766fd7e35d376af137b55b6f12e8bef8495ab5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d165d88c0d88fb4b080bf594e5258fb74f33c521332c85bb9f5ef5b5d9fdab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed935aaa4f5122234731f8c22ec3d4ffeba8b500bfb51bf97414f39438da2f68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:15Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.196789 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd31818-5445-47d9-af8f-fa49dde2a7ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7137edca40a10e85d3116f62b5dbe6ffea35d9473164173af2dea55f1794397c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd0662699e43951e6e139dbe8bb44c36a0120144c90a7f21010cbf68a2abcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4a7a60b0336fb0e1a046f59f9d60cc55a056b70959c7e6a33b6b15b879bd76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b628c71ffc4577dac4247fca1780e229a260bd382075e7eeb15d7f71fa688c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://b628c71ffc4577dac4247fca1780e229a260bd382075e7eeb15d7f71fa688c27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:15Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.208648 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcba5218f1503f2b3776c66a92350381ee11aee043429d72c70b7ae63d7bb29f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63d557f6902338a7aa577f2bbee6a159369d62be9724425a6e6a355f08586601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:15Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.212618 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.212655 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.212664 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.212678 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.212689 4719 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:15Z","lastTransitionTime":"2025-10-09T15:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.228726 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c0cb44eacc810e970c6b32e259ae1841fb312f20576d34ac183089a91000337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2246a5642d4fa1b9e182af8a19980e6a76aea32cc9669e7d30185d6672435b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59a6c607affaa28a2c8af16a995f53baf008a1efd42061bb5e3c01b5acac636a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a911f9dd87ad57268bacc90fd4b3821f54d4ad91fcdde7066d3706aa8feb4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80fe00a302db3a637794464b7cccf806ad3fa8efbdaea15f965ea41276188d1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5228008f4bbd33c0b6ea86640368c02b6cdf301b43494a232b37fa73ea72e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da91bde07d2b150f89890e9c8e745bac9308b61aa757606dd242a48a1c24fd84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da91bde07d2b150f89890e9c8e745bac9308b61aa757606dd242a48a1c24fd84\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T15:19:04Z\\\",\\\"message\\\":\\\"2025-08-24T17:21:41Z]\\\\nI1009 15:19:04.075995 6407 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/community-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"d389393c-7ba9-422c-b3f5-06e391d537d2\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", 
ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/community-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.189\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1009 15:19:04.076046 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:19:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zv8jk_openshift-ovn-kubernetes(fea6a48c-769c-41bf-95ce-649cc31eb4e5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b32ef1116f7849b70aa3607bb4fc7b4bff9f58843c24742fc94aed9bb9a68e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9
692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:15Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.238808 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a62b8142b6b6fd0cf9028590f2abce788d8e381c2303d7a824dd055ab02b94db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:15Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.250790 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-58bdp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d00237ae-ca20-4202-8e24-e4988fbf5269\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-58bdp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:15Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:15 crc 
kubenswrapper[4719]: I1009 15:19:15.271446 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:15Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.283472 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:15Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.295143 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:15Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.308045 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:15Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.314833 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.314886 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.314894 4719 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.314907 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.314933 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:15Z","lastTransitionTime":"2025-10-09T15:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.319527 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vdgtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b565dc-6ccc-4404-95f7-c8cf09f91802\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c126340d
ff33c7a571fc152c4c8ed154e104aaab937ba7f68070763d79825b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6df96c88745808317300d950f2d991691695576773b7de02958ec718445cc3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vdgtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:15Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.334106 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40166218-2855-45ef-b0e1-0fed4e3e2fde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01dc78fd80a15fa8151128108a351c6af42928695fdd745dea50e08fae6570ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd
791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc62bf1b49b2a4b402b2fcca31f9fe1663b36f463a0722a5876b2ca2a8e023ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51b618cce898bc89b4b07b6f7fd73567d719ad9c9dc3a2a3959074bc2c2fe11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name
\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6f2af57f612bf33446a88a0a093adb3b64f562412d9a0bd03f3964c281ba4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 15:18:34.791014 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 15:18:34.791138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:34.792247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2043225324/tls.crt::/tmp/serving-cert-2043225324/tls.key\\\\\\\"\\\\nI1009 15:18:35.029901 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 15:18:35.033427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 15:18:35.033448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 15:18:35.033473 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 15:18:35.033481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 15:18:35.045206 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 15:18:35.045257 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045266 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045273 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 15:18:35.045277 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 15:18:35.045280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 15:18:35.045285 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1009 15:18:35.045414 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1009 15:18:35.048459 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8544f7060b0b2c2885dcbdffbd744be5f028d8df543732ba79eb7cd3911afca6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-0
9T15:18:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:15Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.345257 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ae19d921bad282d96efffc7f2f7cfdc4b70f95932e69b9955ad1439a936d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b3908283c24f180df8f6a04d52c46e7252cdfd
4f0587f7cccf3e9a0f37127a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:15Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.358372 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcd9a40e12b42902018ecf483e6b42dfa415e4d6e282fc57eacbf507922dbd45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://275fc
3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:15Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.370480 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:15Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.380714 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab52b5e80f5f2de90ce76b34b21de83b3880ed13436c566f2c460bed1908576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:15Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.390277 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2199f3e31d7adde6f0b1aaf29a7f3da80a45d8a1f11908fd93b47d737b00872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:15Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.417032 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.417060 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.417069 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.417083 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.417091 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:15Z","lastTransitionTime":"2025-10-09T15:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.519005 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.519070 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.519085 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.519104 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.519115 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:15Z","lastTransitionTime":"2025-10-09T15:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.620700 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.620740 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.620749 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.620978 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.620989 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:15Z","lastTransitionTime":"2025-10-09T15:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.723061 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.723090 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.723098 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.723113 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.723122 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:15Z","lastTransitionTime":"2025-10-09T15:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.825543 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.825572 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.825580 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.825593 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.825602 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:15Z","lastTransitionTime":"2025-10-09T15:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.927802 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.927851 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.927868 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.927892 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:15 crc kubenswrapper[4719]: I1009 15:19:15.927909 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:15Z","lastTransitionTime":"2025-10-09T15:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.030695 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.030768 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.030780 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.030822 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.030835 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:16Z","lastTransitionTime":"2025-10-09T15:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.133761 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.133834 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.133846 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.133870 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.133882 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:16Z","lastTransitionTime":"2025-10-09T15:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.160678 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:19:16 crc kubenswrapper[4719]: E1009 15:19:16.160823 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.236872 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.236922 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.236935 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.236954 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.236964 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:16Z","lastTransitionTime":"2025-10-09T15:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.340998 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.341067 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.341082 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.341103 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.341116 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:16Z","lastTransitionTime":"2025-10-09T15:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.443501 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.443548 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.443558 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.443575 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.443585 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:16Z","lastTransitionTime":"2025-10-09T15:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.547922 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.547979 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.547992 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.548014 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.548028 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:16Z","lastTransitionTime":"2025-10-09T15:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.651975 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.652053 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.652070 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.652100 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.652122 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:16Z","lastTransitionTime":"2025-10-09T15:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.755186 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.755227 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.755236 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.755250 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.755259 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:16Z","lastTransitionTime":"2025-10-09T15:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.857935 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.857961 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.857990 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.858006 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.858017 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:16Z","lastTransitionTime":"2025-10-09T15:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.960897 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.960967 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.960985 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.961013 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:16 crc kubenswrapper[4719]: I1009 15:19:16.961030 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:16Z","lastTransitionTime":"2025-10-09T15:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.065326 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.065434 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.065459 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.065493 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.065533 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:17Z","lastTransitionTime":"2025-10-09T15:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.160987 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:19:17 crc kubenswrapper[4719]: E1009 15:19:17.161212 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.161514 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:19:17 crc kubenswrapper[4719]: E1009 15:19:17.161587 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.161736 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:19:17 crc kubenswrapper[4719]: E1009 15:19:17.161926 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.174970 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.175016 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.175026 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.175046 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.175057 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:17Z","lastTransitionTime":"2025-10-09T15:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.277922 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.277961 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.277971 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.277989 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.278000 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:17Z","lastTransitionTime":"2025-10-09T15:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.379959 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.380016 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.380024 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.380038 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.380048 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:17Z","lastTransitionTime":"2025-10-09T15:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.487633 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.487674 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.487683 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.487696 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.487705 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:17Z","lastTransitionTime":"2025-10-09T15:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.590347 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.590431 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.590446 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.590480 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.590498 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:17Z","lastTransitionTime":"2025-10-09T15:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.693461 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.693748 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.693901 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.694041 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.694154 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:17Z","lastTransitionTime":"2025-10-09T15:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.797791 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.798324 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.798424 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.798490 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.798552 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:17Z","lastTransitionTime":"2025-10-09T15:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.901575 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.901675 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.901710 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.901748 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:17 crc kubenswrapper[4719]: I1009 15:19:17.901781 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:17Z","lastTransitionTime":"2025-10-09T15:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.004302 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.004375 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.004388 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.004405 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.004414 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:18Z","lastTransitionTime":"2025-10-09T15:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.107184 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.107480 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.107593 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.107661 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.107723 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:18Z","lastTransitionTime":"2025-10-09T15:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.160886 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:19:18 crc kubenswrapper[4719]: E1009 15:19:18.161023 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.210651 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.210685 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.210694 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.210708 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.210717 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:18Z","lastTransitionTime":"2025-10-09T15:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.312856 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.312914 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.312924 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.312938 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.312947 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:18Z","lastTransitionTime":"2025-10-09T15:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.415247 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.415289 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.415298 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.415311 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.415320 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:18Z","lastTransitionTime":"2025-10-09T15:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.517668 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.517712 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.517723 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.517775 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.517787 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:18Z","lastTransitionTime":"2025-10-09T15:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.619913 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.619965 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.619974 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.619987 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.619995 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:18Z","lastTransitionTime":"2025-10-09T15:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.722343 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.722420 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.722437 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.722454 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.722465 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:18Z","lastTransitionTime":"2025-10-09T15:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.828461 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.828533 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.828542 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.828556 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.828564 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:18Z","lastTransitionTime":"2025-10-09T15:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.930531 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.930564 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.930575 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.930589 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:18 crc kubenswrapper[4719]: I1009 15:19:18.930598 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:18Z","lastTransitionTime":"2025-10-09T15:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.032215 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.032253 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.032265 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.032279 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.032288 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:19Z","lastTransitionTime":"2025-10-09T15:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.134812 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.134850 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.134863 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.134880 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.134891 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:19Z","lastTransitionTime":"2025-10-09T15:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.160508 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.160546 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:19:19 crc kubenswrapper[4719]: E1009 15:19:19.160628 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.160661 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:19:19 crc kubenswrapper[4719]: E1009 15:19:19.160723 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:19:19 crc kubenswrapper[4719]: E1009 15:19:19.160792 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.161624 4719 scope.go:117] "RemoveContainer" containerID="da91bde07d2b150f89890e9c8e745bac9308b61aa757606dd242a48a1c24fd84" Oct 09 15:19:19 crc kubenswrapper[4719]: E1009 15:19:19.161882 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-zv8jk_openshift-ovn-kubernetes(fea6a48c-769c-41bf-95ce-649cc31eb4e5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.237959 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.237994 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.238002 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.238016 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.238025 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:19Z","lastTransitionTime":"2025-10-09T15:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.339785 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.339809 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.339817 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.339829 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.339837 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:19Z","lastTransitionTime":"2025-10-09T15:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.442061 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.442096 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.442106 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.442121 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.442135 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:19Z","lastTransitionTime":"2025-10-09T15:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.544757 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.544791 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.544802 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.544886 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.544896 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:19Z","lastTransitionTime":"2025-10-09T15:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.647281 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.647317 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.647326 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.647342 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.647371 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:19Z","lastTransitionTime":"2025-10-09T15:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.749094 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.749128 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.749137 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.749150 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.749158 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:19Z","lastTransitionTime":"2025-10-09T15:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.851336 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.851394 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.851405 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.851420 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.851433 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:19Z","lastTransitionTime":"2025-10-09T15:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.957196 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.957229 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.957237 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.957252 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:19 crc kubenswrapper[4719]: I1009 15:19:19.957261 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:19Z","lastTransitionTime":"2025-10-09T15:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.059887 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.059929 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.059937 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.059951 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.059960 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:20Z","lastTransitionTime":"2025-10-09T15:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.160600 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:19:20 crc kubenswrapper[4719]: E1009 15:19:20.160732 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.161514 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.161558 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.161570 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.161587 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.161598 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:20Z","lastTransitionTime":"2025-10-09T15:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.263327 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.263392 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.263404 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.263422 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.263434 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:20Z","lastTransitionTime":"2025-10-09T15:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.365555 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.365596 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.365605 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.365620 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.365630 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:20Z","lastTransitionTime":"2025-10-09T15:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.468158 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.468209 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.468222 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.468239 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.468250 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:20Z","lastTransitionTime":"2025-10-09T15:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.529941 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d00237ae-ca20-4202-8e24-e4988fbf5269-metrics-certs\") pod \"network-metrics-daemon-58bdp\" (UID: \"d00237ae-ca20-4202-8e24-e4988fbf5269\") " pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:19:20 crc kubenswrapper[4719]: E1009 15:19:20.530074 4719 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 15:19:20 crc kubenswrapper[4719]: E1009 15:19:20.530132 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d00237ae-ca20-4202-8e24-e4988fbf5269-metrics-certs podName:d00237ae-ca20-4202-8e24-e4988fbf5269 nodeName:}" failed. No retries permitted until 2025-10-09 15:19:52.530114089 +0000 UTC m=+98.039825374 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d00237ae-ca20-4202-8e24-e4988fbf5269-metrics-certs") pod "network-metrics-daemon-58bdp" (UID: "d00237ae-ca20-4202-8e24-e4988fbf5269") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.570077 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.570108 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.570117 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.570130 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.570140 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:20Z","lastTransitionTime":"2025-10-09T15:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.672504 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.672556 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.672595 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.672609 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.672617 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:20Z","lastTransitionTime":"2025-10-09T15:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.774505 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.774542 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.774554 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.774575 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.774589 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:20Z","lastTransitionTime":"2025-10-09T15:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.876416 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.876450 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.876459 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.876472 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.876481 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:20Z","lastTransitionTime":"2025-10-09T15:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.978711 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.978734 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.978742 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.978755 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:20 crc kubenswrapper[4719]: I1009 15:19:20.978764 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:20Z","lastTransitionTime":"2025-10-09T15:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.080780 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.080821 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.080829 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.080843 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.080852 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:21Z","lastTransitionTime":"2025-10-09T15:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.161082 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.161104 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.161119 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:19:21 crc kubenswrapper[4719]: E1009 15:19:21.161206 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:19:21 crc kubenswrapper[4719]: E1009 15:19:21.161299 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:19:21 crc kubenswrapper[4719]: E1009 15:19:21.161401 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.182955 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.182987 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.182995 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.183011 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.183020 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:21Z","lastTransitionTime":"2025-10-09T15:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.285529 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.285571 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.285580 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.285597 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.285606 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:21Z","lastTransitionTime":"2025-10-09T15:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.387718 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.387756 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.387768 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.387783 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.387796 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:21Z","lastTransitionTime":"2025-10-09T15:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.490377 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.490418 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.490428 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.490444 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.490455 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:21Z","lastTransitionTime":"2025-10-09T15:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.592985 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.593034 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.593042 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.593055 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.593065 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:21Z","lastTransitionTime":"2025-10-09T15:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.695549 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.695586 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.695594 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.695608 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.695616 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:21Z","lastTransitionTime":"2025-10-09T15:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.798232 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.798276 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.798287 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.798303 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.798314 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:21Z","lastTransitionTime":"2025-10-09T15:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.900780 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.900812 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.900826 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.900842 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:21 crc kubenswrapper[4719]: I1009 15:19:21.900854 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:21Z","lastTransitionTime":"2025-10-09T15:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.003002 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.003027 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.003035 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.003048 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.003059 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:22Z","lastTransitionTime":"2025-10-09T15:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.105202 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.105250 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.105260 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.105276 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.105287 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:22Z","lastTransitionTime":"2025-10-09T15:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.115681 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.115732 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.115748 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.115769 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.115783 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:22Z","lastTransitionTime":"2025-10-09T15:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:22 crc kubenswrapper[4719]: E1009 15:19:22.129549 4719 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d273987-9d8a-4a77-9956-ccb64e9e22c3\\\",\\\"systemUUID\\\":\\\"d18dc188-15d4-4547-94df-d9149082a3a0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:22Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.132899 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.132932 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.132942 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.132960 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.132971 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:22Z","lastTransitionTime":"2025-10-09T15:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:22 crc kubenswrapper[4719]: E1009 15:19:22.146619 4719 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d273987-9d8a-4a77-9956-ccb64e9e22c3\\\",\\\"systemUUID\\\":\\\"d18dc188-15d4-4547-94df-d9149082a3a0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:22Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.149684 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.149719 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.149730 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.149745 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.149758 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:22Z","lastTransitionTime":"2025-10-09T15:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.161257 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:19:22 crc kubenswrapper[4719]: E1009 15:19:22.161415 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:19:22 crc kubenswrapper[4719]: E1009 15:19:22.161308 4719 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d273987-9d8a-4a77-9956-ccb64e9e22c3\\\",\\\"systemUUID\\\":\\\"d18dc188-15d4-4547-94df-d9149082a3a0\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:22Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.165367 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.165401 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.165413 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.165427 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.165438 4719 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:22Z","lastTransitionTime":"2025-10-09T15:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:22 crc kubenswrapper[4719]: E1009 15:19:22.176971 4719 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d273987-9d8a-4a77-9956-ccb64e9e22c3\\\",\\\"systemUUID\\\":\\\"d18dc188-15d4-4547-94df-d9149082a3a0\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:22Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.180207 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.180236 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.180244 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.180258 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.180267 4719 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:22Z","lastTransitionTime":"2025-10-09T15:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:22 crc kubenswrapper[4719]: E1009 15:19:22.191270 4719 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d273987-9d8a-4a77-9956-ccb64e9e22c3\\\",\\\"systemUUID\\\":\\\"d18dc188-15d4-4547-94df-d9149082a3a0\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:22Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:22 crc kubenswrapper[4719]: E1009 15:19:22.191405 4719 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.207831 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.207999 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.208094 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.208199 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:22 crc 
kubenswrapper[4719]: I1009 15:19:22.208286 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:22Z","lastTransitionTime":"2025-10-09T15:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.311209 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.311623 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.311748 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.311881 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.312026 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:22Z","lastTransitionTime":"2025-10-09T15:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.414163 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.414189 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.414198 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.414211 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.414219 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:22Z","lastTransitionTime":"2025-10-09T15:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.516667 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.516711 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.516722 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.516738 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.516749 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:22Z","lastTransitionTime":"2025-10-09T15:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.553395 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kmbvp_6a7f4c67-0335-4c58-896a-b3059d9a9a3f/kube-multus/0.log" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.553445 4719 generic.go:334] "Generic (PLEG): container finished" podID="6a7f4c67-0335-4c58-896a-b3059d9a9a3f" containerID="11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783" exitCode=1 Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.553481 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kmbvp" event={"ID":"6a7f4c67-0335-4c58-896a-b3059d9a9a3f","Type":"ContainerDied","Data":"11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783"} Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.553852 4719 scope.go:117] "RemoveContainer" containerID="11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.568483 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"834f7996-d1ce-470d-a1a5-0de5da2460d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a1b9cee40ae4a30df34bde2f4dd9436cf3ff915293ea1e1431e8abd581423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37949ed51a379d34fab6bf766fd7e35d376af137b55b6f12e8bef8495ab5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d165d88c0d88fb4b080bf594e5258fb74f33c521332c85bb9f5ef5b5d9fdab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed935aaa4f5122234731f8c22ec3d4ffeba8b500bfb51bf97414f39438da2f68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:22Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.584187 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd31818-5445-47d9-af8f-fa49dde2a7ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7137edca40a10e85d3116f62b5dbe6ffea35d9473164173af2dea55f1794397c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd0662699e43951e6e139dbe8bb44c36a0120144c90a7f21010cbf68a2abcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4a7a60b0336fb0e1a046f59f9d60cc55a056b70959c7e6a33b6b15b879bd76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b628c71ffc4577dac4247fca1780e229a260bd382075e7eeb15d7f71fa688c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://b628c71ffc4577dac4247fca1780e229a260bd382075e7eeb15d7f71fa688c27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:22Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.596081 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcba5218f1503f2b3776c66a92350381ee11aee043429d72c70b7ae63d7bb29f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63d557f6902338a7aa577f2bbee6a159369d62be9724425a6e6a355f08586601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:22Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.614380 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c0cb44eacc810e970c6b32e259ae1841fb312f20576d34ac183089a91000337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2246a5642d4fa1b9e182af8a19980e6a76aea32cc9669e7d30185d6672435b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59a6c607affaa28a2c8af16a995f53baf008a1efd42061bb5e3c01b5acac636a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a911f9dd87ad57268bacc90fd4b3821f54d4ad91fcdde7066d3706aa8feb4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80fe00a302db3a637794464b7cccf806ad3fa8efbdaea15f965ea41276188d1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5228008f4bbd33c0b6ea86640368c02b6cdf301b43494a232b37fa73ea72e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da91bde07d2b150f89890e9c8e745bac9308b61aa757606dd242a48a1c24fd84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da91bde07d2b150f89890e9c8e745bac9308b61aa757606dd242a48a1c24fd84\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T15:19:04Z\\\",\\\"message\\\":\\\"2025-08-24T17:21:41Z]\\\\nI1009 15:19:04.075995 6407 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/community-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"d389393c-7ba9-422c-b3f5-06e391d537d2\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", 
ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/community-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.189\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1009 15:19:04.076046 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:19:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zv8jk_openshift-ovn-kubernetes(fea6a48c-769c-41bf-95ce-649cc31eb4e5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b32ef1116f7849b70aa3607bb4fc7b4bff9f58843c24742fc94aed9bb9a68e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9
692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:22Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.618095 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.618127 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.618134 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.618147 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.618156 4719 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:22Z","lastTransitionTime":"2025-10-09T15:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.624374 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mtpbz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb84e765-e2c6-410b-9681-7c14d88a2537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be972d47f7ee97f2f54daa73198a83327281f9e9b2b1500205a17cf11518989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfpkf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mtpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:22Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.642367 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:22Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.653455 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:22Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.665165 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:22Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.677883 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:22Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.690039 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a62b8142b6b6fd0cf9028590f2abce788d8e381c2303d7a824dd055ab02b94db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:22Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.703297 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-58bdp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d00237ae-ca20-4202-8e24-e4988fbf5269\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-58bdp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:22Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:22 crc 
kubenswrapper[4719]: I1009 15:19:22.717464 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40166218-2855-45ef-b0e1-0fed4e3e2fde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01dc78fd80a15fa8151128108a351c6af42928695fdd745dea50e08fae6570ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc62bf1b49b2a4
b402b2fcca31f9fe1663b36f463a0722a5876b2ca2a8e023ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51b618cce898bc89b4b07b6f7fd73567d719ad9c9dc3a2a3959074bc2c2fe11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6f2af57f612bf33446a88a0a093adb3b64f562412d9a0bd03f3964c281ba4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 15:18:34.791014 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 15:18:34.791138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:34.792247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2043225324/tls.crt::/tmp/serving-cert-2043225324/tls.key\\\\\\\"\\\\nI1009 15:18:35.029901 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 15:18:35.033427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 15:18:35.033448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 15:18:35.033473 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 15:18:35.033481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 15:18:35.045206 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 15:18:35.045257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045266 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045273 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 15:18:35.045277 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 15:18:35.045280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 
15:18:35.045285 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1009 15:18:35.045414 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1009 15:18:35.048459 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8544f7060b0b2c2885dcbdffbd744be5f028d8df543732ba79eb7cd3911afca6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:22Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.720157 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.720198 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.720210 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.720226 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.720238 4719 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:22Z","lastTransitionTime":"2025-10-09T15:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.728906 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ae19d921bad282d96efffc7f2f7cfdc4b70f95932e69b9955ad1439a936d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b3908283c24f180df8f6a04d52c46e7252cdfd4f0587f7cccf3e9a0f37127a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-09T15:19:22Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.744045 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcd9a40e12b42902018ecf483e6b42dfa415e4d6e282fc57eacbf507922dbd45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:22Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.757019 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"message\\\":\\\"2025-10-09T15:18:36+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6cb1650a-7cd1-47ae-8a33-6737992732d4\\\\n2025-10-09T15:18:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6cb1650a-7cd1-47ae-8a33-6737992732d4 to /host/opt/cni/bin/\\\\n2025-10-09T15:18:37Z [verbose] multus-daemon started\\\\n2025-10-09T15:18:37Z [verbose] Readiness Indicator file check\\\\n2025-10-09T15:19:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:22Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.769422 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vdgtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b565dc-6ccc-4404-95f7-c8cf09f91802\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c126340dff33c7a571fc152c4c8ed154e104aaab937ba7f68070763d79825b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6df96c88745808317300d950f2d991691695576773b7de02958ec718445cc3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vdgtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:22Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.781427 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab52b5e80f5f2de90ce76b34b21de83b3880ed13436c566f2c460bed1908576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:22Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.793793 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2199f3e31d7adde6f0b1aaf29a7f3da80a45d8a1f11908fd93b47d737b00872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:22Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.822387 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.822423 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.822433 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.822449 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.822459 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:22Z","lastTransitionTime":"2025-10-09T15:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.924646 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.924679 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.924690 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.924707 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:22 crc kubenswrapper[4719]: I1009 15:19:22.924717 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:22Z","lastTransitionTime":"2025-10-09T15:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.026969 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.026995 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.027004 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.027017 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.027026 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:23Z","lastTransitionTime":"2025-10-09T15:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.132548 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.132585 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.132594 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.132609 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.132621 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:23Z","lastTransitionTime":"2025-10-09T15:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.160164 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.160222 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.160180 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:19:23 crc kubenswrapper[4719]: E1009 15:19:23.160294 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:19:23 crc kubenswrapper[4719]: E1009 15:19:23.160463 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:19:23 crc kubenswrapper[4719]: E1009 15:19:23.160551 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.235338 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.236021 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.236088 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.236151 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.236214 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:23Z","lastTransitionTime":"2025-10-09T15:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.338903 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.338933 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.338941 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.338955 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.338964 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:23Z","lastTransitionTime":"2025-10-09T15:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.440921 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.440969 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.440982 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.441000 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.441010 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:23Z","lastTransitionTime":"2025-10-09T15:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.543108 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.543136 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.543144 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.543160 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.543170 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:23Z","lastTransitionTime":"2025-10-09T15:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.556905 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kmbvp_6a7f4c67-0335-4c58-896a-b3059d9a9a3f/kube-multus/0.log" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.556965 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kmbvp" event={"ID":"6a7f4c67-0335-4c58-896a-b3059d9a9a3f","Type":"ContainerStarted","Data":"201751e1a01c1fefb61309835c66a89743c507dff1e0d6e75a5ecf3447831840"} Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.570521 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40166218-2855-45ef-b0e1-0fed4e3e2fde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01dc78fd80a15fa8151128108a351c6af42928695fdd745dea50e08fae6570ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8
b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc62bf1b49b2a4b402b2fcca31f9fe1663b36f463a0722a5876b2ca2a8e023ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51b618cce898bc89b4b07b6f7fd73567d719ad9c9dc3a2a3959074bc2c2fe11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resou
rces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6f2af57f612bf33446a88a0a093adb3b64f562412d9a0bd03f3964c281ba4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 15:18:34.791014 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 15:18:34.791138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:34.792247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2043225324/tls.crt::/tmp/serving-cert-2043225324/tls.key\\\\\\\"\\\\nI1009 15:18:35.029901 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 15:18:35.033427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 15:18:35.033448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 15:18:35.033473 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 15:18:35.033481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 15:18:35.045206 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 15:18:35.045257 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045266 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045273 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 15:18:35.045277 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 15:18:35.045280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 15:18:35.045285 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1009 15:18:35.045414 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1009 15:18:35.048459 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8544f7060b0b2c2885dcbdffbd744be5f028d8df543732ba79eb7cd3911afca6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:23Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.581534 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ae19d921bad282d96efffc7f2f7cfdc4b70f95932e69b9955ad1439a936d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b3908283c24f180df8f6a04d52c46e7252cdfd
4f0587f7cccf3e9a0f37127a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:23Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.594945 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcd9a40e12b42902018ecf483e6b42dfa415e4d6e282fc57eacbf507922dbd45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://275fc
3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:23Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.607118 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201751e1a01c1fefb61309835c66a89743c507dff1e0d6e75a5ecf3447831840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2025-10-09T15:19:22Z\\\",\\\"message\\\":\\\"2025-10-09T15:18:36+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6cb1650a-7cd1-47ae-8a33-6737992732d4\\\\n2025-10-09T15:18:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6cb1650a-7cd1-47ae-8a33-6737992732d4 to /host/opt/cni/bin/\\\\n2025-10-09T15:18:37Z [verbose] multus-daemon started\\\\n2025-10-09T15:18:37Z [verbose] Readiness Indicator file check\\\\n2025-10-09T15:19:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:23Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.616857 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vdgtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b565dc-6ccc-4404-95f7-c8cf09f91802\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c126340dff33c7a571fc152c4c8ed154e104aaab937ba7f68070763d79825b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6df96c88745808317300d950f2d9916916955
76773b7de02958ec718445cc3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vdgtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:23Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.628720 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab52b5e80f5f2de90ce76b34b21de83b3880ed13436c566f2c460bed1908576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:23Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.639402 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2199f3e31d7adde6f0b1aaf29a7f3da80a45d8a1f11908fd93b47d737b00872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:23Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.645423 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.645462 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.645476 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.645493 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.645526 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:23Z","lastTransitionTime":"2025-10-09T15:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.650247 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834f7996-d1ce-470d-a1a5-0de5da2460d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a1b9cee40ae4a30df34bde2f4dd9436cf3ff915293ea1e1431e8abd581423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37949ed51a
379d34fab6bf766fd7e35d376af137b55b6f12e8bef8495ab5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d165d88c0d88fb4b080bf594e5258fb74f33c521332c85bb9f5ef5b5d9fdab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed935aaa4f5122234731f8c22ec3d4ffeba8b500bfb51bf97414f39438da2f68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:23Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.661274 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd31818-5445-47d9-af8f-fa49dde2a7ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7137edca40a10e85d3116f62b5dbe6ffea35d9473164173af2dea55f1794397c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd0662699e43951e6e139dbe8bb44c36a0120144c90a7f21010cbf68a2abcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4a7a60b0336fb0e1a046f59f9d60cc55a056b70959c7e6a33b6b15b879bd76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b628c71ffc4577dac4247fca1780e229a260bd382075e7eeb15d7f71fa688c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://b628c71ffc4577dac4247fca1780e229a260bd382075e7eeb15d7f71fa688c27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:23Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.672130 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcba5218f1503f2b3776c66a92350381ee11aee043429d72c70b7ae63d7bb29f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63d557f6902338a7aa577f2bbee6a159369d62be9724425a6e6a355f08586601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:23Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.688019 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c0cb44eacc810e970c6b32e259ae1841fb312f20576d34ac183089a91000337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2246a5642d4fa1b9e182af8a19980e6a76aea32cc9669e7d30185d6672435b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59a6c607affaa28a2c8af16a995f53baf008a1efd42061bb5e3c01b5acac636a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a911f9dd87ad57268bacc90fd4b3821f54d4ad91fcdde7066d3706aa8feb4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80fe00a302db3a637794464b7cccf806ad3fa8efbdaea15f965ea41276188d1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5228008f4bbd33c0b6ea86640368c02b6cdf301b43494a232b37fa73ea72e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da91bde07d2b150f89890e9c8e745bac9308b61aa757606dd242a48a1c24fd84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da91bde07d2b150f89890e9c8e745bac9308b61aa757606dd242a48a1c24fd84\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T15:19:04Z\\\",\\\"message\\\":\\\"2025-08-24T17:21:41Z]\\\\nI1009 15:19:04.075995 6407 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/community-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"d389393c-7ba9-422c-b3f5-06e391d537d2\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", 
ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/community-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.189\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1009 15:19:04.076046 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:19:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zv8jk_openshift-ovn-kubernetes(fea6a48c-769c-41bf-95ce-649cc31eb4e5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b32ef1116f7849b70aa3607bb4fc7b4bff9f58843c24742fc94aed9bb9a68e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9
692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:23Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.697419 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mtpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb84e765-e2c6-410b-9681-7c14d88a2537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be972d47f7ee97f2f54daa73198a83327281f9e9b2b1500205a17cf11518989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfpkf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mtpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:23Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.714060 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15
:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:23Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.748076 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.748136 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.748145 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.748160 4719 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.748170 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:23Z","lastTransitionTime":"2025-10-09T15:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.759093 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:23Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.775821 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:23Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.788964 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:23Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.799196 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a62b8142b6b6fd0cf9028590f2abce788d8e381c2303d7a824dd055ab02b94db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:23Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.808513 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-58bdp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d00237ae-ca20-4202-8e24-e4988fbf5269\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-58bdp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:23Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:23 crc 
kubenswrapper[4719]: I1009 15:19:23.850834 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.850874 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.850883 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.850897 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.850906 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:23Z","lastTransitionTime":"2025-10-09T15:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.952972 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.953019 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.953030 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.953050 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:23 crc kubenswrapper[4719]: I1009 15:19:23.953060 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:23Z","lastTransitionTime":"2025-10-09T15:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.054959 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.054992 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.055001 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.055014 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.055023 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:24Z","lastTransitionTime":"2025-10-09T15:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.156662 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.156695 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.156706 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.156722 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.156732 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:24Z","lastTransitionTime":"2025-10-09T15:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.161076 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:19:24 crc kubenswrapper[4719]: E1009 15:19:24.161315 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.170961 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.258954 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.258990 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.258999 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.259012 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.259021 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:24Z","lastTransitionTime":"2025-10-09T15:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.360543 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.360582 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.360592 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.360606 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.360615 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:24Z","lastTransitionTime":"2025-10-09T15:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.462558 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.462616 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.462628 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.462645 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.462656 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:24Z","lastTransitionTime":"2025-10-09T15:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.564140 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.564178 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.564186 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.564201 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.564211 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:24Z","lastTransitionTime":"2025-10-09T15:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.666807 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.666849 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.666861 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.666878 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.666890 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:24Z","lastTransitionTime":"2025-10-09T15:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.769428 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.769479 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.769509 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.769537 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.769549 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:24Z","lastTransitionTime":"2025-10-09T15:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.871199 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.871238 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.871247 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.871260 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.871269 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:24Z","lastTransitionTime":"2025-10-09T15:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.973246 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.973278 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.973287 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.973299 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:24 crc kubenswrapper[4719]: I1009 15:19:24.973309 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:24Z","lastTransitionTime":"2025-10-09T15:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.075635 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.075682 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.075690 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.075705 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.075714 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:25Z","lastTransitionTime":"2025-10-09T15:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.160463 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.160508 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.160496 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:19:25 crc kubenswrapper[4719]: E1009 15:19:25.160600 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:19:25 crc kubenswrapper[4719]: E1009 15:19:25.160701 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:19:25 crc kubenswrapper[4719]: E1009 15:19:25.160770 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.176616 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201751e1a01c1fefb61309835c66a89743c507dff1e0d6e75a5ecf3447831840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"message\\\":\\\"2025-10-09T15:18:36+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_6cb1650a-7cd1-47ae-8a33-6737992732d4\\\\n2025-10-09T15:18:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6cb1650a-7cd1-47ae-8a33-6737992732d4 to /host/opt/cni/bin/\\\\n2025-10-09T15:18:37Z [verbose] multus-daemon started\\\\n2025-10-09T15:18:37Z [verbose] Readiness Indicator file check\\\\n2025-10-09T15:19:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:25Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.178470 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.178496 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.178505 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.178518 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.178527 4719 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:25Z","lastTransitionTime":"2025-10-09T15:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.192708 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vdgtp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b565dc-6ccc-4404-95f7-c8cf09f91802\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c126340dff33c7a571fc152c4c8ed154e104aaab937ba7f68070763d79825b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6df96c88745808317300d950f2d991691695576773b7de02958ec718445cc3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vdgtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:25Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.211477 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40166218-2855-45ef-b0e1-0fed4e3e2fde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01dc78fd80a15fa8151128108a351c6af42928695fdd745dea50e08fae6570ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc62bf1b49b2a4b402b2fcca31f9fe1663b36f463a0722a5876b2ca2a8e023ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51b618cce898bc89b4b07b6f7fd73567d719ad9c9dc3a2a3959074bc2c2fe11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6f2af57f612bf33446a88a0a093adb3b64f562412d9a0bd03f3964c281ba4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f894
5c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 15:18:34.791014 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 15:18:34.791138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:34.792247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2043225324/tls.crt::/tmp/serving-cert-2043225324/tls.key\\\\\\\"\\\\nI1009 15:18:35.029901 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 15:18:35.033427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 15:18:35.033448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 15:18:35.033473 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 15:18:35.033481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 15:18:35.045206 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 15:18:35.045257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045266 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045273 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 15:18:35.045277 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 15:18:35.045280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 15:18:35.045285 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1009 15:18:35.045414 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1009 15:18:35.048459 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8544f7060b0b2c2885dcbdffbd744be5f028d8df543732ba79eb7cd3911afca6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:25Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.225078 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ae19d921bad282d96efffc7f2f7cfdc4b70f95932e69b9955ad1439a936d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b3908283c24f180df8f6a04d52c46e7252cdfd
4f0587f7cccf3e9a0f37127a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:25Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.238987 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcd9a40e12b42902018ecf483e6b42dfa415e4d6e282fc57eacbf507922dbd45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://275fc
3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:25Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.250593 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1789dfc1-aa86-4e27-ae75-b5078112f7fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff1066e8910a7aa889e7cc5c7b2735a240197a60b66c9471b4fda297dba4176f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a73e5700d7dbee6fac767db433a82521b7af9241107369d3be4aa00593128763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a73e5700d7dbee6fac767db433a82521b7af9241107369d3be4aa00593128763\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:25Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.266303 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab52b5e80f5f2de90ce76b34b21de83b3880ed13436c566f2c460bed1908576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:25Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.279237 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2199f3e31d7adde6f0b1aaf29a7f3da80a45d8a1f11908fd93b47d737b00872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:25Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.280489 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.280540 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.280550 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.280564 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.280574 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:25Z","lastTransitionTime":"2025-10-09T15:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.296321 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c0cb44eacc810e970c6b32e259ae1841fb312f20576d34ac183089a91000337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2246a5642d4fa1b9e182af8a19980e6a76aea32cc9669e7d30185d6672435b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59a6c607affaa28a2c8af16a995f53baf008a1efd42061bb5e3c01b5acac636a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a911f9dd87ad57268bacc90fd4b3821f54d4ad91fcdde7066d3706aa8feb4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80fe00a302db3a637794464b7cccf806ad3fa8efbdaea15f965ea41276188d1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5228008f4bbd33c0b6ea86640368c02b6cdf301b43494a232b37fa73ea72e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da91bde07d2b150f89890e9c8e745bac9308b61aa757606dd242a48a1c24fd84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da91bde07d2b150f89890e9c8e745bac9308b61aa757606dd242a48a1c24fd84\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T15:19:04Z\\\",\\\"message\\\":\\\"2025-08-24T17:21:41Z]\\\\nI1009 15:19:04.075995 6407 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/community-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"d389393c-7ba9-422c-b3f5-06e391d537d2\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", 
ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/community-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.189\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1009 15:19:04.076046 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:19:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zv8jk_openshift-ovn-kubernetes(fea6a48c-769c-41bf-95ce-649cc31eb4e5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b32ef1116f7849b70aa3607bb4fc7b4bff9f58843c24742fc94aed9bb9a68e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9
692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:25Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.304787 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mtpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb84e765-e2c6-410b-9681-7c14d88a2537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be972d47f7ee97f2f54daa73198a83327281f9e9b2b1500205a17cf11518989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfpkf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mtpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:25Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.314517 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834f7996-d1ce-470d-a1a5-0de5da2460d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a1b9cee40ae4a30df34bde2f4dd9436cf3ff915293ea1e1431e8abd581423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37949ed51a379d34fab6bf766fd7e35d376af137b55b6f12e8bef8495ab5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d165d88c0d88fb4b080bf594e5258fb74f33c521332c85bb9f5ef5b5d9fdab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed935aaa4f5122234731f8c22ec3d4ffeba8b500bfb51bf97414f39438da2f68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:25Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.324669 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd31818-5445-47d9-af8f-fa49dde2a7ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7137edca40a10e85d3116f62b5dbe6ffea35d9473164173af2dea55f1794397c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd0662699e43951e6e139dbe8bb44c36a0120144c90a7f21010cbf68a2abcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4a7a60b0336fb0e1a046f59f9d60cc55a056b70959c7e6a33b6b15b879bd76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b628c71ffc4577dac4247fca1780e229a260bd382075e7eeb15d7f71fa688c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://b628c71ffc4577dac4247fca1780e229a260bd382075e7eeb15d7f71fa688c27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:25Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.336810 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcba5218f1503f2b3776c66a92350381ee11aee043429d72c70b7ae63d7bb29f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63d557f6902338a7aa577f2bbee6a159369d62be9724425a6e6a355f08586601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:25Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.349135 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:25Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.358725 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a62b8142b6b6fd0cf9028590f2abce788d8e381c2303d7a824dd055ab02b94db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:25Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.368662 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-58bdp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d00237ae-ca20-4202-8e24-e4988fbf5269\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-58bdp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:25Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:25 crc 
kubenswrapper[4719]: I1009 15:19:25.382288 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.382320 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.382329 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.382345 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.382382 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:25Z","lastTransitionTime":"2025-10-09T15:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.389714 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:25Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.402820 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:25Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.413819 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:25Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.485412 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.485447 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.485456 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.485470 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.485478 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:25Z","lastTransitionTime":"2025-10-09T15:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.587748 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.587787 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.587798 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.587812 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.587823 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:25Z","lastTransitionTime":"2025-10-09T15:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.690392 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.690459 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.690473 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.690492 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.690504 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:25Z","lastTransitionTime":"2025-10-09T15:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.792440 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.792484 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.792493 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.792507 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.792518 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:25Z","lastTransitionTime":"2025-10-09T15:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.894399 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.894432 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.894440 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.894453 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.894463 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:25Z","lastTransitionTime":"2025-10-09T15:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.996609 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.996710 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.996830 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.996846 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:25 crc kubenswrapper[4719]: I1009 15:19:25.996855 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:25Z","lastTransitionTime":"2025-10-09T15:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.099582 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.099634 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.099648 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.099674 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.099689 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:26Z","lastTransitionTime":"2025-10-09T15:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.160132 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:19:26 crc kubenswrapper[4719]: E1009 15:19:26.160301 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.202462 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.202502 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.202513 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.202528 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.202539 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:26Z","lastTransitionTime":"2025-10-09T15:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.304870 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.304909 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.304918 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.304933 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.304942 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:26Z","lastTransitionTime":"2025-10-09T15:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.407247 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.407283 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.407293 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.407307 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.407315 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:26Z","lastTransitionTime":"2025-10-09T15:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.509916 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.509964 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.509978 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.509996 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.510008 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:26Z","lastTransitionTime":"2025-10-09T15:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.612338 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.612484 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.612499 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.612515 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.612528 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:26Z","lastTransitionTime":"2025-10-09T15:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.715497 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.715578 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.715593 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.715613 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.715627 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:26Z","lastTransitionTime":"2025-10-09T15:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.818450 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.818488 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.818498 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.818512 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.818522 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:26Z","lastTransitionTime":"2025-10-09T15:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.921014 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.921373 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.921383 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.921400 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:26 crc kubenswrapper[4719]: I1009 15:19:26.921410 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:26Z","lastTransitionTime":"2025-10-09T15:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.023609 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.023664 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.023675 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.023691 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.023703 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:27Z","lastTransitionTime":"2025-10-09T15:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.126146 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.126206 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.126219 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.126237 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.126249 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:27Z","lastTransitionTime":"2025-10-09T15:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.160781 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.160817 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.160820 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:19:27 crc kubenswrapper[4719]: E1009 15:19:27.160926 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:19:27 crc kubenswrapper[4719]: E1009 15:19:27.160988 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:19:27 crc kubenswrapper[4719]: E1009 15:19:27.161082 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.228424 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.228464 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.228476 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.228491 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.228503 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:27Z","lastTransitionTime":"2025-10-09T15:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.330695 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.330739 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.330753 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.330772 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.330784 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:27Z","lastTransitionTime":"2025-10-09T15:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.463733 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.463773 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.463784 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.463800 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.463810 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:27Z","lastTransitionTime":"2025-10-09T15:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.566870 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.566910 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.566918 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.566935 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.566946 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:27Z","lastTransitionTime":"2025-10-09T15:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.669605 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.669651 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.669669 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.669684 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.669693 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:27Z","lastTransitionTime":"2025-10-09T15:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.772221 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.772280 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.772292 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.772308 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.772320 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:27Z","lastTransitionTime":"2025-10-09T15:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.875272 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.875339 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.875385 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.875403 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.875415 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:27Z","lastTransitionTime":"2025-10-09T15:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.977723 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.977780 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.977794 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.977807 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:27 crc kubenswrapper[4719]: I1009 15:19:27.977819 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:27Z","lastTransitionTime":"2025-10-09T15:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.080330 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.080436 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.080484 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.080512 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.080531 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:28Z","lastTransitionTime":"2025-10-09T15:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.160738 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:19:28 crc kubenswrapper[4719]: E1009 15:19:28.160913 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.182524 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.182562 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.182571 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.182585 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.182594 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:28Z","lastTransitionTime":"2025-10-09T15:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.284967 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.285013 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.285028 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.285047 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.285058 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:28Z","lastTransitionTime":"2025-10-09T15:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.387720 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.387758 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.387774 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.387792 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.387804 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:28Z","lastTransitionTime":"2025-10-09T15:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.490233 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.490274 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.490285 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.490300 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.490310 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:28Z","lastTransitionTime":"2025-10-09T15:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.594331 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.594376 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.594385 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.594397 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.594407 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:28Z","lastTransitionTime":"2025-10-09T15:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.697161 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.697207 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.697219 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.697237 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.697248 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:28Z","lastTransitionTime":"2025-10-09T15:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.800062 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.800102 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.800110 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.800124 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.800132 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:28Z","lastTransitionTime":"2025-10-09T15:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.902315 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.902363 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.902376 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.902392 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:28 crc kubenswrapper[4719]: I1009 15:19:28.902402 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:28Z","lastTransitionTime":"2025-10-09T15:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.004472 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.004526 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.004542 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.004564 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.004581 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:29Z","lastTransitionTime":"2025-10-09T15:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.106382 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.106420 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.106434 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.106448 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.106458 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:29Z","lastTransitionTime":"2025-10-09T15:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.160297 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.160408 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:19:29 crc kubenswrapper[4719]: E1009 15:19:29.160598 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:19:29 crc kubenswrapper[4719]: E1009 15:19:29.160640 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.160777 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:19:29 crc kubenswrapper[4719]: E1009 15:19:29.160960 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.208285 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.208332 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.208345 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.208381 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.208389 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:29Z","lastTransitionTime":"2025-10-09T15:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.310707 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.310941 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.311038 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.311164 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.311260 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:29Z","lastTransitionTime":"2025-10-09T15:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.413765 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.413797 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.413807 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.413822 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.413833 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:29Z","lastTransitionTime":"2025-10-09T15:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.516281 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.516317 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.516328 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.516373 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.516383 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:29Z","lastTransitionTime":"2025-10-09T15:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.618614 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.618648 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.618657 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.618670 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.618678 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:29Z","lastTransitionTime":"2025-10-09T15:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.721593 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.721642 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.721657 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.721681 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.721695 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:29Z","lastTransitionTime":"2025-10-09T15:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.824216 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.824266 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.824275 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.824290 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.824300 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:29Z","lastTransitionTime":"2025-10-09T15:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.926456 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.926488 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.926497 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.926510 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:29 crc kubenswrapper[4719]: I1009 15:19:29.926520 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:29Z","lastTransitionTime":"2025-10-09T15:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.029561 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.029629 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.029640 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.029677 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.029685 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:30Z","lastTransitionTime":"2025-10-09T15:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.132087 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.132121 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.132129 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.132143 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.132153 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:30Z","lastTransitionTime":"2025-10-09T15:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.160964 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:19:30 crc kubenswrapper[4719]: E1009 15:19:30.161102 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.162345 4719 scope.go:117] "RemoveContainer" containerID="da91bde07d2b150f89890e9c8e745bac9308b61aa757606dd242a48a1c24fd84" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.234189 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.234314 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.234406 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.234476 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.234530 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:30Z","lastTransitionTime":"2025-10-09T15:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.336947 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.336973 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.336984 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.337000 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.337011 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:30Z","lastTransitionTime":"2025-10-09T15:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.439247 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.439300 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.439321 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.439387 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.439412 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:30Z","lastTransitionTime":"2025-10-09T15:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.541864 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.541892 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.541899 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.541911 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.541920 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:30Z","lastTransitionTime":"2025-10-09T15:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.576074 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zv8jk_fea6a48c-769c-41bf-95ce-649cc31eb4e5/ovnkube-controller/2.log" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.577800 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" event={"ID":"fea6a48c-769c-41bf-95ce-649cc31eb4e5","Type":"ContainerStarted","Data":"4859b0f970ed0dea88b96ebd820f8f3806673c1ffff2ad8398b0934dec9535a8"} Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.584858 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.598866 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1789dfc1-aa86-4e27-ae75-b5078112f7fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff1066e8910a7aa889e7cc5c7b2735a240197a60b66c9471b4fda297dba4176f\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a73e5700d7dbee6fac767db433a82521b7af9241107369d3be4aa00593128763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a73e5700d7dbee6fac767db433a82521b7af9241107369d3be4aa00593128763\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:30Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.610770 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab52b5e80f5f2de90ce76b34b21de83b3880ed13436c566f2c460bed1908576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/sec
rets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:30Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.620547 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2199f3e31d7adde6f0b1aaf29a7f3da80a45d8a1f11908fd93b47d737b00872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\
"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:30Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.629458 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mtpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb84e765-e2c6-410b-9681-7c14d88a2537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be972d47f7ee97f2f54daa73198a83327281f9e9b2b1500205a17cf11518989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfpkf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mtpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:30Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.641307 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834f7996-d1ce-470d-a1a5-0de5da2460d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a1b9cee40ae4a30df34bde2f4dd9436cf3ff915293ea1e1431e8abd581423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37949ed51a379d34fab6bf766fd7e35d376af137b55b6f12e8bef8495ab5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d165d88c0d88fb4b080bf594e5258fb74f33c521332c85bb9f5ef5b5d9fdab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed935aaa4f5122234731f8c22ec3d4ffeba8b500bfb51bf97414f39438da2f68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:30Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.643853 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.643871 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.643879 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.643891 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.643902 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:30Z","lastTransitionTime":"2025-10-09T15:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.655955 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd31818-5445-47d9-af8f-fa49dde2a7ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7137edca40a10e85d3116f62b5dbe6ffea35d9473164173af2dea55f1794397c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd0662699e43951e6e139dbe8bb44c36a0120144c90a7f21010cbf68a2abcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4a7a60b0336fb0e1a046f59f9d60cc55a056b70959c7e6a33b6b15b879bd76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b628c71ffc4577dac4247fca1780e229a260bd382075e7eeb15d7f71fa688c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://b628c71ffc4577dac4247fca1780e229a260bd382075e7eeb15d7f71fa688c27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:30Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.669531 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcba5218f1503f2b3776c66a92350381ee11aee043429d72c70b7ae63d7bb29f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63d557f6902338a7aa577f2bbee6a159369d62be9724425a6e6a355f08586601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:30Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.687085 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c0cb44eacc810e970c6b32e259ae1841fb312f20576d34ac183089a91000337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2246a5642d4fa1b9e182af8a19980e6a76aea32cc9669e7d30185d6672435b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59a6c607affaa28a2c8af16a995f53baf008a1efd42061bb5e3c01b5acac636a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a911f9dd87ad57268bacc90fd4b3821f54d4ad91fcdde7066d3706aa8feb4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80fe00a302db3a637794464b7cccf806ad3fa8efbdaea15f965ea41276188d1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5228008f4bbd33c0b6ea86640368c02b6cdf301b43494a232b37fa73ea72e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4859b0f970ed0dea88b96ebd820f8f3806673c1ffff2ad8398b0934dec9535a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da91bde07d2b150f89890e9c8e745bac9308b61aa757606dd242a48a1c24fd84\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T15:19:04Z\\\",\\\"message\\\":\\\"2025-08-24T17:21:41Z]\\\\nI1009 15:19:04.075995 6407 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/community-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"d389393c-7ba9-422c-b3f5-06e391d537d2\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", 
ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/community-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.189\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1009 15:19:04.076046 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:19:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b32ef1116f7849b70aa3607bb4fc7b4bff9f58843c24742fc94aed9bb9a68e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:30Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.696473 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a62b8142b6b6fd0cf9028590f2abce788d8e381c2303d7a824dd055ab02b94db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:30Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.709328 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-58bdp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d00237ae-ca20-4202-8e24-e4988fbf5269\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-58bdp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:30Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:30 crc 
kubenswrapper[4719]: I1009 15:19:30.739100 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:30Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.745989 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.746026 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.746038 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.746056 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.746069 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:30Z","lastTransitionTime":"2025-10-09T15:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.750981 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:30Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.763126 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:30Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.773224 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:30Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.782193 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vdgtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b565dc-6ccc-4404-95f7-c8cf09f91802\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c126340dff33c7a571fc152c4c8ed154e104aaab937ba7f68070763d79825b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6df96c88745808317300d950f2d9916916955
76773b7de02958ec718445cc3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vdgtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:30Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.794372 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40166218-2855-45ef-b0e1-0fed4e3e2fde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01dc78fd80a15fa8151128108a351c6af42928695fdd745dea50e08fae6570ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc62bf1b49b2a4b402b2fcca31f9fe1663b36f463a0722a5876b2ca2a8e023ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51b618cce898bc89b4b07b6f7fd73567d719ad9c9dc3a2a3959074bc2c2fe11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6f2af57f612bf33446a88a0a093adb3b64f562412d9a0bd03f3964c281ba4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1009 15:18:34.791014 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 15:18:34.791138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:34.792247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2043225324/tls.crt::/tmp/serving-cert-2043225324/tls.key\\\\\\\"\\\\nI1009 15:18:35.029901 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 15:18:35.033427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 15:18:35.033448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 15:18:35.033473 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 15:18:35.033481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 15:18:35.045206 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 15:18:35.045257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045266 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045273 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 15:18:35.045277 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 15:18:35.045280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 15:18:35.045285 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1009 15:18:35.045414 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1009 15:18:35.048459 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8544f7060b0b2c2885dcbdffbd744be5f028d8df543732ba79eb7cd3911afca6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6
e6ccf838dc9453d8854f1521622537d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:30Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.805027 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ae19d921bad282d96efffc7f2f7cfdc4b70f95932e69b9955ad1439a936d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b3908283c24f180df8f6a04d52c46e7252cdfd
4f0587f7cccf3e9a0f37127a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:30Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.819290 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcd9a40e12b42902018ecf483e6b42dfa415e4d6e282fc57eacbf507922dbd45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://275fc
3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:30Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.832817 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201751e1a01c1fefb61309835c66a89743c507dff1e0d6e75a5ecf3447831840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2025-10-09T15:19:22Z\\\",\\\"message\\\":\\\"2025-10-09T15:18:36+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6cb1650a-7cd1-47ae-8a33-6737992732d4\\\\n2025-10-09T15:18:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6cb1650a-7cd1-47ae-8a33-6737992732d4 to /host/opt/cni/bin/\\\\n2025-10-09T15:18:37Z [verbose] multus-daemon started\\\\n2025-10-09T15:18:37Z [verbose] Readiness Indicator file check\\\\n2025-10-09T15:19:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:30Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.849275 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.849320 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.849331 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.849365 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.849380 4719 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:30Z","lastTransitionTime":"2025-10-09T15:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.951391 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.951441 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.951452 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.951470 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:30 crc kubenswrapper[4719]: I1009 15:19:30.951482 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:30Z","lastTransitionTime":"2025-10-09T15:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.053950 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.053988 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.053996 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.054009 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.054018 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:31Z","lastTransitionTime":"2025-10-09T15:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.156582 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.156631 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.156642 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.156659 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.156672 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:31Z","lastTransitionTime":"2025-10-09T15:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.160180 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.160218 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:19:31 crc kubenswrapper[4719]: E1009 15:19:31.160798 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.160568 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:19:31 crc kubenswrapper[4719]: E1009 15:19:31.160881 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:19:31 crc kubenswrapper[4719]: E1009 15:19:31.160970 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.258610 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.258653 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.258671 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.258691 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.258706 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:31Z","lastTransitionTime":"2025-10-09T15:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.360920 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.360974 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.360990 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.361009 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.361022 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:31Z","lastTransitionTime":"2025-10-09T15:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.463680 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.463760 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.463784 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.463815 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.463843 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:31Z","lastTransitionTime":"2025-10-09T15:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.566150 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.566435 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.566461 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.566489 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.566510 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:31Z","lastTransitionTime":"2025-10-09T15:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.581490 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zv8jk_fea6a48c-769c-41bf-95ce-649cc31eb4e5/ovnkube-controller/3.log" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.581946 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zv8jk_fea6a48c-769c-41bf-95ce-649cc31eb4e5/ovnkube-controller/2.log" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.584162 4719 generic.go:334] "Generic (PLEG): container finished" podID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerID="4859b0f970ed0dea88b96ebd820f8f3806673c1ffff2ad8398b0934dec9535a8" exitCode=1 Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.584199 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" event={"ID":"fea6a48c-769c-41bf-95ce-649cc31eb4e5","Type":"ContainerDied","Data":"4859b0f970ed0dea88b96ebd820f8f3806673c1ffff2ad8398b0934dec9535a8"} Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.584247 4719 scope.go:117] "RemoveContainer" containerID="da91bde07d2b150f89890e9c8e745bac9308b61aa757606dd242a48a1c24fd84" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.586964 4719 scope.go:117] "RemoveContainer" containerID="4859b0f970ed0dea88b96ebd820f8f3806673c1ffff2ad8398b0934dec9535a8" Oct 09 15:19:31 crc kubenswrapper[4719]: E1009 15:19:31.588008 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zv8jk_openshift-ovn-kubernetes(fea6a48c-769c-41bf-95ce-649cc31eb4e5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.601769 4719 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40166218-2855-45ef-b0e1-0fed4e3e2fde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01dc78fd80a15fa8151128108a351c6af42928695fdd745dea50e08fae6570ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc62bf1b49b2a4b402b2fcca31f9fe1663b36f463a0722a5876b2ca2a8e023ca\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51b618cce898bc89b4b07b6f7fd73567d719ad9c9dc3a2a3959074bc2c2fe11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6f2af57f612bf33446a88a0a093adb3b64f562412d9a0bd03f3964c281ba4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f
6d8726c076fa8ea3a1d30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 15:18:34.791014 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 15:18:34.791138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:34.792247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2043225324/tls.crt::/tmp/serving-cert-2043225324/tls.key\\\\\\\"\\\\nI1009 15:18:35.029901 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 15:18:35.033427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 15:18:35.033448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 15:18:35.033473 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 15:18:35.033481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 15:18:35.045206 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 15:18:35.045257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045266 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045273 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 15:18:35.045277 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 15:18:35.045280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 15:18:35.045285 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nI1009 15:18:35.045414 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1009 15:18:35.048459 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8544f7060b0b2c2885dcbdffbd744be5f028d8df543732ba79eb7cd3911afca6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"s
tate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:31Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.619304 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ae19d921bad282d96efffc7f2f7cfdc4b70f95932e69b9955ad1439a936d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b3908283c24f180df8f6a04d52c46e7252cdfd
4f0587f7cccf3e9a0f37127a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:31Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.641202 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcd9a40e12b42902018ecf483e6b42dfa415e4d6e282fc57eacbf507922dbd45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://275fc
3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:31Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.653455 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201751e1a01c1fefb61309835c66a89743c507dff1e0d6e75a5ecf3447831840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2025-10-09T15:19:22Z\\\",\\\"message\\\":\\\"2025-10-09T15:18:36+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6cb1650a-7cd1-47ae-8a33-6737992732d4\\\\n2025-10-09T15:18:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6cb1650a-7cd1-47ae-8a33-6737992732d4 to /host/opt/cni/bin/\\\\n2025-10-09T15:18:37Z [verbose] multus-daemon started\\\\n2025-10-09T15:18:37Z [verbose] Readiness Indicator file check\\\\n2025-10-09T15:19:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:31Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.664882 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vdgtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b565dc-6ccc-4404-95f7-c8cf09f91802\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c126340dff33c7a571fc152c4c8ed154e104aaab937ba7f68070763d79825b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6df96c88745808317300d950f2d9916916955
76773b7de02958ec718445cc3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vdgtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:31Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.668532 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.668565 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.668577 4719 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.668592 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.668603 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:31Z","lastTransitionTime":"2025-10-09T15:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.676093 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1789dfc1-aa86-4e27-ae75-b5078112f7fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff1066e8910a7aa889e7cc5c7b2735a240197a60b66c9471b4fda297dba4176f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a73e5700d7dbee6fac767db433a82521b7af9241107369d3be4aa00593128763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a73e5700d7dbee6fac767db433a82521b7af9241107369d3be4aa00593128763\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:31Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.696155 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab52b5e80f5f2de90ce76b34b21de83b3880ed13436c566f2c460bed1908576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:31Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.710609 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2199f3e31d7adde6f0b1aaf29a7f3da80a45d8a1f11908fd93b47d737b00872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:31Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.727393 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"834f7996-d1ce-470d-a1a5-0de5da2460d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a1b9cee40ae4a30df34bde2f4dd9436cf3ff915293ea1e1431e8abd581423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37949ed51a379d34fab6bf766fd7e35d376af137b55b6f12e8bef8495ab5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d165d88c0d88fb4b080bf594e5258fb74f33c521332c85bb9f5ef5b5d9fdab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed935aaa4f5122234731f8c22ec3d4ffeba8b500bfb51bf97414f39438da2f68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:31Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.741982 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd31818-5445-47d9-af8f-fa49dde2a7ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7137edca40a10e85d3116f62b5dbe6ffea35d9473164173af2dea55f1794397c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd0662699e43951e6e139dbe8bb44c36a0120144c90a7f21010cbf68a2abcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4a7a60b0336fb0e1a046f59f9d60cc55a056b70959c7e6a33b6b15b879bd76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b628c71ffc4577dac4247fca1780e229a260bd382075e7eeb15d7f71fa688c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://b628c71ffc4577dac4247fca1780e229a260bd382075e7eeb15d7f71fa688c27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:31Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.756967 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcba5218f1503f2b3776c66a92350381ee11aee043429d72c70b7ae63d7bb29f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63d557f6902338a7aa577f2bbee6a159369d62be9724425a6e6a355f08586601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:31Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.771197 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.771258 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.771274 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.771296 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.771312 4719 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:31Z","lastTransitionTime":"2025-10-09T15:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.779479 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c0cb44eacc810e970c6b32e259ae1841fb312f20576d34ac183089a91000337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2246a5642d4fa1b9e182af8a19980e6a76aea32cc9669e7d30185d6672435b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59a6c607affaa28a2c8af16a995f53baf008a1efd42061bb5e3c01b5acac636a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a911f9dd87ad57268bacc90fd4b3821f54d4ad91fcdde7066d3706aa8feb4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80fe00a302db3a637794464b7cccf806ad3fa8efbdaea15f965ea41276188d1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5228008f4bbd33c0b6ea86640368c02b6cdf301b43494a232b37fa73ea72e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4859b0f970ed0dea88b96ebd820f8f3806673c1ffff2ad8398b0934dec9535a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da91bde07d2b150f89890e9c8e745bac9308b61aa757606dd242a48a1c24fd84\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T15:19:04Z\\\",\\\"message\\\":\\\"2025-08-24T17:21:41Z]\\\\nI1009 15:19:04.075995 6407 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/community-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"d389393c-7ba9-422c-b3f5-06e391d537d2\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", 
ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/community-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.189\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1009 15:19:04.076046 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:19:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4859b0f970ed0dea88b96ebd820f8f3806673c1ffff2ad8398b0934dec9535a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T15:19:30Z\\\",\\\"message\\\":\\\"766 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1009 15:19:30.885977 6766 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1009 15:19:30.886090 6766 
reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1009 15:19:30.886123 6766 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1009 15:19:30.886425 6766 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1009 15:19:30.886481 6766 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1009 15:19:30.886521 6766 factory.go:656] Stopping watch factory\\\\nI1009 15:19:30.886562 6766 handler.go:208] Removed *v1.Node event handler 2\\\\nI1009 15:19:30.886590 6766 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1009 15:19:30.896640 6766 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1009 15:19:30.896672 6766 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1009 15:19:30.896720 6766 ovnkube.go:599] Stopped ovnkube\\\\nI1009 15:19:30.896740 6766 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1009 15:19:30.896821 6766 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:19:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b32ef1116f7849b70aa3607bb4fc7b4bff9f58843c24742fc94aed9bb9a68e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:31Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.790846 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mtpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb84e765-e2c6-410b-9681-7c14d88a2537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be972d47f7ee97f2f54daa73198a83327281f9e9b2b1500205a17cf11518989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfpkf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mtpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:31Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.819232 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15
:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:31Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.834271 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:31Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.847170 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:31Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.860709 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:31Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.870923 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a62b8142b6b6fd0cf9028590f2abce788d8e381c2303d7a824dd055ab02b94db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:31Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.873800 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.873850 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.873862 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.873882 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.873894 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:31Z","lastTransitionTime":"2025-10-09T15:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.880737 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-58bdp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d00237ae-ca20-4202-8e24-e4988fbf5269\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-58bdp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:31Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:31 crc 
kubenswrapper[4719]: I1009 15:19:31.976199 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.976262 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.976275 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.976292 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:31 crc kubenswrapper[4719]: I1009 15:19:31.976304 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:31Z","lastTransitionTime":"2025-10-09T15:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.078845 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.078890 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.078901 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.078918 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.078929 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:32Z","lastTransitionTime":"2025-10-09T15:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.161136 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:19:32 crc kubenswrapper[4719]: E1009 15:19:32.161323 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.181004 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.181056 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.181064 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.181079 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.181106 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:32Z","lastTransitionTime":"2025-10-09T15:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.283163 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.283197 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.283204 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.283217 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.283225 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:32Z","lastTransitionTime":"2025-10-09T15:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.304803 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.304842 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.304853 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.304869 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.304881 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:32Z","lastTransitionTime":"2025-10-09T15:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:32 crc kubenswrapper[4719]: E1009 15:19:32.323619 4719 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d273987-9d8a-4a77-9956-ccb64e9e22c3\\\",\\\"systemUUID\\\":\\\"d18dc188-15d4-4547-94df-d9149082a3a0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:32Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.327240 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.327285 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.327295 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.327307 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.327317 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:32Z","lastTransitionTime":"2025-10-09T15:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:32 crc kubenswrapper[4719]: E1009 15:19:32.343034 4719 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d273987-9d8a-4a77-9956-ccb64e9e22c3\\\",\\\"systemUUID\\\":\\\"d18dc188-15d4-4547-94df-d9149082a3a0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:32Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.346220 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.346268 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.346278 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.346293 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.346304 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:32Z","lastTransitionTime":"2025-10-09T15:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:32 crc kubenswrapper[4719]: E1009 15:19:32.362926 4719 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d273987-9d8a-4a77-9956-ccb64e9e22c3\\\",\\\"systemUUID\\\":\\\"d18dc188-15d4-4547-94df-d9149082a3a0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:32Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.367222 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.367271 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.367282 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.367298 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.367311 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:32Z","lastTransitionTime":"2025-10-09T15:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:32 crc kubenswrapper[4719]: E1009 15:19:32.379789 4719 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d273987-9d8a-4a77-9956-ccb64e9e22c3\\\",\\\"systemUUID\\\":\\\"d18dc188-15d4-4547-94df-d9149082a3a0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:32Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.383053 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.383087 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.383097 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.383113 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.383125 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:32Z","lastTransitionTime":"2025-10-09T15:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:32 crc kubenswrapper[4719]: E1009 15:19:32.398869 4719 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d273987-9d8a-4a77-9956-ccb64e9e22c3\\\",\\\"systemUUID\\\":\\\"d18dc188-15d4-4547-94df-d9149082a3a0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:32Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:32 crc kubenswrapper[4719]: E1009 15:19:32.399001 4719 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.400600 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.400652 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.400669 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.400696 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.400713 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:32Z","lastTransitionTime":"2025-10-09T15:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.503391 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.503442 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.503456 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.503475 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.503486 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:32Z","lastTransitionTime":"2025-10-09T15:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.589933 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zv8jk_fea6a48c-769c-41bf-95ce-649cc31eb4e5/ovnkube-controller/3.log" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.594658 4719 scope.go:117] "RemoveContainer" containerID="4859b0f970ed0dea88b96ebd820f8f3806673c1ffff2ad8398b0934dec9535a8" Oct 09 15:19:32 crc kubenswrapper[4719]: E1009 15:19:32.594908 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zv8jk_openshift-ovn-kubernetes(fea6a48c-769c-41bf-95ce-649cc31eb4e5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.605486 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.605542 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.605559 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.605580 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.605595 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:32Z","lastTransitionTime":"2025-10-09T15:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.606682 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1789dfc1-aa86-4e27-ae75-b5078112f7fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff1066e8910a7aa889e7cc5c7b2735a240197a60b66c9471b4fda297dba4176f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\
\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a73e5700d7dbee6fac767db433a82521b7af9241107369d3be4aa00593128763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a73e5700d7dbee6fac767db433a82521b7af9241107369d3be4aa00593128763\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:32Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.623906 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab52b5e80f5f2de90ce76b34b21de83b3880ed13436c566f2c460bed1908576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:32Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.637956 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2199f3e31d7adde6f0b1aaf29a7f3da80a45d8a1f11908fd93b47d737b00872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:32Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.651184 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834f7996-d1ce-470d-a1a5-0de5da2460d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a1b9cee40ae4a30df34bde2f4dd9436cf3ff915293ea1e1431e8abd581423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"rest
artCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37949ed51a379d34fab6bf766fd7e35d376af137b55b6f12e8bef8495ab5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d165d88c0d88fb4b080bf594e5258fb74f33c521332c85bb9f5ef5b5d9fdab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed935aaa4f5122234731f8c22ec3d4ffeba8b500bfb51bf97414f39438da2f68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:32Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.667687 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd31818-5445-47d9-af8f-fa49dde2a7ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7137edca40a10e85d3116f62b5dbe6ffea35d9473164173af2dea55f1794397c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd0662699e43951e6e139dbe8bb44c36a0120144c90a7f21010cbf68a2abcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4a7a60b0336fb0e1a046f59f9d60cc55a056b70959c7e6a33b6b15b879bd76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b628c71ffc4577dac4247fca1780e229a260bd382075e7eeb15d7f71fa688c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://b628c71ffc4577dac4247fca1780e229a260bd382075e7eeb15d7f71fa688c27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:32Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.679934 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcba5218f1503f2b3776c66a92350381ee11aee043429d72c70b7ae63d7bb29f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63d557f6902338a7aa577f2bbee6a159369d62be9724425a6e6a355f08586601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:32Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.703383 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c0cb44eacc810e970c6b32e259ae1841fb312f20576d34ac183089a91000337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2246a5642d4fa1b9e182af8a19980e6a76aea32cc9669e7d30185d6672435b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59a6c607affaa28a2c8af16a995f53baf008a1efd42061bb5e3c01b5acac636a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a911f9dd87ad57268bacc90fd4b3821f54d4ad91fcdde7066d3706aa8feb4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80fe00a302db3a637794464b7cccf806ad3fa8efbdaea15f965ea41276188d1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5228008f4bbd33c0b6ea86640368c02b6cdf301b43494a232b37fa73ea72e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4859b0f970ed0dea88b96ebd820f8f3806673c1ffff2ad8398b0934dec9535a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4859b0f970ed0dea88b96ebd820f8f3806673c1ffff2ad8398b0934dec9535a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T15:19:30Z\\\",\\\"message\\\":\\\"766 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1009 15:19:30.885977 6766 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1009 15:19:30.886090 6766 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1009 15:19:30.886123 6766 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1009 15:19:30.886425 6766 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1009 15:19:30.886481 6766 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1009 15:19:30.886521 6766 factory.go:656] Stopping watch factory\\\\nI1009 15:19:30.886562 6766 handler.go:208] Removed *v1.Node event handler 2\\\\nI1009 15:19:30.886590 6766 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1009 15:19:30.896640 6766 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1009 15:19:30.896672 6766 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1009 15:19:30.896720 6766 ovnkube.go:599] Stopped ovnkube\\\\nI1009 15:19:30.896740 6766 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1009 15:19:30.896821 6766 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:19:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zv8jk_openshift-ovn-kubernetes(fea6a48c-769c-41bf-95ce-649cc31eb4e5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b32ef1116f7849b70aa3607bb4fc7b4bff9f58843c24742fc94aed9bb9a68e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9
692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:32Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.707638 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.707704 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.707727 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.707756 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.707777 4719 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:32Z","lastTransitionTime":"2025-10-09T15:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.713552 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mtpbz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb84e765-e2c6-410b-9681-7c14d88a2537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be972d47f7ee97f2f54daa73198a83327281f9e9b2b1500205a17cf11518989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfpkf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mtpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:32Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.730227 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:32Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.744944 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:32Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.757684 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:32Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.769064 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:32Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.778740 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a62b8142b6b6fd0cf9028590f2abce788d8e381c2303d7a824dd055ab02b94db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:32Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.788083 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-58bdp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d00237ae-ca20-4202-8e24-e4988fbf5269\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-58bdp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:32Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:32 crc 
kubenswrapper[4719]: I1009 15:19:32.800581 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40166218-2855-45ef-b0e1-0fed4e3e2fde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01dc78fd80a15fa8151128108a351c6af42928695fdd745dea50e08fae6570ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc62bf1b49b2a4
b402b2fcca31f9fe1663b36f463a0722a5876b2ca2a8e023ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51b618cce898bc89b4b07b6f7fd73567d719ad9c9dc3a2a3959074bc2c2fe11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6f2af57f612bf33446a88a0a093adb3b64f562412d9a0bd03f3964c281ba4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 15:18:34.791014 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 15:18:34.791138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:34.792247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2043225324/tls.crt::/tmp/serving-cert-2043225324/tls.key\\\\\\\"\\\\nI1009 15:18:35.029901 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 15:18:35.033427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 15:18:35.033448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 15:18:35.033473 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 15:18:35.033481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 15:18:35.045206 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 15:18:35.045257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045266 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045273 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 15:18:35.045277 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 15:18:35.045280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 
15:18:35.045285 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1009 15:18:35.045414 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1009 15:18:35.048459 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8544f7060b0b2c2885dcbdffbd744be5f028d8df543732ba79eb7cd3911afca6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:32Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.810055 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.810103 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.810113 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.810130 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.810142 4719 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:32Z","lastTransitionTime":"2025-10-09T15:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.810838 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ae19d921bad282d96efffc7f2f7cfdc4b70f95932e69b9955ad1439a936d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b3908283c24f180df8f6a04d52c46e7252cdfd4f0587f7cccf3e9a0f37127a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-09T15:19:32Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.824438 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcd9a40e12b42902018ecf483e6b42dfa415e4d6e282fc57eacbf507922dbd45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:32Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.835683 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201751e1a01c1fefb61309835c66a89743c507dff1e0d6e75a5ecf3447831840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T15:19:22Z\\\",\\\"message\\\":\\\"2025-10-09T15:18:36+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6cb1650a-7cd1-47ae-8a33-6737992732d4\\\\n2025-10-09T15:18:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6cb1650a-7cd1-47ae-8a33-6737992732d4 to /host/opt/cni/bin/\\\\n2025-10-09T15:18:37Z [verbose] multus-daemon started\\\\n2025-10-09T15:18:37Z [verbose] Readiness Indicator file check\\\\n2025-10-09T15:19:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"na
me\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:32Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.845472 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vdgtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b565dc-6ccc-4404-95f7-c8cf09f91802\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c126340dff33c7a571fc152c4c8ed154e104aaab937ba7f68070763d79825b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6df96c88745808317300d950f2d9916916955
76773b7de02958ec718445cc3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vdgtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:32Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.912450 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.912501 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.912513 4719 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.912532 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:32 crc kubenswrapper[4719]: I1009 15:19:32.912540 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:32Z","lastTransitionTime":"2025-10-09T15:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.014230 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.014271 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.014280 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.014295 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.014304 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:33Z","lastTransitionTime":"2025-10-09T15:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.116544 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.116591 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.116607 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.116624 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.116635 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:33Z","lastTransitionTime":"2025-10-09T15:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.160559 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.160598 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.160600 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:19:33 crc kubenswrapper[4719]: E1009 15:19:33.160699 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:19:33 crc kubenswrapper[4719]: E1009 15:19:33.160787 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:19:33 crc kubenswrapper[4719]: E1009 15:19:33.160880 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.218496 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.218530 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.218538 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.218551 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.218560 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:33Z","lastTransitionTime":"2025-10-09T15:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.321383 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.321486 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.321531 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.321593 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.321604 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:33Z","lastTransitionTime":"2025-10-09T15:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.425323 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.425394 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.425406 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.425427 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.425438 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:33Z","lastTransitionTime":"2025-10-09T15:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.528620 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.528666 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.528675 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.528692 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.528704 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:33Z","lastTransitionTime":"2025-10-09T15:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.631151 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.631234 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.631258 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.631287 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.631309 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:33Z","lastTransitionTime":"2025-10-09T15:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.733315 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.733372 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.733387 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.733404 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.733416 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:33Z","lastTransitionTime":"2025-10-09T15:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.837346 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.837409 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.837420 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.837434 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.837444 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:33Z","lastTransitionTime":"2025-10-09T15:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.941166 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.941220 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.941242 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.941270 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:33 crc kubenswrapper[4719]: I1009 15:19:33.941289 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:33Z","lastTransitionTime":"2025-10-09T15:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.045087 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.045147 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.045159 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.045183 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.045201 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:34Z","lastTransitionTime":"2025-10-09T15:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.148071 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.148123 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.148136 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.148152 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.148164 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:34Z","lastTransitionTime":"2025-10-09T15:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.160577 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:19:34 crc kubenswrapper[4719]: E1009 15:19:34.160703 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.251889 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.251930 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.251939 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.251954 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.251963 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:34Z","lastTransitionTime":"2025-10-09T15:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.355342 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.355393 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.355404 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.355421 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.355431 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:34Z","lastTransitionTime":"2025-10-09T15:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.459407 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.459482 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.459504 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.459535 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.459560 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:34Z","lastTransitionTime":"2025-10-09T15:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.562638 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.562689 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.562698 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.562717 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.562728 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:34Z","lastTransitionTime":"2025-10-09T15:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.666713 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.666764 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.666781 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.666807 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.666827 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:34Z","lastTransitionTime":"2025-10-09T15:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.769393 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.769425 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.769433 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.769445 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.769453 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:34Z","lastTransitionTime":"2025-10-09T15:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.871327 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.871384 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.871397 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.871414 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.871425 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:34Z","lastTransitionTime":"2025-10-09T15:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.973108 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.973155 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.973167 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.973183 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:34 crc kubenswrapper[4719]: I1009 15:19:34.973191 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:34Z","lastTransitionTime":"2025-10-09T15:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.075542 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.075587 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.075596 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.075609 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.075621 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:35Z","lastTransitionTime":"2025-10-09T15:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.160247 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.160314 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.160255 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:19:35 crc kubenswrapper[4719]: E1009 15:19:35.160583 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:19:35 crc kubenswrapper[4719]: E1009 15:19:35.160706 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:19:35 crc kubenswrapper[4719]: E1009 15:19:35.160859 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.176636 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:35Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.177823 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.177849 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.177861 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.177877 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.177889 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:35Z","lastTransitionTime":"2025-10-09T15:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.189787 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:35Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.198683 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a62b8142b6b6fd0cf9028590f2abce788d8e381c2303d7a824dd055ab02b94db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:35Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.208804 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-58bdp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d00237ae-ca20-4202-8e24-e4988fbf5269\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-58bdp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:35Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:35 crc 
kubenswrapper[4719]: I1009 15:19:35.226696 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:35Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.240177 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:35Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.255479 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcd9a40e12b42902018ecf483e6b42dfa415e4d6e282fc57eacbf507922dbd45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://275fc
3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:35Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.268007 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201751e1a01c1fefb61309835c66a89743c507dff1e0d6e75a5ecf3447831840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2025-10-09T15:19:22Z\\\",\\\"message\\\":\\\"2025-10-09T15:18:36+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6cb1650a-7cd1-47ae-8a33-6737992732d4\\\\n2025-10-09T15:18:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6cb1650a-7cd1-47ae-8a33-6737992732d4 to /host/opt/cni/bin/\\\\n2025-10-09T15:18:37Z [verbose] multus-daemon started\\\\n2025-10-09T15:18:37Z [verbose] Readiness Indicator file check\\\\n2025-10-09T15:19:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:35Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.278285 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vdgtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b565dc-6ccc-4404-95f7-c8cf09f91802\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c126340dff33c7a571fc152c4c8ed154e104aaab937ba7f68070763d79825b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6df96c88745808317300d950f2d9916916955
76773b7de02958ec718445cc3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vdgtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:35Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.279861 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.279889 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.279899 4719 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.279911 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.279920 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:35Z","lastTransitionTime":"2025-10-09T15:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.290174 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40166218-2855-45ef-b0e1-0fed4e3e2fde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01dc78fd80a15fa8151128108a351c6af42928695fdd745dea50e08fae6570ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35
825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc62bf1b49b2a4b402b2fcca31f9fe1663b36f463a0722a5876b2ca2a8e023ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51b618cce898bc89b4b07b6f7fd73567d719ad9c9dc3a2a3959074bc2c2fe11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6f2af57f612bf33446a88a0a093adb3b64f562412d9a0bd03f3964c281ba4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 15:18:34.791014 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 15:18:34.791138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:34.792247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2043225324/tls.crt::/tmp/serving-cert-2043225324/tls.key\\\\\\\"\\\\nI1009 15:18:35.029901 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 15:18:35.033427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 15:18:35.033448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 15:18:35.033473 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 15:18:35.033481 1 maxinflight.go:120] 
\\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 15:18:35.045206 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 15:18:35.045257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045266 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045273 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 15:18:35.045277 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 15:18:35.045280 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 15:18:35.045285 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1009 15:18:35.045414 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1009 15:18:35.048459 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8544f7060b0b2c2885dcbdffbd744be5f028d8df543732ba79eb7cd3911afca6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:35Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.299545 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ae19d921bad282d96efffc7f2f7cfdc4b70f95932e69b9955ad1439a936d5c\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b3908283c24f180df8f6a04d52c46e7252cdfd4f0587f7cccf3e9a0f37127a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:35Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.309128 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2199f3e31d7adde6f0b1aaf29a7f3da80a45d8a1f11908fd93b47d737b00872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\
\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:35Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.318289 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1789dfc1-aa86-4e27-ae75-b5078112f7fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff1066e8910a7aa889e7cc5c7b2735a240197a60b66c9471b4fda297dba4176f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f1
2962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a73e5700d7dbee6fac767db433a82521b7af9241107369d3be4aa00593128763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a73e5700d7dbee6fac767db433a82521b7af9241107369d3be4aa00593128763\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:35Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.328141 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab52b5e80f5f2de90ce76b34b21de83b3880ed13436c566f2c460bed1908576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:35Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.339837 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcba5218f1503f2b3776c66a92350381ee11aee043429d72c70b7ae63d7bb29f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"
name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63d557f6902338a7aa577f2bbee6a159369d62be9724425a6e6a355f08586601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:35Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.356220 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c0cb44eacc810e970c6b32e259ae1841fb312f20576d34ac183089a91000337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2246a5642d4fa1b9e182af8a19980e6a76aea32cc9669e7d30185d6672435b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59a6c607affaa28a2c8af16a995f53baf008a1efd42061bb5e3c01b5acac636a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a911f9dd87ad57268bacc90fd4b3821f54d4ad91fcdde7066d3706aa8feb4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80fe00a302db3a637794464b7cccf806ad3fa8efbdaea15f965ea41276188d1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5228008f4bbd33c0b6ea86640368c02b6cdf301b43494a232b37fa73ea72e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4859b0f970ed0dea88b96ebd820f8f3806673c1ffff2ad8398b0934dec9535a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4859b0f970ed0dea88b96ebd820f8f3806673c1ffff2ad8398b0934dec9535a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T15:19:30Z\\\",\\\"message\\\":\\\"766 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1009 15:19:30.885977 6766 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1009 15:19:30.886090 6766 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1009 15:19:30.886123 6766 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1009 15:19:30.886425 6766 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1009 15:19:30.886481 6766 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1009 15:19:30.886521 6766 factory.go:656] Stopping watch factory\\\\nI1009 15:19:30.886562 6766 handler.go:208] Removed *v1.Node event handler 2\\\\nI1009 15:19:30.886590 6766 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1009 15:19:30.896640 6766 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1009 15:19:30.896672 6766 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1009 15:19:30.896720 6766 ovnkube.go:599] Stopped ovnkube\\\\nI1009 15:19:30.896740 6766 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1009 15:19:30.896821 6766 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:19:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zv8jk_openshift-ovn-kubernetes(fea6a48c-769c-41bf-95ce-649cc31eb4e5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b32ef1116f7849b70aa3607bb4fc7b4bff9f58843c24742fc94aed9bb9a68e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9
692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:35Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.366081 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mtpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb84e765-e2c6-410b-9681-7c14d88a2537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be972d47f7ee97f2f54daa73198a83327281f9e9b2b1500205a17cf11518989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfpkf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mtpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:35Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.379142 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834f7996-d1ce-470d-a1a5-0de5da2460d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a1b9cee40ae4a30df34bde2f4dd9436cf3ff915293ea1e1431e8abd581423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37949ed51a379d34fab6bf766fd7e35d376af137b55b6f12e8bef8495ab5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d165d88c0d88fb4b080bf594e5258fb74f33c521332c85bb9f5ef5b5d9fdab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed935aaa4f5122234731f8c22ec3d4ffeba8b500bfb51bf97414f39438da2f68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:35Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.382168 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.382206 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.382218 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.382231 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.382241 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:35Z","lastTransitionTime":"2025-10-09T15:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.392660 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd31818-5445-47d9-af8f-fa49dde2a7ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7137edca40a10e85d3116f62b5dbe6ffea35d9473164173af2dea55f1794397c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd0662699e43951e6e139dbe8bb44c36a0120144c90a7f21010cbf68a2abcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4a7a60b0336fb0e1a046f59f9d60cc55a056b70959c7e6a33b6b15b879bd76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b628c71ffc4577dac4247fca1780e229a260bd382075e7eeb15d7f71fa688c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://b628c71ffc4577dac4247fca1780e229a260bd382075e7eeb15d7f71fa688c27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:35Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.485069 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.485098 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.485107 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.485123 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.485132 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:35Z","lastTransitionTime":"2025-10-09T15:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.587602 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.587637 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.587649 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.587666 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.587678 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:35Z","lastTransitionTime":"2025-10-09T15:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.689917 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.689940 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.689947 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.689959 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.689967 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:35Z","lastTransitionTime":"2025-10-09T15:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.792256 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.792290 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.792301 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.792315 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.792324 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:35Z","lastTransitionTime":"2025-10-09T15:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.894248 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.894282 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.894293 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.894308 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.894318 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:35Z","lastTransitionTime":"2025-10-09T15:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.996957 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.996989 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.997001 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.997016 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:35 crc kubenswrapper[4719]: I1009 15:19:35.997027 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:35Z","lastTransitionTime":"2025-10-09T15:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.099737 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.099788 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.099797 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.099812 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.099822 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:36Z","lastTransitionTime":"2025-10-09T15:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.161014 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:19:36 crc kubenswrapper[4719]: E1009 15:19:36.161159 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.201866 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.201899 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.201907 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.201922 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.201931 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:36Z","lastTransitionTime":"2025-10-09T15:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.304953 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.305001 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.305012 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.305027 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.305038 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:36Z","lastTransitionTime":"2025-10-09T15:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.407641 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.407678 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.407688 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.407704 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.407713 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:36Z","lastTransitionTime":"2025-10-09T15:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.510105 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.510152 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.510161 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.510174 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.510184 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:36Z","lastTransitionTime":"2025-10-09T15:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.612446 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.612478 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.612486 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.612499 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.612508 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:36Z","lastTransitionTime":"2025-10-09T15:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.715569 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.715625 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.715647 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.715681 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.715703 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:36Z","lastTransitionTime":"2025-10-09T15:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.818416 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.818457 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.818467 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.818484 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.818493 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:36Z","lastTransitionTime":"2025-10-09T15:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.920734 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.920781 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.920792 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.920809 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:36 crc kubenswrapper[4719]: I1009 15:19:36.920823 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:36Z","lastTransitionTime":"2025-10-09T15:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.023797 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.023839 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.023850 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.023875 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.023887 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:37Z","lastTransitionTime":"2025-10-09T15:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.126338 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.126392 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.126402 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.126418 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.126429 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:37Z","lastTransitionTime":"2025-10-09T15:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.160291 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.160392 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:19:37 crc kubenswrapper[4719]: E1009 15:19:37.160452 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.160489 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:19:37 crc kubenswrapper[4719]: E1009 15:19:37.160631 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:19:37 crc kubenswrapper[4719]: E1009 15:19:37.160710 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.228238 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.228272 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.228280 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.228293 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.228302 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:37Z","lastTransitionTime":"2025-10-09T15:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.332166 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.332234 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.332246 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.332264 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.332280 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:37Z","lastTransitionTime":"2025-10-09T15:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.434422 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.434467 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.434478 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.434493 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.434504 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:37Z","lastTransitionTime":"2025-10-09T15:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.537169 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.537202 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.537210 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.537224 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.537232 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:37Z","lastTransitionTime":"2025-10-09T15:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.639944 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.640062 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.640094 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.640123 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.640143 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:37Z","lastTransitionTime":"2025-10-09T15:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.742618 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.742668 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.742681 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.742698 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.742711 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:37Z","lastTransitionTime":"2025-10-09T15:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.845264 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.845320 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.845338 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.845384 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.845404 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:37Z","lastTransitionTime":"2025-10-09T15:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.947733 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.947821 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.947835 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.947853 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:37 crc kubenswrapper[4719]: I1009 15:19:37.947880 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:37Z","lastTransitionTime":"2025-10-09T15:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.050102 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.050128 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.050136 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.050148 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.050156 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:38Z","lastTransitionTime":"2025-10-09T15:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.152691 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.152937 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.152954 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.152971 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.152981 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:38Z","lastTransitionTime":"2025-10-09T15:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.160271 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:19:38 crc kubenswrapper[4719]: E1009 15:19:38.160417 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.256315 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.256644 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.256653 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.256666 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.256677 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:38Z","lastTransitionTime":"2025-10-09T15:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.359513 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.359612 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.359626 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.359643 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.359655 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:38Z","lastTransitionTime":"2025-10-09T15:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.461262 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.461334 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.461365 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.461382 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.461394 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:38Z","lastTransitionTime":"2025-10-09T15:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.563568 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.563603 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.563612 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.563626 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.563635 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:38Z","lastTransitionTime":"2025-10-09T15:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.665993 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.666034 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.666044 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.666059 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.666068 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:38Z","lastTransitionTime":"2025-10-09T15:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.768630 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.768705 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.768728 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.768755 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.768771 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:38Z","lastTransitionTime":"2025-10-09T15:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.871867 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.871906 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.871917 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.871934 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.871947 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:38Z","lastTransitionTime":"2025-10-09T15:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.937713 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:19:38 crc kubenswrapper[4719]: E1009 15:19:38.937929 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-09 15:20:42.937899168 +0000 UTC m=+148.447610453 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.938010 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.938060 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:19:38 crc kubenswrapper[4719]: E1009 15:19:38.938151 4719 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 15:19:38 crc kubenswrapper[4719]: E1009 15:19:38.938177 4719 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 15:19:38 crc kubenswrapper[4719]: E1009 15:19:38.938220 4719 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 15:20:42.938212098 +0000 UTC m=+148.447923383 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 15:19:38 crc kubenswrapper[4719]: E1009 15:19:38.938243 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 15:20:42.938230959 +0000 UTC m=+148.447942344 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.974125 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.974179 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.974197 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.974221 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:38 crc kubenswrapper[4719]: I1009 15:19:38.974237 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:38Z","lastTransitionTime":"2025-10-09T15:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.039128 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.039219 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:19:39 crc kubenswrapper[4719]: E1009 15:19:39.039479 4719 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 15:19:39 crc kubenswrapper[4719]: E1009 15:19:39.039515 4719 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 15:19:39 crc kubenswrapper[4719]: E1009 15:19:39.039543 4719 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 15:19:39 crc kubenswrapper[4719]: E1009 15:19:39.039480 4719 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 
15:19:39 crc kubenswrapper[4719]: E1009 15:19:39.039641 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-09 15:20:43.0396091 +0000 UTC m=+148.549320425 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 15:19:39 crc kubenswrapper[4719]: E1009 15:19:39.039665 4719 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 15:19:39 crc kubenswrapper[4719]: E1009 15:19:39.039689 4719 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 15:19:39 crc kubenswrapper[4719]: E1009 15:19:39.039762 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-09 15:20:43.039739334 +0000 UTC m=+148.549450659 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.076789 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.076830 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.076841 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.076859 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.076870 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:39Z","lastTransitionTime":"2025-10-09T15:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.161056 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.161068 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.161243 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:19:39 crc kubenswrapper[4719]: E1009 15:19:39.161386 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:19:39 crc kubenswrapper[4719]: E1009 15:19:39.161466 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:19:39 crc kubenswrapper[4719]: E1009 15:19:39.161579 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.180040 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.180091 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.180104 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.180121 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.180135 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:39Z","lastTransitionTime":"2025-10-09T15:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.283003 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.283066 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.283082 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.283106 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.283122 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:39Z","lastTransitionTime":"2025-10-09T15:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.385089 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.385128 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.385140 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.385160 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.385172 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:39Z","lastTransitionTime":"2025-10-09T15:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.487818 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.487860 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.487882 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.487899 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.487911 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:39Z","lastTransitionTime":"2025-10-09T15:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.589999 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.590038 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.590046 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.590062 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.590071 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:39Z","lastTransitionTime":"2025-10-09T15:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.692868 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.692925 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.692960 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.692977 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.692992 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:39Z","lastTransitionTime":"2025-10-09T15:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.795645 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.795714 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.795742 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.795767 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.795783 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:39Z","lastTransitionTime":"2025-10-09T15:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.898205 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.898252 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.898264 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.898279 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:39 crc kubenswrapper[4719]: I1009 15:19:39.898290 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:39Z","lastTransitionTime":"2025-10-09T15:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.000794 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.000855 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.000866 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.000885 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.000897 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:40Z","lastTransitionTime":"2025-10-09T15:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.103672 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.103714 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.103723 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.103737 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.103747 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:40Z","lastTransitionTime":"2025-10-09T15:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.160716 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:19:40 crc kubenswrapper[4719]: E1009 15:19:40.160951 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.205574 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.205626 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.205639 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.205656 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.205666 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:40Z","lastTransitionTime":"2025-10-09T15:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.308702 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.308750 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.308762 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.308779 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.308790 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:40Z","lastTransitionTime":"2025-10-09T15:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.411013 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.411058 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.411066 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.411082 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.411094 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:40Z","lastTransitionTime":"2025-10-09T15:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.512862 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.512908 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.512917 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.512934 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.512943 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:40Z","lastTransitionTime":"2025-10-09T15:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.614382 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.614442 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.614457 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.614479 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.614496 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:40Z","lastTransitionTime":"2025-10-09T15:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.716842 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.716870 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.716879 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.716893 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.716902 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:40Z","lastTransitionTime":"2025-10-09T15:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.819004 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.819063 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.819082 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.819102 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.819118 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:40Z","lastTransitionTime":"2025-10-09T15:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.921970 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.922033 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.922047 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.922065 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:40 crc kubenswrapper[4719]: I1009 15:19:40.922077 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:40Z","lastTransitionTime":"2025-10-09T15:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.024866 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.024911 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.024923 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.024945 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.024957 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:41Z","lastTransitionTime":"2025-10-09T15:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.127270 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.127313 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.127324 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.127339 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.127374 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:41Z","lastTransitionTime":"2025-10-09T15:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.161152 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.161173 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:19:41 crc kubenswrapper[4719]: E1009 15:19:41.161273 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.161152 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:19:41 crc kubenswrapper[4719]: E1009 15:19:41.161535 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:19:41 crc kubenswrapper[4719]: E1009 15:19:41.161892 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.230734 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.230774 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.230786 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.230803 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.230814 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:41Z","lastTransitionTime":"2025-10-09T15:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.333731 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.334166 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.334322 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.334516 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.334693 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:41Z","lastTransitionTime":"2025-10-09T15:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.437284 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.437387 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.437396 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.437411 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.437423 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:41Z","lastTransitionTime":"2025-10-09T15:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.540308 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.540618 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.540717 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.540829 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.540926 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:41Z","lastTransitionTime":"2025-10-09T15:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.642767 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.642807 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.642818 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.642833 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.642845 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:41Z","lastTransitionTime":"2025-10-09T15:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.745173 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.745216 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.745231 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.745248 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.745277 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:41Z","lastTransitionTime":"2025-10-09T15:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.848143 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.848174 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.848184 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.848199 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.848210 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:41Z","lastTransitionTime":"2025-10-09T15:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.951061 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.951101 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.951112 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.951129 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:41 crc kubenswrapper[4719]: I1009 15:19:41.951140 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:41Z","lastTransitionTime":"2025-10-09T15:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.053724 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.053765 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.053775 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.053791 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.053802 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:42Z","lastTransitionTime":"2025-10-09T15:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.156114 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.156141 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.156149 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.156163 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.156171 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:42Z","lastTransitionTime":"2025-10-09T15:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.160623 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:19:42 crc kubenswrapper[4719]: E1009 15:19:42.160713 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.258070 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.258106 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.258115 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.258128 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.258137 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:42Z","lastTransitionTime":"2025-10-09T15:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.360402 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.360436 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.360445 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.360456 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.360466 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:42Z","lastTransitionTime":"2025-10-09T15:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.462973 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.463008 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.463016 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.463029 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.463037 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:42Z","lastTransitionTime":"2025-10-09T15:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.564962 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.564994 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.565003 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.565014 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.565023 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:42Z","lastTransitionTime":"2025-10-09T15:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.615510 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.615546 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.615563 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.615581 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.615593 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:42Z","lastTransitionTime":"2025-10-09T15:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:42 crc kubenswrapper[4719]: E1009 15:19:42.626601 4719 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d273987-9d8a-4a77-9956-ccb64e9e22c3\\\",\\\"systemUUID\\\":\\\"d18dc188-15d4-4547-94df-d9149082a3a0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:42Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.630207 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.630250 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.630259 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.630276 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.630286 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:42Z","lastTransitionTime":"2025-10-09T15:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:42 crc kubenswrapper[4719]: E1009 15:19:42.641216 4719 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d273987-9d8a-4a77-9956-ccb64e9e22c3\\\",\\\"systemUUID\\\":\\\"d18dc188-15d4-4547-94df-d9149082a3a0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:42Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.644390 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.644424 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.644432 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.644446 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.644456 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:42Z","lastTransitionTime":"2025-10-09T15:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:42 crc kubenswrapper[4719]: E1009 15:19:42.655521 4719 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d273987-9d8a-4a77-9956-ccb64e9e22c3\\\",\\\"systemUUID\\\":\\\"d18dc188-15d4-4547-94df-d9149082a3a0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:42Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.658086 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.658114 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.658125 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.658139 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.658150 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:42Z","lastTransitionTime":"2025-10-09T15:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:42 crc kubenswrapper[4719]: E1009 15:19:42.669372 4719 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d273987-9d8a-4a77-9956-ccb64e9e22c3\\\",\\\"systemUUID\\\":\\\"d18dc188-15d4-4547-94df-d9149082a3a0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:42Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.672392 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.672420 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.672431 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.672447 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.672456 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:42Z","lastTransitionTime":"2025-10-09T15:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:42 crc kubenswrapper[4719]: E1009 15:19:42.685873 4719 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d273987-9d8a-4a77-9956-ccb64e9e22c3\\\",\\\"systemUUID\\\":\\\"d18dc188-15d4-4547-94df-d9149082a3a0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:42Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:42 crc kubenswrapper[4719]: E1009 15:19:42.685992 4719 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.687725 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.687808 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.687830 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.687859 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.687880 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:42Z","lastTransitionTime":"2025-10-09T15:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.789887 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.789921 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.789929 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.789942 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.789951 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:42Z","lastTransitionTime":"2025-10-09T15:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.892141 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.892175 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.892185 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.892203 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.892214 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:42Z","lastTransitionTime":"2025-10-09T15:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.994426 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.994477 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.994488 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.994505 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:42 crc kubenswrapper[4719]: I1009 15:19:42.994518 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:42Z","lastTransitionTime":"2025-10-09T15:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.096928 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.096962 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.096971 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.096986 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.096996 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:43Z","lastTransitionTime":"2025-10-09T15:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.160848 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.160913 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.161186 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:19:43 crc kubenswrapper[4719]: E1009 15:19:43.161276 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:19:43 crc kubenswrapper[4719]: E1009 15:19:43.161325 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:19:43 crc kubenswrapper[4719]: E1009 15:19:43.161415 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.161660 4719 scope.go:117] "RemoveContainer" containerID="4859b0f970ed0dea88b96ebd820f8f3806673c1ffff2ad8398b0934dec9535a8" Oct 09 15:19:43 crc kubenswrapper[4719]: E1009 15:19:43.162015 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zv8jk_openshift-ovn-kubernetes(fea6a48c-769c-41bf-95ce-649cc31eb4e5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.199626 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.199704 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.199723 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.199741 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.199752 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:43Z","lastTransitionTime":"2025-10-09T15:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.301836 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.301872 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.301884 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.301898 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.301907 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:43Z","lastTransitionTime":"2025-10-09T15:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.404213 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.404263 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.404274 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.404287 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.404297 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:43Z","lastTransitionTime":"2025-10-09T15:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.507672 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.508049 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.508143 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.508218 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.508273 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:43Z","lastTransitionTime":"2025-10-09T15:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.610228 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.610274 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.610285 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.610308 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.610321 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:43Z","lastTransitionTime":"2025-10-09T15:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.712484 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.712537 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.712549 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.712564 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.712576 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:43Z","lastTransitionTime":"2025-10-09T15:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.814764 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.814798 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.814808 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.814857 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.814868 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:43Z","lastTransitionTime":"2025-10-09T15:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.917036 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.917077 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.917088 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.917104 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:43 crc kubenswrapper[4719]: I1009 15:19:43.917115 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:43Z","lastTransitionTime":"2025-10-09T15:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.019639 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.019728 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.019784 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.019806 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.020389 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:44Z","lastTransitionTime":"2025-10-09T15:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.122811 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.122847 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.122858 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.122874 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.122886 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:44Z","lastTransitionTime":"2025-10-09T15:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.160425 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:19:44 crc kubenswrapper[4719]: E1009 15:19:44.160665 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.225738 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.225986 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.226056 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.226138 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.226204 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:44Z","lastTransitionTime":"2025-10-09T15:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.328198 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.328239 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.328252 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.328268 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.328279 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:44Z","lastTransitionTime":"2025-10-09T15:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.430471 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.430509 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.430517 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.430548 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.430558 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:44Z","lastTransitionTime":"2025-10-09T15:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.536034 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.536071 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.536080 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.536116 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.536128 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:44Z","lastTransitionTime":"2025-10-09T15:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.637938 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.637982 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.637994 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.638008 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.638019 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:44Z","lastTransitionTime":"2025-10-09T15:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.740704 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.740748 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.740759 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.740779 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.740790 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:44Z","lastTransitionTime":"2025-10-09T15:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.842889 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.842943 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.842955 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.842974 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.842988 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:44Z","lastTransitionTime":"2025-10-09T15:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.945455 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.945492 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.945501 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.945516 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:44 crc kubenswrapper[4719]: I1009 15:19:44.945525 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:44Z","lastTransitionTime":"2025-10-09T15:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.048290 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.048385 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.048410 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.048432 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.048446 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:45Z","lastTransitionTime":"2025-10-09T15:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.151163 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.151206 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.151216 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.151231 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.151240 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:45Z","lastTransitionTime":"2025-10-09T15:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.160713 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:19:45 crc kubenswrapper[4719]: E1009 15:19:45.160872 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.160915 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.160965 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:19:45 crc kubenswrapper[4719]: E1009 15:19:45.161507 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:19:45 crc kubenswrapper[4719]: E1009 15:19:45.161633 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.189424 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b95bb6e-7df6-4400-8232-5ca5dab42396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06437f1617d00e0bc262d7c69eeec56bf8f9a4eef7ef19d989b9f88b1d18e8ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a80e9f68129932715d418ad10640f0c8baf9c482e525167e1c38f22b6f2766d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://794ad592484df23016457541f9458f2ca7bc0de2d71557b8118177e9f2dbde76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67517d4944e3c4c0d3250dcaad2bc81fafc78ed7fddb0fa64d2a52482a058e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8
ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2aea0a0b737a01a18248c4d665bb1f643bc97ff2944d643d5a544d84be20209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314
731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3873ee536db5337daed784166bf50ae2eb5e1c29d1628a844904761d8e092284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e09ee90a78cd8e48f562dc1277a3904e908829e0e15199d27fa435f3071731a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb5513dae3b869a29b7707c17409ca35e3e1fd8dc26ed74f8b1cc1e72e7ce641\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-0
9T15:18:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.203004 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.216155 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.227714 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.237795 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5mdb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7db0861-5252-4efa-9464-e64b6d069d8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a62b8142b6b6fd0cf9028590f2abce788d8e381c2303d7a824dd055ab02b94db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lk5rs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5mdb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.247388 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-58bdp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d00237ae-ca20-4202-8e24-e4988fbf5269\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-58bdp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:45 crc 
kubenswrapper[4719]: I1009 15:19:45.252966 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.253002 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.253013 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.253028 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.253039 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:45Z","lastTransitionTime":"2025-10-09T15:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.260406 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40166218-2855-45ef-b0e1-0fed4e3e2fde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01dc78fd80a15fa8151128108a351c6af42928695fdd745dea50e08fae6570ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc62bf1b49b2a4b402b2fcca31f9fe1663b36f463a0722a5876b2ca2a8e023ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51b618cce898bc89b4b07b6f7fd73567d719ad9c9dc3a2a3959074bc2c2fe11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6f2af57f612bf33446a88a0a093adb3b64f562412d9a0bd03f3964c281ba4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b3ab4048ae1f61a71cc1110deec25771f7e1774c0f6d8726c076fa8ea3a1d30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 15:18:34.791014 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 15:18:34.791138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 15:18:34.792247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2043225324/tls.crt::/tmp/serving-cert-2043225324/tls.key\\\\\\\"\\\\nI1009 15:18:35.029901 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 15:18:35.033427 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 15:18:35.033448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 15:18:35.033473 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 15:18:35.033481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 15:18:35.045206 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 15:18:35.045257 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045266 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 15:18:35.045273 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 15:18:35.045277 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 15:18:35.045280 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 15:18:35.045285 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1009 15:18:35.045414 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1009 15:18:35.048459 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8544f7060b0b2c2885dcbdffbd744be5f028d8df543732ba79eb7cd3911afca6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e7b16ee3e154d0329d4ddabb2aa45e6e6ccf838dc9453d8854f1521622537d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.272578 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99353559-5b0b-4a9e-b759-0321ef3a8a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ae19d921bad282d96efffc7f2f7cfdc4b70f95932e69b9955ad1439a936d5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b3908283c24f180df8f6a04d52c46e7252cdfd
4f0587f7cccf3e9a0f37127a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p9kwh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.290005 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d0ca53-1333-4d50-948a-81d97d3182f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcd9a40e12b42902018ecf483e6b42dfa415e4d6e282fc57eacbf507922dbd45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dd9cf59375771629bad78f53d5a27eff04c5931a0d96266f67147dd47cb4eaa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3874e781900a9575dc787e89affa840b1f4a966799d943935e8095739db8521\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54d096c28585859490ffa887cb060e3e25ee42828c5790a31db9113f5481450a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://275fc
3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://275fc3f2dfa338299637c21b6fcbc074e06b8219d0246b3a8cf657d5ec433623\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73b3569d8ecf394227033562445b622d0cc81cf2d37185ca7c3d330e81ab3a32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8ea7d4def69aa7181b4fc151d216c5da1204c8b054827977e709de61ad8adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lszx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sc5bv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.304696 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kmbvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a7f4c67-0335-4c58-896a-b3059d9a9a3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201751e1a01c1fefb61309835c66a89743c507dff1e0d6e75a5ecf3447831840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2025-10-09T15:19:22Z\\\",\\\"message\\\":\\\"2025-10-09T15:18:36+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6cb1650a-7cd1-47ae-8a33-6737992732d4\\\\n2025-10-09T15:18:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6cb1650a-7cd1-47ae-8a33-6737992732d4 to /host/opt/cni/bin/\\\\n2025-10-09T15:18:37Z [verbose] multus-daemon started\\\\n2025-10-09T15:18:37Z [verbose] Readiness Indicator file check\\\\n2025-10-09T15:19:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5w9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kmbvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.316551 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vdgtp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b565dc-6ccc-4404-95f7-c8cf09f91802\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c126340dff33c7a571fc152c4c8ed154e104aaab937ba7f68070763d79825b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6df96c88745808317300d950f2d9916916955
76773b7de02958ec718445cc3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kddxt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vdgtp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.326261 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1789dfc1-aa86-4e27-ae75-b5078112f7fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff1066e8910a7aa889e7cc5c7b2735a240197a60b66c9471b4fda297dba4176f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a73e5700d7dbee6fac767db433a82521b7af9241107369d3be4aa00593128763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a73e5700d7dbee6fac767db433a82521b7af9241107369d3be4aa00593128763\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.338280 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab52b5e80f5f2de90ce76b34b21de83b3880ed13436c566f2c460bed1908576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.350825 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2199f3e31d7adde6f0b1aaf29a7f3da80a45d8a1f11908fd93b47d737b00872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.354686 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.354717 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.354726 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.354739 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.354751 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:45Z","lastTransitionTime":"2025-10-09T15:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.363558 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834f7996-d1ce-470d-a1a5-0de5da2460d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a1b9cee40ae4a30df34bde2f4dd9436cf3ff915293ea1e1431e8abd581423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37949ed51a
379d34fab6bf766fd7e35d376af137b55b6f12e8bef8495ab5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d165d88c0d88fb4b080bf594e5258fb74f33c521332c85bb9f5ef5b5d9fdab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed935aaa4f5122234731f8c22ec3d4ffeba8b500bfb51bf97414f39438da2f68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.380128 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd31818-5445-47d9-af8f-fa49dde2a7ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7137edca40a10e85d3116f62b5dbe6ffea35d9473164173af2dea55f1794397c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd0662699e43951e6e139dbe8bb44c36a0120144c90a7f21010cbf68a2abcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4a7a60b0336fb0e1a046f59f9d60cc55a056b70959c7e6a33b6b15b879bd76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b628c71ffc4577dac4247fca1780e229a260bd382075e7eeb15d7f71fa688c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://b628c71ffc4577dac4247fca1780e229a260bd382075e7eeb15d7f71fa688c27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.393454 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fcba5218f1503f2b3776c66a92350381ee11aee043429d72c70b7ae63d7bb29f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63d557f6902338a7aa577f2bbee6a159369d62be9724425a6e6a355f08586601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.411033 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fea6a48c-769c-41bf-95ce-649cc31eb4e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c0cb44eacc810e970c6b32e259ae1841fb312f20576d34ac183089a91000337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2246a5642d4fa1b9e182af8a19980e6a76aea32cc9669e7d30185d6672435b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59a6c607affaa28a2c8af16a995f53baf008a1efd42061bb5e3c01b5acac636a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1a911f9dd87ad57268bacc90fd4b3821f54d4ad91fcdde7066d3706aa8feb4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80fe00a302db3a637794464b7cccf806ad3fa8efbdaea15f965ea41276188d1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5228008f4bbd33c0b6ea86640368c02b6cdf301b43494a232b37fa73ea72e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4859b0f970ed0dea88b96ebd820f8f3806673c1ffff2ad8398b0934dec9535a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4859b0f970ed0dea88b96ebd820f8f3806673c1ffff2ad8398b0934dec9535a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T15:19:30Z\\\",\\\"message\\\":\\\"766 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1009 15:19:30.885977 6766 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1009 15:19:30.886090 6766 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1009 15:19:30.886123 6766 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1009 15:19:30.886425 6766 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1009 15:19:30.886481 6766 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1009 15:19:30.886521 6766 factory.go:656] Stopping watch factory\\\\nI1009 15:19:30.886562 6766 handler.go:208] Removed *v1.Node event handler 2\\\\nI1009 15:19:30.886590 6766 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1009 15:19:30.896640 6766 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1009 15:19:30.896672 6766 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1009 15:19:30.896720 6766 ovnkube.go:599] Stopped ovnkube\\\\nI1009 15:19:30.896740 6766 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1009 15:19:30.896821 6766 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T15:19:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zv8jk_openshift-ovn-kubernetes(fea6a48c-769c-41bf-95ce-649cc31eb4e5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b32ef1116f7849b70aa3607bb4fc7b4bff9f58843c24742fc94aed9bb9a68e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b7a2de376156624c9
692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84cf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zv8jk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.420454 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mtpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb84e765-e2c6-410b-9681-7c14d88a2537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be972d47f7ee97f2f54daa73198a83327281f9e9b2b1500205a17cf11518989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfpkf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mtpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:45Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.457122 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.457177 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.457186 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.457201 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.457212 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:45Z","lastTransitionTime":"2025-10-09T15:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.558932 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.558971 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.558983 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.558998 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.559009 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:45Z","lastTransitionTime":"2025-10-09T15:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.660398 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.660450 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.660459 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.660472 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.660482 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:45Z","lastTransitionTime":"2025-10-09T15:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.762332 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.762407 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.762419 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.762435 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.762447 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:45Z","lastTransitionTime":"2025-10-09T15:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.864745 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.864804 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.864820 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.864839 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.864851 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:45Z","lastTransitionTime":"2025-10-09T15:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.967089 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.967133 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.967143 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.967167 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:45 crc kubenswrapper[4719]: I1009 15:19:45.967179 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:45Z","lastTransitionTime":"2025-10-09T15:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.069692 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.069795 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.069805 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.069834 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.069844 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:46Z","lastTransitionTime":"2025-10-09T15:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.160808 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:19:46 crc kubenswrapper[4719]: E1009 15:19:46.161189 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.173468 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.173508 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.173517 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.173530 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.173538 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:46Z","lastTransitionTime":"2025-10-09T15:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.276223 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.276265 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.276280 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.276296 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.276307 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:46Z","lastTransitionTime":"2025-10-09T15:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.378511 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.378559 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.378568 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.378583 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.378592 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:46Z","lastTransitionTime":"2025-10-09T15:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.480913 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.480982 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.480996 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.481012 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.481022 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:46Z","lastTransitionTime":"2025-10-09T15:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.583107 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.583141 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.583152 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.583167 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.583177 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:46Z","lastTransitionTime":"2025-10-09T15:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.684797 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.684835 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.684845 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.684859 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.684868 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:46Z","lastTransitionTime":"2025-10-09T15:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.786886 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.786929 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.786940 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.786955 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.786965 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:46Z","lastTransitionTime":"2025-10-09T15:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.888754 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.888812 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.888821 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.888834 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.888842 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:46Z","lastTransitionTime":"2025-10-09T15:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.991281 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.991316 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.991326 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.991343 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:46 crc kubenswrapper[4719]: I1009 15:19:46.991381 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:46Z","lastTransitionTime":"2025-10-09T15:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.094092 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.094124 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.094133 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.094145 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.094153 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:47Z","lastTransitionTime":"2025-10-09T15:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.160818 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.160851 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.160894 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:19:47 crc kubenswrapper[4719]: E1009 15:19:47.160960 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:19:47 crc kubenswrapper[4719]: E1009 15:19:47.161031 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:19:47 crc kubenswrapper[4719]: E1009 15:19:47.161086 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.196238 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.196264 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.196272 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.196283 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.196291 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:47Z","lastTransitionTime":"2025-10-09T15:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.298773 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.298850 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.298864 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.298887 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.298903 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:47Z","lastTransitionTime":"2025-10-09T15:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.401308 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.401366 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.401379 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.401395 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.401406 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:47Z","lastTransitionTime":"2025-10-09T15:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.504569 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.504612 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.504624 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.504638 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.504648 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:47Z","lastTransitionTime":"2025-10-09T15:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.607125 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.607173 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.607182 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.607196 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.607205 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:47Z","lastTransitionTime":"2025-10-09T15:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.710588 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.710637 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.710649 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.710667 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.710680 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:47Z","lastTransitionTime":"2025-10-09T15:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.814175 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.814244 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.814258 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.814287 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.814304 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:47Z","lastTransitionTime":"2025-10-09T15:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.918290 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.918365 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.918377 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.918406 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:47 crc kubenswrapper[4719]: I1009 15:19:47.918454 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:47Z","lastTransitionTime":"2025-10-09T15:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.022418 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.022522 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.022534 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.022556 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.022570 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:48Z","lastTransitionTime":"2025-10-09T15:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.124872 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.125121 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.125132 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.125150 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.125159 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:48Z","lastTransitionTime":"2025-10-09T15:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.160588 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:19:48 crc kubenswrapper[4719]: E1009 15:19:48.160746 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.227737 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.227808 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.227824 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.227851 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.227867 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:48Z","lastTransitionTime":"2025-10-09T15:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.330871 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.330957 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.330975 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.331005 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.331029 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:48Z","lastTransitionTime":"2025-10-09T15:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.433123 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.433198 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.433213 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.433242 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.433258 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:48Z","lastTransitionTime":"2025-10-09T15:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.536001 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.536037 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.536049 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.536064 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.536075 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:48Z","lastTransitionTime":"2025-10-09T15:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.638239 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.638282 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.638291 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.638309 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.638322 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:48Z","lastTransitionTime":"2025-10-09T15:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.741162 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.741205 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.741215 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.741230 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.741240 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:48Z","lastTransitionTime":"2025-10-09T15:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.843380 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.843420 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.843435 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.843454 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.843466 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:48Z","lastTransitionTime":"2025-10-09T15:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.945714 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.945764 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.945780 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.945796 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:48 crc kubenswrapper[4719]: I1009 15:19:48.945807 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:48Z","lastTransitionTime":"2025-10-09T15:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.048172 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.048214 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.048221 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.048234 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.048243 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:49Z","lastTransitionTime":"2025-10-09T15:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.150499 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.150560 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.150576 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.150598 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.150618 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:49Z","lastTransitionTime":"2025-10-09T15:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.160894 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.160984 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:19:49 crc kubenswrapper[4719]: E1009 15:19:49.161020 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:19:49 crc kubenswrapper[4719]: E1009 15:19:49.161119 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.161143 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:19:49 crc kubenswrapper[4719]: E1009 15:19:49.161281 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.252611 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.252646 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.252661 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.252678 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.252690 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:49Z","lastTransitionTime":"2025-10-09T15:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.355805 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.355851 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.355863 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.355888 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.355900 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:49Z","lastTransitionTime":"2025-10-09T15:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.458585 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.458653 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.458664 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.458685 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.458703 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:49Z","lastTransitionTime":"2025-10-09T15:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.569635 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.569680 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.569690 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.569707 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.569717 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:49Z","lastTransitionTime":"2025-10-09T15:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.671713 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.671762 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.671774 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.671790 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.671801 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:49Z","lastTransitionTime":"2025-10-09T15:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.773714 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.773760 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.773775 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.773790 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.773802 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:49Z","lastTransitionTime":"2025-10-09T15:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.877069 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.877145 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.877169 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.877198 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.877219 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:49Z","lastTransitionTime":"2025-10-09T15:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.979846 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.979898 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.979913 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.979935 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:49 crc kubenswrapper[4719]: I1009 15:19:49.979950 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:49Z","lastTransitionTime":"2025-10-09T15:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.082376 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.082411 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.082419 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.082432 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.082441 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:50Z","lastTransitionTime":"2025-10-09T15:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.160446 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:19:50 crc kubenswrapper[4719]: E1009 15:19:50.160593 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.185440 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.185488 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.185499 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.185516 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.185528 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:50Z","lastTransitionTime":"2025-10-09T15:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.288054 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.288097 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.288109 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.288125 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.288137 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:50Z","lastTransitionTime":"2025-10-09T15:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.391025 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.391075 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.391090 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.391111 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.391127 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:50Z","lastTransitionTime":"2025-10-09T15:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.493284 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.493329 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.493337 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.493403 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.493412 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:50Z","lastTransitionTime":"2025-10-09T15:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.596195 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.596230 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.596238 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.596253 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.596262 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:50Z","lastTransitionTime":"2025-10-09T15:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.698320 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.698378 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.698405 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.698418 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.698426 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:50Z","lastTransitionTime":"2025-10-09T15:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.800865 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.800923 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.800939 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.800969 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.800986 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:50Z","lastTransitionTime":"2025-10-09T15:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.903784 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.903834 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.903849 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.903868 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:50 crc kubenswrapper[4719]: I1009 15:19:50.903883 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:50Z","lastTransitionTime":"2025-10-09T15:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.006090 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.006131 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.006141 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.006156 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.006167 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:51Z","lastTransitionTime":"2025-10-09T15:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.108948 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.108998 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.109009 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.109028 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.109040 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:51Z","lastTransitionTime":"2025-10-09T15:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.160675 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.160776 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.160699 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:19:51 crc kubenswrapper[4719]: E1009 15:19:51.160805 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:19:51 crc kubenswrapper[4719]: E1009 15:19:51.160977 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:19:51 crc kubenswrapper[4719]: E1009 15:19:51.161064 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.211790 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.211838 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.211854 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.211875 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.211893 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:51Z","lastTransitionTime":"2025-10-09T15:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.313950 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.314008 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.314020 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.314070 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.314085 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:51Z","lastTransitionTime":"2025-10-09T15:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.416778 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.416821 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.416830 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.416849 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.416858 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:51Z","lastTransitionTime":"2025-10-09T15:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.520711 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.520787 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.520810 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.520838 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.520860 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:51Z","lastTransitionTime":"2025-10-09T15:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.623778 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.623855 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.623878 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.623911 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.623941 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:51Z","lastTransitionTime":"2025-10-09T15:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.726690 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.726747 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.726769 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.726792 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.726807 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:51Z","lastTransitionTime":"2025-10-09T15:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.829655 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.829690 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.829705 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.829722 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.829736 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:51Z","lastTransitionTime":"2025-10-09T15:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.932823 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.932887 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.932903 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.932928 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:51 crc kubenswrapper[4719]: I1009 15:19:51.932948 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:51Z","lastTransitionTime":"2025-10-09T15:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.036050 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.036113 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.036129 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.036151 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.036165 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:52Z","lastTransitionTime":"2025-10-09T15:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.138828 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.138874 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.138886 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.138904 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.138915 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:52Z","lastTransitionTime":"2025-10-09T15:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.160618 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:19:52 crc kubenswrapper[4719]: E1009 15:19:52.160876 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.241804 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.241845 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.241857 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.241874 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.241885 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:52Z","lastTransitionTime":"2025-10-09T15:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.344380 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.344424 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.344436 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.344457 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.344469 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:52Z","lastTransitionTime":"2025-10-09T15:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.447200 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.447244 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.447254 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.447270 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.447282 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:52Z","lastTransitionTime":"2025-10-09T15:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.550077 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.550107 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.550119 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.550136 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.550153 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:52Z","lastTransitionTime":"2025-10-09T15:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.572849 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d00237ae-ca20-4202-8e24-e4988fbf5269-metrics-certs\") pod \"network-metrics-daemon-58bdp\" (UID: \"d00237ae-ca20-4202-8e24-e4988fbf5269\") " pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:19:52 crc kubenswrapper[4719]: E1009 15:19:52.572987 4719 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 15:19:52 crc kubenswrapper[4719]: E1009 15:19:52.573034 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d00237ae-ca20-4202-8e24-e4988fbf5269-metrics-certs podName:d00237ae-ca20-4202-8e24-e4988fbf5269 nodeName:}" failed. No retries permitted until 2025-10-09 15:20:56.573021158 +0000 UTC m=+162.082732443 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d00237ae-ca20-4202-8e24-e4988fbf5269-metrics-certs") pod "network-metrics-daemon-58bdp" (UID: "d00237ae-ca20-4202-8e24-e4988fbf5269") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.652082 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.652142 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.652156 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.652175 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.652187 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:52Z","lastTransitionTime":"2025-10-09T15:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.754563 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.754631 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.754645 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.754663 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.754676 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:52Z","lastTransitionTime":"2025-10-09T15:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.769578 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.769628 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.769638 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.769653 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.769664 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:52Z","lastTransitionTime":"2025-10-09T15:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:52 crc kubenswrapper[4719]: E1009 15:19:52.781408 4719 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d273987-9d8a-4a77-9956-ccb64e9e22c3\\\",\\\"systemUUID\\\":\\\"d18dc188-15d4-4547-94df-d9149082a3a0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:52Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.784791 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.784831 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.784842 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.784857 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.784867 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:52Z","lastTransitionTime":"2025-10-09T15:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:52 crc kubenswrapper[4719]: E1009 15:19:52.795484 4719 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d273987-9d8a-4a77-9956-ccb64e9e22c3\\\",\\\"systemUUID\\\":\\\"d18dc188-15d4-4547-94df-d9149082a3a0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:52Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.798938 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.798974 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.798984 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.798999 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.799010 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:52Z","lastTransitionTime":"2025-10-09T15:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:52 crc kubenswrapper[4719]: E1009 15:19:52.810554 4719 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d273987-9d8a-4a77-9956-ccb64e9e22c3\\\",\\\"systemUUID\\\":\\\"d18dc188-15d4-4547-94df-d9149082a3a0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:52Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.814198 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.814230 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.814238 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.814253 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.814263 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:52Z","lastTransitionTime":"2025-10-09T15:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:52 crc kubenswrapper[4719]: E1009 15:19:52.825828 4719 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d273987-9d8a-4a77-9956-ccb64e9e22c3\\\",\\\"systemUUID\\\":\\\"d18dc188-15d4-4547-94df-d9149082a3a0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:52Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.829009 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.829046 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.829058 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.829072 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.829081 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:52Z","lastTransitionTime":"2025-10-09T15:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:52 crc kubenswrapper[4719]: E1009 15:19:52.838871 4719 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T15:19:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7d273987-9d8a-4a77-9956-ccb64e9e22c3\\\",\\\"systemUUID\\\":\\\"d18dc188-15d4-4547-94df-d9149082a3a0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:52Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:52 crc kubenswrapper[4719]: E1009 15:19:52.838983 4719 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.856280 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.856733 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.856894 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.856962 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.857025 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:52Z","lastTransitionTime":"2025-10-09T15:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.959764 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.959811 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.959822 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.959838 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:52 crc kubenswrapper[4719]: I1009 15:19:52.959850 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:52Z","lastTransitionTime":"2025-10-09T15:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.062323 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.062378 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.062387 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.062400 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.062409 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:53Z","lastTransitionTime":"2025-10-09T15:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.160987 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:19:53 crc kubenswrapper[4719]: E1009 15:19:53.161118 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.161177 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:19:53 crc kubenswrapper[4719]: E1009 15:19:53.161251 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.161441 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:19:53 crc kubenswrapper[4719]: E1009 15:19:53.161610 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.164239 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.164274 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.164287 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.164304 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.164317 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:53Z","lastTransitionTime":"2025-10-09T15:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.266520 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.266553 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.266561 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.266576 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.266587 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:53Z","lastTransitionTime":"2025-10-09T15:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.368520 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.368552 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.368564 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.368580 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.368591 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:53Z","lastTransitionTime":"2025-10-09T15:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.471110 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.471469 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.471598 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.471736 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.471871 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:53Z","lastTransitionTime":"2025-10-09T15:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.574442 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.574475 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.574483 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.574496 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.574505 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:53Z","lastTransitionTime":"2025-10-09T15:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.676807 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.676875 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.676891 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.676918 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.676936 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:53Z","lastTransitionTime":"2025-10-09T15:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.779786 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.779834 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.779845 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.779864 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.779876 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:53Z","lastTransitionTime":"2025-10-09T15:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.881947 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.882246 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.882316 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.882503 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.882600 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:53Z","lastTransitionTime":"2025-10-09T15:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.984974 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.985027 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.985039 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.985055 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:53 crc kubenswrapper[4719]: I1009 15:19:53.985067 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:53Z","lastTransitionTime":"2025-10-09T15:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.087312 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.087421 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.087440 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.087464 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.087480 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:54Z","lastTransitionTime":"2025-10-09T15:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.160916 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:19:54 crc kubenswrapper[4719]: E1009 15:19:54.161119 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.189868 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.189919 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.189931 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.189947 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.189958 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:54Z","lastTransitionTime":"2025-10-09T15:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.294814 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.294851 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.294863 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.294884 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.294898 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:54Z","lastTransitionTime":"2025-10-09T15:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.398835 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.398952 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.398993 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.399022 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.399060 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:54Z","lastTransitionTime":"2025-10-09T15:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.500741 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.500942 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.500964 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.501029 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.501049 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:54Z","lastTransitionTime":"2025-10-09T15:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.603062 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.603096 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.603107 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.603124 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.603137 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:54Z","lastTransitionTime":"2025-10-09T15:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.706769 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.706817 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.706860 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.706882 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.706896 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:54Z","lastTransitionTime":"2025-10-09T15:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.809214 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.809245 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.809257 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.809272 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.809282 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:54Z","lastTransitionTime":"2025-10-09T15:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.911984 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.912035 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.912048 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.912067 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:54 crc kubenswrapper[4719]: I1009 15:19:54.912084 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:54Z","lastTransitionTime":"2025-10-09T15:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.014495 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.014735 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.014812 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.014888 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.014958 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:55Z","lastTransitionTime":"2025-10-09T15:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.117719 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.117785 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.117802 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.117828 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.117847 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:55Z","lastTransitionTime":"2025-10-09T15:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.161139 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.161258 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.161155 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:19:55 crc kubenswrapper[4719]: E1009 15:19:55.161295 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:19:55 crc kubenswrapper[4719]: E1009 15:19:55.161543 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:19:55 crc kubenswrapper[4719]: E1009 15:19:55.161633 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.173751 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mtpbz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb84e765-e2c6-410b-9681-7c14d88a2537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be972d47f7ee97f2f54daa73198a83327281f9e9b2b1500205a17cf11518989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs
.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sfpkf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mtpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:55Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.187151 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"834f7996-d1ce-470d-a1a5-0de5da2460d5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e15a1b9cee40ae4a30df34bde2f4dd9436cf3ff915293ea1e1431e8abd581423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b37949ed51a379d34fab6bf766fd7e35d376af137b55b6f12e8bef8495ab5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d165d88c0d88fb4b080bf594e5258fb74f33c521332c85bb9f5ef5b5d9fdab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed935aaa4f5122234731f8c22ec3d4ffeba8b500bfb51bf97414f39438da2f68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T15:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:55Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.198128 4719 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd31818-5445-47d9-af8f-fa49dde2a7ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T15:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7137edca40a10e85d3116f62b5dbe6ffea35d9473164173af2dea55f1794397c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd0662699e43951e6e139dbe8bb44c36a0120144c90a7f21010cbf68a2abcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db4a7a60b0336fb0e1a046f59f9d60cc55a056b70959c7e6a33b6b15b879bd76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T15:18:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b628c71ffc4577dac4247fca1780e229a260bd382075e7eeb15d7f71fa688c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://b628c71ffc4577dac4247fca1780e229a260bd382075e7eeb15d7f71fa688c27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T15:18:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T15:18:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T15:18:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T15:19:55Z is after 2025-08-24T17:21:41Z" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.222438 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.222496 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.222514 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.222539 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.222555 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:55Z","lastTransitionTime":"2025-10-09T15:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.252403 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-j5mdb" podStartSLOduration=81.252383215 podStartE2EDuration="1m21.252383215s" podCreationTimestamp="2025-10-09 15:18:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:19:55.252308933 +0000 UTC m=+100.762020218" watchObservedRunningTime="2025-10-09 15:19:55.252383215 +0000 UTC m=+100.762094520" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.288676 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=81.288658372 podStartE2EDuration="1m21.288658372s" podCreationTimestamp="2025-10-09 15:18:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:19:55.287470264 +0000 UTC m=+100.797181559" watchObservedRunningTime="2025-10-09 15:19:55.288658372 +0000 UTC m=+100.798369677" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.324617 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.324650 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.324662 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.324679 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.324691 4719 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:55Z","lastTransitionTime":"2025-10-09T15:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.378450 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vdgtp" podStartSLOduration=80.378434501 podStartE2EDuration="1m20.378434501s" podCreationTimestamp="2025-10-09 15:18:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:19:55.35106945 +0000 UTC m=+100.860780765" watchObservedRunningTime="2025-10-09 15:19:55.378434501 +0000 UTC m=+100.888145786" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.378725 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=80.37872153 podStartE2EDuration="1m20.37872153s" podCreationTimestamp="2025-10-09 15:18:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:19:55.377879263 +0000 UTC m=+100.887590558" watchObservedRunningTime="2025-10-09 15:19:55.37872153 +0000 UTC m=+100.888432815" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.420128 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podStartSLOduration=81.420108781 podStartE2EDuration="1m21.420108781s" podCreationTimestamp="2025-10-09 15:18:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-09 15:19:55.394743355 +0000 UTC m=+100.904454640" watchObservedRunningTime="2025-10-09 15:19:55.420108781 +0000 UTC m=+100.929820066" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.426568 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.426817 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.426888 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.426952 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.427017 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:55Z","lastTransitionTime":"2025-10-09T15:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.434183 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-kmbvp" podStartSLOduration=81.434168274 podStartE2EDuration="1m21.434168274s" podCreationTimestamp="2025-10-09 15:18:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:19:55.432604594 +0000 UTC m=+100.942315879" watchObservedRunningTime="2025-10-09 15:19:55.434168274 +0000 UTC m=+100.943879559" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.434323 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-sc5bv" podStartSLOduration=81.434319059 podStartE2EDuration="1m21.434319059s" podCreationTimestamp="2025-10-09 15:18:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:19:55.421171476 +0000 UTC m=+100.930882771" watchObservedRunningTime="2025-10-09 15:19:55.434319059 +0000 UTC m=+100.944030344" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.442506 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=31.442497132 podStartE2EDuration="31.442497132s" podCreationTimestamp="2025-10-09 15:19:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:19:55.442166291 +0000 UTC m=+100.951877576" watchObservedRunningTime="2025-10-09 15:19:55.442497132 +0000 UTC m=+100.952208417" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.529340 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 
15:19:55.529624 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.529690 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.529754 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.529825 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:55Z","lastTransitionTime":"2025-10-09T15:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.632126 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.632420 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.632651 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.632743 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.632813 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:55Z","lastTransitionTime":"2025-10-09T15:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.735304 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.735378 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.735389 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.735405 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.735415 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:55Z","lastTransitionTime":"2025-10-09T15:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.837694 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.837888 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.837969 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.838036 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.838102 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:55Z","lastTransitionTime":"2025-10-09T15:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.940006 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.940041 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.940050 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.940066 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:55 crc kubenswrapper[4719]: I1009 15:19:55.940076 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:55Z","lastTransitionTime":"2025-10-09T15:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.042035 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.042068 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.042079 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.042093 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.042104 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:56Z","lastTransitionTime":"2025-10-09T15:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.144293 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.144333 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.144363 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.144380 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.144392 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:56Z","lastTransitionTime":"2025-10-09T15:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.160951 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:19:56 crc kubenswrapper[4719]: E1009 15:19:56.161056 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.162204 4719 scope.go:117] "RemoveContainer" containerID="4859b0f970ed0dea88b96ebd820f8f3806673c1ffff2ad8398b0934dec9535a8" Oct 09 15:19:56 crc kubenswrapper[4719]: E1009 15:19:56.162462 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zv8jk_openshift-ovn-kubernetes(fea6a48c-769c-41bf-95ce-649cc31eb4e5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.246719 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.247257 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.247404 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.247508 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.247594 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:56Z","lastTransitionTime":"2025-10-09T15:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.349848 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.349910 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.349921 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.349942 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.349956 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:56Z","lastTransitionTime":"2025-10-09T15:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.452330 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.452402 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.452414 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.452437 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.452456 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:56Z","lastTransitionTime":"2025-10-09T15:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.554413 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.554455 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.554469 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.554487 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.554498 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:56Z","lastTransitionTime":"2025-10-09T15:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.656592 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.656632 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.656640 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.656655 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.656664 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:56Z","lastTransitionTime":"2025-10-09T15:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.758767 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.758804 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.758812 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.758826 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.758835 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:56Z","lastTransitionTime":"2025-10-09T15:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.860714 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.860756 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.860769 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.860786 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.860799 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:56Z","lastTransitionTime":"2025-10-09T15:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.963754 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.963814 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.963825 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.963840 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:56 crc kubenswrapper[4719]: I1009 15:19:56.963851 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:56Z","lastTransitionTime":"2025-10-09T15:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.066331 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.066378 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.066389 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.066402 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.066411 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:57Z","lastTransitionTime":"2025-10-09T15:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.160565 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.160607 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.160660 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:19:57 crc kubenswrapper[4719]: E1009 15:19:57.160691 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:19:57 crc kubenswrapper[4719]: E1009 15:19:57.160806 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:19:57 crc kubenswrapper[4719]: E1009 15:19:57.160896 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.167775 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.167808 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.167818 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.167833 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.167845 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:57Z","lastTransitionTime":"2025-10-09T15:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.272905 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.272945 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.272955 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.272974 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.272986 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:57Z","lastTransitionTime":"2025-10-09T15:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.375527 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.375573 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.375590 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.375614 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.375630 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:57Z","lastTransitionTime":"2025-10-09T15:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.478817 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.478890 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.478913 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.478942 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.478964 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:57Z","lastTransitionTime":"2025-10-09T15:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.580628 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.580743 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.580760 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.580776 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.580787 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:57Z","lastTransitionTime":"2025-10-09T15:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.683393 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.683435 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.683451 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.683466 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.683477 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:57Z","lastTransitionTime":"2025-10-09T15:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.786119 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.786162 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.786173 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.786189 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.786201 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:57Z","lastTransitionTime":"2025-10-09T15:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.889889 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.890289 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.890509 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.890767 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.890966 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:57Z","lastTransitionTime":"2025-10-09T15:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.994831 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.994903 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.994928 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.994965 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:57 crc kubenswrapper[4719]: I1009 15:19:57.994988 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:57Z","lastTransitionTime":"2025-10-09T15:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.097919 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.098426 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.098534 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.098640 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.098765 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:58Z","lastTransitionTime":"2025-10-09T15:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.160784 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:19:58 crc kubenswrapper[4719]: E1009 15:19:58.160943 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.202607 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.202684 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.202697 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.202729 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.202744 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:58Z","lastTransitionTime":"2025-10-09T15:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.305239 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.305540 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.305610 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.305679 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.305739 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:58Z","lastTransitionTime":"2025-10-09T15:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.408405 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.408444 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.408454 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.408468 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.408478 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:58Z","lastTransitionTime":"2025-10-09T15:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.511878 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.512183 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.512277 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.512403 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.512500 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:58Z","lastTransitionTime":"2025-10-09T15:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.615255 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.615468 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.615561 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.615666 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.615763 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:58Z","lastTransitionTime":"2025-10-09T15:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.719621 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.719673 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.719690 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.719714 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.719734 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:58Z","lastTransitionTime":"2025-10-09T15:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.822070 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.822120 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.822142 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.822170 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.822192 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:58Z","lastTransitionTime":"2025-10-09T15:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.925087 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.925148 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.925164 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.925188 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:58 crc kubenswrapper[4719]: I1009 15:19:58.925206 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:58Z","lastTransitionTime":"2025-10-09T15:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.028296 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.028421 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.028444 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.028474 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.028501 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:59Z","lastTransitionTime":"2025-10-09T15:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.131586 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.131630 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.131644 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.131662 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.131676 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:59Z","lastTransitionTime":"2025-10-09T15:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.161018 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.161077 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.161028 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:19:59 crc kubenswrapper[4719]: E1009 15:19:59.161218 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:19:59 crc kubenswrapper[4719]: E1009 15:19:59.161318 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:19:59 crc kubenswrapper[4719]: E1009 15:19:59.161513 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.233885 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.233937 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.233952 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.233973 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.233985 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:59Z","lastTransitionTime":"2025-10-09T15:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.337089 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.337130 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.337142 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.337158 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.337170 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:59Z","lastTransitionTime":"2025-10-09T15:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.440270 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.440334 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.440372 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.440399 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.440422 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:59Z","lastTransitionTime":"2025-10-09T15:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.543890 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.543939 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.543950 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.543967 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.543977 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:59Z","lastTransitionTime":"2025-10-09T15:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.647289 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.647386 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.647401 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.647423 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.647436 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:59Z","lastTransitionTime":"2025-10-09T15:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.749894 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.749938 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.749948 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.749964 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.749975 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:59Z","lastTransitionTime":"2025-10-09T15:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.853995 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.854100 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.854132 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.854170 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.854258 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:59Z","lastTransitionTime":"2025-10-09T15:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.958393 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.958441 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.958450 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.958466 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:19:59 crc kubenswrapper[4719]: I1009 15:19:59.958477 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:19:59Z","lastTransitionTime":"2025-10-09T15:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.061485 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.061564 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.061584 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.061615 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.061633 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:20:00Z","lastTransitionTime":"2025-10-09T15:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.160505 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:20:00 crc kubenswrapper[4719]: E1009 15:20:00.160664 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.165647 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.165717 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.165735 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.165757 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.165772 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:20:00Z","lastTransitionTime":"2025-10-09T15:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.268530 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.268575 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.268585 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.268602 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.268614 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:20:00Z","lastTransitionTime":"2025-10-09T15:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.371566 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.371615 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.371630 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.371647 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.371659 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:20:00Z","lastTransitionTime":"2025-10-09T15:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.474503 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.474562 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.474578 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.474600 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.474618 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:20:00Z","lastTransitionTime":"2025-10-09T15:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.577953 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.578031 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.578043 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.578063 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.578078 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:20:00Z","lastTransitionTime":"2025-10-09T15:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.680536 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.680636 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.680652 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.680670 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.680681 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:20:00Z","lastTransitionTime":"2025-10-09T15:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.784550 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.784598 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.784608 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.784626 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.784637 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:20:00Z","lastTransitionTime":"2025-10-09T15:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.887766 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.887812 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.887822 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.887839 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.887849 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:20:00Z","lastTransitionTime":"2025-10-09T15:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.990749 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.990787 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.990795 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.990811 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:20:00 crc kubenswrapper[4719]: I1009 15:20:00.990819 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:20:00Z","lastTransitionTime":"2025-10-09T15:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.092818 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.092896 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.092915 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.092946 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.092966 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:20:01Z","lastTransitionTime":"2025-10-09T15:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.160788 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:20:01 crc kubenswrapper[4719]: E1009 15:20:01.160925 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.161093 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:20:01 crc kubenswrapper[4719]: E1009 15:20:01.161156 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.161265 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:20:01 crc kubenswrapper[4719]: E1009 15:20:01.161588 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.195152 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.195202 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.195222 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.195241 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.195252 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:20:01Z","lastTransitionTime":"2025-10-09T15:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.298582 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.298663 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.298685 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.298718 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.298741 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:20:01Z","lastTransitionTime":"2025-10-09T15:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.402676 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.402760 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.402782 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.402817 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.402840 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:20:01Z","lastTransitionTime":"2025-10-09T15:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.506211 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.506259 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.506272 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.506294 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.506306 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:20:01Z","lastTransitionTime":"2025-10-09T15:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.610463 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.610544 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.610563 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.610589 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.610606 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:20:01Z","lastTransitionTime":"2025-10-09T15:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.714931 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.714993 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.715011 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.715038 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.715052 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:20:01Z","lastTransitionTime":"2025-10-09T15:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.817640 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.817718 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.817736 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.817776 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.817799 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:20:01Z","lastTransitionTime":"2025-10-09T15:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.920835 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.920904 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.920927 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.920959 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:20:01 crc kubenswrapper[4719]: I1009 15:20:01.920982 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:20:01Z","lastTransitionTime":"2025-10-09T15:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.024787 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.024903 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.024915 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.024931 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.024942 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:20:02Z","lastTransitionTime":"2025-10-09T15:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.127998 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.128028 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.128036 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.128051 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.128080 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:20:02Z","lastTransitionTime":"2025-10-09T15:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.161081 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:20:02 crc kubenswrapper[4719]: E1009 15:20:02.161324 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.231842 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.231911 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.231933 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.231968 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.231989 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:20:02Z","lastTransitionTime":"2025-10-09T15:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.335669 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.335737 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.335760 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.335787 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.335811 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:20:02Z","lastTransitionTime":"2025-10-09T15:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.439330 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.439445 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.439464 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.439496 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.439516 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:20:02Z","lastTransitionTime":"2025-10-09T15:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.543213 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.543308 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.543329 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.543392 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.543409 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:20:02Z","lastTransitionTime":"2025-10-09T15:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.648046 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.648123 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.648147 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.648177 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.648196 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:20:02Z","lastTransitionTime":"2025-10-09T15:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.751975 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.752051 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.752070 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.752101 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.752119 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:20:02Z","lastTransitionTime":"2025-10-09T15:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.854697 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.854757 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.854772 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.854796 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.854811 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:20:02Z","lastTransitionTime":"2025-10-09T15:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.888429 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.888483 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.888495 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.888517 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.888532 4719 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T15:20:02Z","lastTransitionTime":"2025-10-09T15:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.954988 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9xd7"] Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.955887 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9xd7" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.958128 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.958827 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.960016 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.960055 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.982397 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-mtpbz" podStartSLOduration=88.982338462 podStartE2EDuration="1m28.982338462s" podCreationTimestamp="2025-10-09 15:18:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:02.98167394 +0000 UTC m=+108.491385245" watchObservedRunningTime="2025-10-09 15:20:02.982338462 +0000 UTC m=+108.492049777" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.987019 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3023e437-38ad-4f77-b8da-37b22de7b1bc-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-b9xd7\" (UID: \"3023e437-38ad-4f77-b8da-37b22de7b1bc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9xd7" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.987076 4719 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3023e437-38ad-4f77-b8da-37b22de7b1bc-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-b9xd7\" (UID: \"3023e437-38ad-4f77-b8da-37b22de7b1bc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9xd7" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.987110 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3023e437-38ad-4f77-b8da-37b22de7b1bc-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-b9xd7\" (UID: \"3023e437-38ad-4f77-b8da-37b22de7b1bc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9xd7" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.987153 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3023e437-38ad-4f77-b8da-37b22de7b1bc-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-b9xd7\" (UID: \"3023e437-38ad-4f77-b8da-37b22de7b1bc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9xd7" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.987214 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3023e437-38ad-4f77-b8da-37b22de7b1bc-service-ca\") pod \"cluster-version-operator-5c965bbfc6-b9xd7\" (UID: \"3023e437-38ad-4f77-b8da-37b22de7b1bc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9xd7" Oct 09 15:20:02 crc kubenswrapper[4719]: I1009 15:20:02.999785 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=83.999755402 podStartE2EDuration="1m23.999755402s" 
podCreationTimestamp="2025-10-09 15:18:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:02.999221685 +0000 UTC m=+108.508932990" watchObservedRunningTime="2025-10-09 15:20:02.999755402 +0000 UTC m=+108.509466687" Oct 09 15:20:03 crc kubenswrapper[4719]: I1009 15:20:03.088747 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3023e437-38ad-4f77-b8da-37b22de7b1bc-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-b9xd7\" (UID: \"3023e437-38ad-4f77-b8da-37b22de7b1bc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9xd7" Oct 09 15:20:03 crc kubenswrapper[4719]: I1009 15:20:03.088861 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3023e437-38ad-4f77-b8da-37b22de7b1bc-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-b9xd7\" (UID: \"3023e437-38ad-4f77-b8da-37b22de7b1bc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9xd7" Oct 09 15:20:03 crc kubenswrapper[4719]: I1009 15:20:03.088894 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3023e437-38ad-4f77-b8da-37b22de7b1bc-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-b9xd7\" (UID: \"3023e437-38ad-4f77-b8da-37b22de7b1bc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9xd7" Oct 09 15:20:03 crc kubenswrapper[4719]: I1009 15:20:03.088944 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3023e437-38ad-4f77-b8da-37b22de7b1bc-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-b9xd7\" (UID: \"3023e437-38ad-4f77-b8da-37b22de7b1bc\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9xd7" Oct 09 15:20:03 crc kubenswrapper[4719]: I1009 15:20:03.089035 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3023e437-38ad-4f77-b8da-37b22de7b1bc-service-ca\") pod \"cluster-version-operator-5c965bbfc6-b9xd7\" (UID: \"3023e437-38ad-4f77-b8da-37b22de7b1bc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9xd7" Oct 09 15:20:03 crc kubenswrapper[4719]: I1009 15:20:03.089492 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3023e437-38ad-4f77-b8da-37b22de7b1bc-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-b9xd7\" (UID: \"3023e437-38ad-4f77-b8da-37b22de7b1bc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9xd7" Oct 09 15:20:03 crc kubenswrapper[4719]: I1009 15:20:03.089635 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3023e437-38ad-4f77-b8da-37b22de7b1bc-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-b9xd7\" (UID: \"3023e437-38ad-4f77-b8da-37b22de7b1bc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9xd7" Oct 09 15:20:03 crc kubenswrapper[4719]: I1009 15:20:03.091310 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3023e437-38ad-4f77-b8da-37b22de7b1bc-service-ca\") pod \"cluster-version-operator-5c965bbfc6-b9xd7\" (UID: \"3023e437-38ad-4f77-b8da-37b22de7b1bc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9xd7" Oct 09 15:20:03 crc kubenswrapper[4719]: I1009 15:20:03.098211 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3023e437-38ad-4f77-b8da-37b22de7b1bc-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-b9xd7\" (UID: \"3023e437-38ad-4f77-b8da-37b22de7b1bc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9xd7" Oct 09 15:20:03 crc kubenswrapper[4719]: I1009 15:20:03.111686 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3023e437-38ad-4f77-b8da-37b22de7b1bc-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-b9xd7\" (UID: \"3023e437-38ad-4f77-b8da-37b22de7b1bc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9xd7" Oct 09 15:20:03 crc kubenswrapper[4719]: I1009 15:20:03.160295 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:20:03 crc kubenswrapper[4719]: I1009 15:20:03.160476 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:20:03 crc kubenswrapper[4719]: I1009 15:20:03.160590 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:20:03 crc kubenswrapper[4719]: E1009 15:20:03.160603 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:20:03 crc kubenswrapper[4719]: E1009 15:20:03.160720 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:20:03 crc kubenswrapper[4719]: E1009 15:20:03.160829 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:20:03 crc kubenswrapper[4719]: I1009 15:20:03.289532 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9xd7" Oct 09 15:20:03 crc kubenswrapper[4719]: W1009 15:20:03.313060 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3023e437_38ad_4f77_b8da_37b22de7b1bc.slice/crio-af88251e4ef4eec76cbc339069df0c1607ca90bb4a32dbe23763d2a59411a739 WatchSource:0}: Error finding container af88251e4ef4eec76cbc339069df0c1607ca90bb4a32dbe23763d2a59411a739: Status 404 returned error can't find the container with id af88251e4ef4eec76cbc339069df0c1607ca90bb4a32dbe23763d2a59411a739 Oct 09 15:20:03 crc kubenswrapper[4719]: I1009 15:20:03.698473 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9xd7" event={"ID":"3023e437-38ad-4f77-b8da-37b22de7b1bc","Type":"ContainerStarted","Data":"230a7c07edd20f6a1fabc419ef101680a2f9cead7c2cb815820866a003caf513"} Oct 09 15:20:03 crc kubenswrapper[4719]: I1009 15:20:03.698567 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9xd7" event={"ID":"3023e437-38ad-4f77-b8da-37b22de7b1bc","Type":"ContainerStarted","Data":"af88251e4ef4eec76cbc339069df0c1607ca90bb4a32dbe23763d2a59411a739"} Oct 09 15:20:03 crc kubenswrapper[4719]: I1009 15:20:03.717806 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=52.717779314 podStartE2EDuration="52.717779314s" podCreationTimestamp="2025-10-09 15:19:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:03.016444019 +0000 UTC m=+108.526155324" watchObservedRunningTime="2025-10-09 15:20:03.717779314 +0000 UTC m=+109.227490619" Oct 09 15:20:03 crc kubenswrapper[4719]: I1009 15:20:03.718378 4719 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b9xd7" podStartSLOduration=89.718370183 podStartE2EDuration="1m29.718370183s" podCreationTimestamp="2025-10-09 15:18:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:03.716453692 +0000 UTC m=+109.226165057" watchObservedRunningTime="2025-10-09 15:20:03.718370183 +0000 UTC m=+109.228081488" Oct 09 15:20:04 crc kubenswrapper[4719]: I1009 15:20:04.160580 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:20:04 crc kubenswrapper[4719]: E1009 15:20:04.160724 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:20:05 crc kubenswrapper[4719]: I1009 15:20:05.161124 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:20:05 crc kubenswrapper[4719]: I1009 15:20:05.161167 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:20:05 crc kubenswrapper[4719]: I1009 15:20:05.162314 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:20:05 crc kubenswrapper[4719]: E1009 15:20:05.162301 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:20:05 crc kubenswrapper[4719]: E1009 15:20:05.162610 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:20:05 crc kubenswrapper[4719]: E1009 15:20:05.162795 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:20:06 crc kubenswrapper[4719]: I1009 15:20:06.160814 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:20:06 crc kubenswrapper[4719]: E1009 15:20:06.161307 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:20:07 crc kubenswrapper[4719]: I1009 15:20:07.160835 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:20:07 crc kubenswrapper[4719]: I1009 15:20:07.160886 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:20:07 crc kubenswrapper[4719]: I1009 15:20:07.160995 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:20:07 crc kubenswrapper[4719]: E1009 15:20:07.161144 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:20:07 crc kubenswrapper[4719]: E1009 15:20:07.161312 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:20:07 crc kubenswrapper[4719]: E1009 15:20:07.161397 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:20:08 crc kubenswrapper[4719]: I1009 15:20:08.160493 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:20:08 crc kubenswrapper[4719]: E1009 15:20:08.161079 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:20:08 crc kubenswrapper[4719]: I1009 15:20:08.715642 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kmbvp_6a7f4c67-0335-4c58-896a-b3059d9a9a3f/kube-multus/1.log" Oct 09 15:20:08 crc kubenswrapper[4719]: I1009 15:20:08.717139 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kmbvp_6a7f4c67-0335-4c58-896a-b3059d9a9a3f/kube-multus/0.log" Oct 09 15:20:08 crc kubenswrapper[4719]: I1009 15:20:08.717331 4719 generic.go:334] "Generic (PLEG): container finished" podID="6a7f4c67-0335-4c58-896a-b3059d9a9a3f" containerID="201751e1a01c1fefb61309835c66a89743c507dff1e0d6e75a5ecf3447831840" exitCode=1 Oct 09 15:20:08 crc kubenswrapper[4719]: I1009 15:20:08.717457 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kmbvp" event={"ID":"6a7f4c67-0335-4c58-896a-b3059d9a9a3f","Type":"ContainerDied","Data":"201751e1a01c1fefb61309835c66a89743c507dff1e0d6e75a5ecf3447831840"} Oct 09 15:20:08 crc kubenswrapper[4719]: I1009 15:20:08.717519 4719 scope.go:117] "RemoveContainer" containerID="11c3e9021193fb879f639a1c65bb6665d3c27b733029dfeb3fe5742b517a7783" Oct 09 15:20:08 crc kubenswrapper[4719]: I1009 15:20:08.718141 4719 scope.go:117] "RemoveContainer" containerID="201751e1a01c1fefb61309835c66a89743c507dff1e0d6e75a5ecf3447831840" Oct 09 15:20:08 crc kubenswrapper[4719]: E1009 15:20:08.718326 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-kmbvp_openshift-multus(6a7f4c67-0335-4c58-896a-b3059d9a9a3f)\"" pod="openshift-multus/multus-kmbvp" podUID="6a7f4c67-0335-4c58-896a-b3059d9a9a3f" Oct 09 15:20:09 crc kubenswrapper[4719]: I1009 15:20:09.160386 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:20:09 crc kubenswrapper[4719]: I1009 15:20:09.160413 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:20:09 crc kubenswrapper[4719]: I1009 15:20:09.160531 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:20:09 crc kubenswrapper[4719]: E1009 15:20:09.160657 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:20:09 crc kubenswrapper[4719]: E1009 15:20:09.160771 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:20:09 crc kubenswrapper[4719]: E1009 15:20:09.160906 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:20:09 crc kubenswrapper[4719]: I1009 15:20:09.720760 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kmbvp_6a7f4c67-0335-4c58-896a-b3059d9a9a3f/kube-multus/1.log" Oct 09 15:20:10 crc kubenswrapper[4719]: I1009 15:20:10.160718 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:20:10 crc kubenswrapper[4719]: E1009 15:20:10.160857 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:20:10 crc kubenswrapper[4719]: I1009 15:20:10.161696 4719 scope.go:117] "RemoveContainer" containerID="4859b0f970ed0dea88b96ebd820f8f3806673c1ffff2ad8398b0934dec9535a8" Oct 09 15:20:10 crc kubenswrapper[4719]: E1009 15:20:10.161859 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zv8jk_openshift-ovn-kubernetes(fea6a48c-769c-41bf-95ce-649cc31eb4e5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" Oct 09 15:20:11 crc kubenswrapper[4719]: I1009 15:20:11.160901 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:20:11 crc kubenswrapper[4719]: I1009 15:20:11.161023 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:20:11 crc kubenswrapper[4719]: E1009 15:20:11.161052 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:20:11 crc kubenswrapper[4719]: I1009 15:20:11.161093 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:20:11 crc kubenswrapper[4719]: E1009 15:20:11.161450 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:20:11 crc kubenswrapper[4719]: E1009 15:20:11.161627 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:20:12 crc kubenswrapper[4719]: I1009 15:20:12.161114 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:20:12 crc kubenswrapper[4719]: E1009 15:20:12.161253 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:20:13 crc kubenswrapper[4719]: I1009 15:20:13.160687 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:20:13 crc kubenswrapper[4719]: E1009 15:20:13.160819 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:20:13 crc kubenswrapper[4719]: I1009 15:20:13.161167 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:20:13 crc kubenswrapper[4719]: E1009 15:20:13.161236 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:20:13 crc kubenswrapper[4719]: I1009 15:20:13.161422 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:20:13 crc kubenswrapper[4719]: E1009 15:20:13.161471 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:20:14 crc kubenswrapper[4719]: I1009 15:20:14.160958 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:20:14 crc kubenswrapper[4719]: E1009 15:20:14.161085 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:20:15 crc kubenswrapper[4719]: I1009 15:20:15.161071 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:20:15 crc kubenswrapper[4719]: I1009 15:20:15.161976 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:20:15 crc kubenswrapper[4719]: E1009 15:20:15.162092 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:20:15 crc kubenswrapper[4719]: I1009 15:20:15.162176 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:20:15 crc kubenswrapper[4719]: E1009 15:20:15.162334 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:20:15 crc kubenswrapper[4719]: E1009 15:20:15.162463 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:20:15 crc kubenswrapper[4719]: E1009 15:20:15.181438 4719 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 09 15:20:15 crc kubenswrapper[4719]: E1009 15:20:15.269758 4719 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 09 15:20:16 crc kubenswrapper[4719]: I1009 15:20:16.160656 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:20:16 crc kubenswrapper[4719]: E1009 15:20:16.160812 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:20:17 crc kubenswrapper[4719]: I1009 15:20:17.160630 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:20:17 crc kubenswrapper[4719]: I1009 15:20:17.160782 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:20:17 crc kubenswrapper[4719]: E1009 15:20:17.160986 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:20:17 crc kubenswrapper[4719]: E1009 15:20:17.160796 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:20:17 crc kubenswrapper[4719]: I1009 15:20:17.161214 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:20:17 crc kubenswrapper[4719]: E1009 15:20:17.161295 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:20:18 crc kubenswrapper[4719]: I1009 15:20:18.160898 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:20:18 crc kubenswrapper[4719]: E1009 15:20:18.161275 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:20:19 crc kubenswrapper[4719]: I1009 15:20:19.160638 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:20:19 crc kubenswrapper[4719]: I1009 15:20:19.160701 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:20:19 crc kubenswrapper[4719]: E1009 15:20:19.160842 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:20:19 crc kubenswrapper[4719]: I1009 15:20:19.160865 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:20:19 crc kubenswrapper[4719]: E1009 15:20:19.160998 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:20:19 crc kubenswrapper[4719]: E1009 15:20:19.161161 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:20:20 crc kubenswrapper[4719]: I1009 15:20:20.160289 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:20:20 crc kubenswrapper[4719]: E1009 15:20:20.160696 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:20:20 crc kubenswrapper[4719]: I1009 15:20:20.161418 4719 scope.go:117] "RemoveContainer" containerID="201751e1a01c1fefb61309835c66a89743c507dff1e0d6e75a5ecf3447831840" Oct 09 15:20:20 crc kubenswrapper[4719]: E1009 15:20:20.270705 4719 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Oct 09 15:20:20 crc kubenswrapper[4719]: I1009 15:20:20.753240 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kmbvp_6a7f4c67-0335-4c58-896a-b3059d9a9a3f/kube-multus/1.log" Oct 09 15:20:20 crc kubenswrapper[4719]: I1009 15:20:20.753302 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kmbvp" event={"ID":"6a7f4c67-0335-4c58-896a-b3059d9a9a3f","Type":"ContainerStarted","Data":"64908969d19b71a3974eeabf4e47002eb2af4a3eeee316c375b203ecfe43212f"} Oct 09 15:20:21 crc kubenswrapper[4719]: I1009 15:20:21.161093 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:20:21 crc kubenswrapper[4719]: E1009 15:20:21.161211 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:20:21 crc kubenswrapper[4719]: I1009 15:20:21.161235 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:20:21 crc kubenswrapper[4719]: E1009 15:20:21.161367 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:20:21 crc kubenswrapper[4719]: I1009 15:20:21.161255 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:20:21 crc kubenswrapper[4719]: E1009 15:20:21.161451 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:20:22 crc kubenswrapper[4719]: I1009 15:20:22.160534 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:20:22 crc kubenswrapper[4719]: E1009 15:20:22.160687 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:20:22 crc kubenswrapper[4719]: I1009 15:20:22.161580 4719 scope.go:117] "RemoveContainer" containerID="4859b0f970ed0dea88b96ebd820f8f3806673c1ffff2ad8398b0934dec9535a8" Oct 09 15:20:22 crc kubenswrapper[4719]: I1009 15:20:22.760481 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zv8jk_fea6a48c-769c-41bf-95ce-649cc31eb4e5/ovnkube-controller/3.log" Oct 09 15:20:22 crc kubenswrapper[4719]: I1009 15:20:22.762937 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" event={"ID":"fea6a48c-769c-41bf-95ce-649cc31eb4e5","Type":"ContainerStarted","Data":"f682329c6f1662ef1c3d1654d5d65f347ebb1061a2e011ba9e36bbd51b862d22"} Oct 09 15:20:22 crc kubenswrapper[4719]: I1009 15:20:22.763293 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:20:22 crc kubenswrapper[4719]: I1009 15:20:22.789716 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" podStartSLOduration=108.789695599 podStartE2EDuration="1m48.789695599s" podCreationTimestamp="2025-10-09 15:18:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:22.788609133 +0000 UTC m=+128.298320438" watchObservedRunningTime="2025-10-09 15:20:22.789695599 +0000 UTC m=+128.299406884" Oct 09 15:20:22 crc kubenswrapper[4719]: I1009 15:20:22.928876 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-58bdp"] Oct 09 15:20:22 crc kubenswrapper[4719]: I1009 15:20:22.928966 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:20:22 crc kubenswrapper[4719]: E1009 15:20:22.929051 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:20:23 crc kubenswrapper[4719]: I1009 15:20:23.160590 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:20:23 crc kubenswrapper[4719]: I1009 15:20:23.160619 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:20:23 crc kubenswrapper[4719]: I1009 15:20:23.160647 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:20:23 crc kubenswrapper[4719]: E1009 15:20:23.160747 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:20:23 crc kubenswrapper[4719]: E1009 15:20:23.160825 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:20:23 crc kubenswrapper[4719]: E1009 15:20:23.160913 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:20:24 crc kubenswrapper[4719]: I1009 15:20:24.160382 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:20:24 crc kubenswrapper[4719]: E1009 15:20:24.160545 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-58bdp" podUID="d00237ae-ca20-4202-8e24-e4988fbf5269" Oct 09 15:20:25 crc kubenswrapper[4719]: I1009 15:20:25.160725 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:20:25 crc kubenswrapper[4719]: I1009 15:20:25.161672 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:20:25 crc kubenswrapper[4719]: I1009 15:20:25.161711 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:20:25 crc kubenswrapper[4719]: E1009 15:20:25.161837 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 15:20:25 crc kubenswrapper[4719]: E1009 15:20:25.161956 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 15:20:25 crc kubenswrapper[4719]: E1009 15:20:25.162022 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 15:20:26 crc kubenswrapper[4719]: I1009 15:20:26.160800 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:20:26 crc kubenswrapper[4719]: I1009 15:20:26.163140 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 09 15:20:26 crc kubenswrapper[4719]: I1009 15:20:26.164818 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 09 15:20:27 crc kubenswrapper[4719]: I1009 15:20:27.161093 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:20:27 crc kubenswrapper[4719]: I1009 15:20:27.162137 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:20:27 crc kubenswrapper[4719]: I1009 15:20:27.162572 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:20:27 crc kubenswrapper[4719]: I1009 15:20:27.168531 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 09 15:20:27 crc kubenswrapper[4719]: I1009 15:20:27.168527 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 09 15:20:27 crc kubenswrapper[4719]: I1009 15:20:27.170836 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 09 15:20:27 crc kubenswrapper[4719]: I1009 15:20:27.170848 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.620631 4719 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 09 15:20:33 crc 
kubenswrapper[4719]: I1009 15:20:33.674515 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-4mm2x"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.675086 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.675805 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-75q5v"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.676326 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-75q5v" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.679435 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lblwv"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.679951 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-q2vmh"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.680458 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-lblwv" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.680541 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q2vmh" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.680473 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.680672 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.680473 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.681045 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.681612 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.681965 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.682162 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-22bgx"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.682529 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-22bgx" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.682999 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.683051 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6ql7w"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.683067 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.683329 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.684525 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.684543 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.684543 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.684747 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.686380 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-j74ct"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.686785 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-p7bmv"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.687230 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-p7bmv" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.687767 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-j74ct" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.688106 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6ql7w" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.691489 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-696rc"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.696942 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.698254 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-696rc" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.704615 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.706713 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.708764 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-9599k"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.709439 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9599k" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.709609 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.710146 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.710250 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.710611 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.710728 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.710746 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.710893 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.711110 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.717215 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.719212 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 09 15:20:33 crc 
kubenswrapper[4719]: I1009 15:20:33.719341 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.720190 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.720409 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.720853 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.720950 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.721164 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.721314 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.721661 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.721387 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.742603 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c2pph"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.743145 4719 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-etcd-operator/etcd-operator-b45778765-k5w5x"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.743466 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-xw899"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.743946 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-xw899" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.743982 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c2pph" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.744256 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-k5w5x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.744379 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.745535 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.745570 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.745690 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.745773 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.745790 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 09 15:20:33 crc 
kubenswrapper[4719]: I1009 15:20:33.745880 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.745837 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.745842 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.745998 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.746042 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.746064 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.746143 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.746148 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.746177 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.746262 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.746277 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 09 15:20:33 crc 
kubenswrapper[4719]: I1009 15:20:33.746394 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.746494 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.746694 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.746710 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.746833 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.746842 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.747777 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-db9tz"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.748252 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.748391 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.752721 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.757716 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.759482 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qk8b7"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.760085 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qk8b7" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.760533 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cqrnr"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.761101 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.761507 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.761879 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-fhfxg"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.762219 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.764283 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-z2lvl"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.764719 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-z2lvl" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.765111 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fhfxg" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.765446 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ncvlk"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.766086 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ncvlk" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.767424 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57b4h"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.767921 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57b4h" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.768226 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dwt2p"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.769088 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dwt2p" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.769335 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x6g6x"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.770092 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x6g6x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.775666 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t7zd8"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.776642 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-hz62m"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.776921 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t7zd8" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.777194 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333715-pgfd2"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.777774 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hz62m" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.777985 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-79ms4"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.778295 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333715-pgfd2" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.778553 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-79ms4" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.778883 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ff97c"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.779506 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ff97c" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.779852 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ffm6c"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.780410 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-ffm6c" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.801319 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2d4s7"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.801378 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.801446 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.802105 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.802609 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.803413 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.803940 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.804192 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.804273 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.804696 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 09 15:20:33 crc 
kubenswrapper[4719]: I1009 15:20:33.804850 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.804916 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.805002 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.805030 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.805047 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.805076 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.805142 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.805158 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.805190 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.805267 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.806479 4719 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.806535 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.809102 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9jzcj"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.809970 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2d4s7" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.810064 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8v6g2"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.810655 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9jzcj" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.811528 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rwhcb"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.812339 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rwhcb" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.832667 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8v6g2" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.835068 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.835305 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.835626 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.835778 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.836075 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.836189 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.836989 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.837687 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.837914 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.838069 4719 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.838135 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.838294 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.838324 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.838379 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.838572 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.839171 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.839263 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.839445 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.841405 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.844678 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-vdfqp"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.845317 4719 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-qd6rx"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.845714 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-vdfqp" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.845859 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-qd6rx" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.846854 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lpbj\" (UniqueName: \"kubernetes.io/projected/c895d97a-7287-49a8-9ac5-bc87e8bcf297-kube-api-access-4lpbj\") pod \"console-f9d7485db-j74ct\" (UID: \"c895d97a-7287-49a8-9ac5-bc87e8bcf297\") " pod="openshift-console/console-f9d7485db-j74ct" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.846878 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f07d2126-7037-4b5c-aa67-4d09bf873e07-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-75q5v\" (UID: \"f07d2126-7037-4b5c-aa67-4d09bf873e07\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-75q5v" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.846898 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/53e8b265-c7b0-4b11-bcb0-225ada8332ce-metrics-tls\") pod \"dns-operator-744455d44c-xw899\" (UID: \"53e8b265-c7b0-4b11-bcb0-225ada8332ce\") " pod="openshift-dns-operator/dns-operator-744455d44c-xw899" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.846915 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/563b4e02-b3d3-4f24-b571-091d77871f9b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-9599k\" (UID: \"563b4e02-b3d3-4f24-b571-091d77871f9b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9599k" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.846918 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.846935 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cf975172-874f-418c-947b-6226e1662647-trusted-ca\") pod \"console-operator-58897d9998-22bgx\" (UID: \"cf975172-874f-418c-947b-6226e1662647\") " pod="openshift-console-operator/console-operator-58897d9998-22bgx" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.847002 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd6md\" (UniqueName: \"kubernetes.io/projected/b28184c2-4cb3-4fe7-9c69-3fcf55a0d0e0-kube-api-access-jd6md\") pod \"cluster-samples-operator-665b6dd947-6ql7w\" (UID: \"b28184c2-4cb3-4fe7-9c69-3fcf55a0d0e0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6ql7w" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.847019 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2aba1eb4-b085-48e7-941b-403c160fb3f4-machine-approver-tls\") pod \"machine-approver-56656f9798-q2vmh\" (UID: \"2aba1eb4-b085-48e7-941b-403c160fb3f4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q2vmh" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.847034 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2nzs\" 
(UniqueName: \"kubernetes.io/projected/f07d2126-7037-4b5c-aa67-4d09bf873e07-kube-api-access-x2nzs\") pod \"machine-api-operator-5694c8668f-75q5v\" (UID: \"f07d2126-7037-4b5c-aa67-4d09bf873e07\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-75q5v" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.847059 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/633935e5-0232-4844-8f77-e87a7d4385cd-audit-policies\") pod \"apiserver-7bbb656c7d-696rc\" (UID: \"633935e5-0232-4844-8f77-e87a7d4385cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-696rc" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.847076 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvpw4\" (UniqueName: \"kubernetes.io/projected/633935e5-0232-4844-8f77-e87a7d4385cd-kube-api-access-dvpw4\") pod \"apiserver-7bbb656c7d-696rc\" (UID: \"633935e5-0232-4844-8f77-e87a7d4385cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-696rc" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.847092 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs8sl\" (UniqueName: \"kubernetes.io/projected/26813cbf-0ed2-460a-b36f-f1a8895e68ec-kube-api-access-hs8sl\") pod \"apiserver-76f77b778f-4mm2x\" (UID: \"26813cbf-0ed2-460a-b36f-f1a8895e68ec\") " pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.847109 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c895d97a-7287-49a8-9ac5-bc87e8bcf297-console-serving-cert\") pod \"console-f9d7485db-j74ct\" (UID: \"c895d97a-7287-49a8-9ac5-bc87e8bcf297\") " pod="openshift-console/console-f9d7485db-j74ct" Oct 09 15:20:33 crc 
kubenswrapper[4719]: I1009 15:20:33.847161 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a643603c-791b-4f68-a320-b6a2bcabf91f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-c2pph\" (UID: \"a643603c-791b-4f68-a320-b6a2bcabf91f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c2pph" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.847187 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c895d97a-7287-49a8-9ac5-bc87e8bcf297-oauth-serving-cert\") pod \"console-f9d7485db-j74ct\" (UID: \"c895d97a-7287-49a8-9ac5-bc87e8bcf297\") " pod="openshift-console/console-f9d7485db-j74ct" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.847251 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc178477-fdbe-4189-80e7-5ceba0100dbd-serving-cert\") pod \"etcd-operator-b45778765-k5w5x\" (UID: \"bc178477-fdbe-4189-80e7-5ceba0100dbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k5w5x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.847270 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/26813cbf-0ed2-460a-b36f-f1a8895e68ec-audit-dir\") pod \"apiserver-76f77b778f-4mm2x\" (UID: \"26813cbf-0ed2-460a-b36f-f1a8895e68ec\") " pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.847302 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a643603c-791b-4f68-a320-b6a2bcabf91f-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-c2pph\" (UID: \"a643603c-791b-4f68-a320-b6a2bcabf91f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c2pph" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.847632 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c895d97a-7287-49a8-9ac5-bc87e8bcf297-trusted-ca-bundle\") pod \"console-f9d7485db-j74ct\" (UID: \"c895d97a-7287-49a8-9ac5-bc87e8bcf297\") " pod="openshift-console/console-f9d7485db-j74ct" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.847675 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/26813cbf-0ed2-460a-b36f-f1a8895e68ec-node-pullsecrets\") pod \"apiserver-76f77b778f-4mm2x\" (UID: \"26813cbf-0ed2-460a-b36f-f1a8895e68ec\") " pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.847738 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/26813cbf-0ed2-460a-b36f-f1a8895e68ec-audit\") pod \"apiserver-76f77b778f-4mm2x\" (UID: \"26813cbf-0ed2-460a-b36f-f1a8895e68ec\") " pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.847765 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35e1fbe6-e210-4329-a27b-3341136d7dcd-serving-cert\") pod \"controller-manager-879f6c89f-lblwv\" (UID: \"35e1fbe6-e210-4329-a27b-3341136d7dcd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lblwv" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.848018 4719 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26813cbf-0ed2-460a-b36f-f1a8895e68ec-trusted-ca-bundle\") pod \"apiserver-76f77b778f-4mm2x\" (UID: \"26813cbf-0ed2-460a-b36f-f1a8895e68ec\") " pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.848079 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4hgc\" (UniqueName: \"kubernetes.io/projected/35e1fbe6-e210-4329-a27b-3341136d7dcd-kube-api-access-x4hgc\") pod \"controller-manager-879f6c89f-lblwv\" (UID: \"35e1fbe6-e210-4329-a27b-3341136d7dcd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lblwv" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.848186 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8sb6\" (UniqueName: \"kubernetes.io/projected/2aba1eb4-b085-48e7-941b-403c160fb3f4-kube-api-access-p8sb6\") pod \"machine-approver-56656f9798-q2vmh\" (UID: \"2aba1eb4-b085-48e7-941b-403c160fb3f4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q2vmh" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.848320 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f07d2126-7037-4b5c-aa67-4d09bf873e07-images\") pod \"machine-api-operator-5694c8668f-75q5v\" (UID: \"f07d2126-7037-4b5c-aa67-4d09bf873e07\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-75q5v" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.848389 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/26813cbf-0ed2-460a-b36f-f1a8895e68ec-etcd-client\") pod \"apiserver-76f77b778f-4mm2x\" (UID: 
\"26813cbf-0ed2-460a-b36f-f1a8895e68ec\") " pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.848413 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2aba1eb4-b085-48e7-941b-403c160fb3f4-config\") pod \"machine-approver-56656f9798-q2vmh\" (UID: \"2aba1eb4-b085-48e7-941b-403c160fb3f4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q2vmh" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.848429 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnf28\" (UniqueName: \"kubernetes.io/projected/cf975172-874f-418c-947b-6226e1662647-kube-api-access-rnf28\") pod \"console-operator-58897d9998-22bgx\" (UID: \"cf975172-874f-418c-947b-6226e1662647\") " pod="openshift-console-operator/console-operator-58897d9998-22bgx" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.848457 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf975172-874f-418c-947b-6226e1662647-config\") pod \"console-operator-58897d9998-22bgx\" (UID: \"cf975172-874f-418c-947b-6226e1662647\") " pod="openshift-console-operator/console-operator-58897d9998-22bgx" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.848483 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c895d97a-7287-49a8-9ac5-bc87e8bcf297-console-oauth-config\") pod \"console-f9d7485db-j74ct\" (UID: \"c895d97a-7287-49a8-9ac5-bc87e8bcf297\") " pod="openshift-console/console-f9d7485db-j74ct" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.848537 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/563b4e02-b3d3-4f24-b571-091d77871f9b-serving-cert\") pod \"openshift-config-operator-7777fb866f-9599k\" (UID: \"563b4e02-b3d3-4f24-b571-091d77871f9b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9599k" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.848553 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/26813cbf-0ed2-460a-b36f-f1a8895e68ec-encryption-config\") pod \"apiserver-76f77b778f-4mm2x\" (UID: \"26813cbf-0ed2-460a-b36f-f1a8895e68ec\") " pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.848630 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/bc178477-fdbe-4189-80e7-5ceba0100dbd-etcd-ca\") pod \"etcd-operator-b45778765-k5w5x\" (UID: \"bc178477-fdbe-4189-80e7-5ceba0100dbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k5w5x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.848664 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.848688 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/35e1fbe6-e210-4329-a27b-3341136d7dcd-client-ca\") pod \"controller-manager-879f6c89f-lblwv\" (UID: \"35e1fbe6-e210-4329-a27b-3341136d7dcd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lblwv" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.848707 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf975172-874f-418c-947b-6226e1662647-serving-cert\") pod 
\"console-operator-58897d9998-22bgx\" (UID: \"cf975172-874f-418c-947b-6226e1662647\") " pod="openshift-console-operator/console-operator-58897d9998-22bgx" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.848723 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/633935e5-0232-4844-8f77-e87a7d4385cd-encryption-config\") pod \"apiserver-7bbb656c7d-696rc\" (UID: \"633935e5-0232-4844-8f77-e87a7d4385cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-696rc" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.848872 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/26813cbf-0ed2-460a-b36f-f1a8895e68ec-etcd-serving-ca\") pod \"apiserver-76f77b778f-4mm2x\" (UID: \"26813cbf-0ed2-460a-b36f-f1a8895e68ec\") " pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.849113 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/35e1fbe6-e210-4329-a27b-3341136d7dcd-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-lblwv\" (UID: \"35e1fbe6-e210-4329-a27b-3341136d7dcd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lblwv" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.849175 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b28184c2-4cb3-4fe7-9c69-3fcf55a0d0e0-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-6ql7w\" (UID: \"b28184c2-4cb3-4fe7-9c69-3fcf55a0d0e0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6ql7w" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.849397 4719 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c895d97a-7287-49a8-9ac5-bc87e8bcf297-console-config\") pod \"console-f9d7485db-j74ct\" (UID: \"c895d97a-7287-49a8-9ac5-bc87e8bcf297\") " pod="openshift-console/console-f9d7485db-j74ct" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.849424 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bc178477-fdbe-4189-80e7-5ceba0100dbd-etcd-client\") pod \"etcd-operator-b45778765-k5w5x\" (UID: \"bc178477-fdbe-4189-80e7-5ceba0100dbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k5w5x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.849449 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zspkw\" (UniqueName: \"kubernetes.io/projected/a643603c-791b-4f68-a320-b6a2bcabf91f-kube-api-access-zspkw\") pod \"openshift-controller-manager-operator-756b6f6bc6-c2pph\" (UID: \"a643603c-791b-4f68-a320-b6a2bcabf91f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c2pph" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.849468 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26813cbf-0ed2-460a-b36f-f1a8895e68ec-config\") pod \"apiserver-76f77b778f-4mm2x\" (UID: \"26813cbf-0ed2-460a-b36f-f1a8895e68ec\") " pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.849483 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/633935e5-0232-4844-8f77-e87a7d4385cd-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-696rc\" (UID: \"633935e5-0232-4844-8f77-e87a7d4385cd\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-696rc" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.849498 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c895d97a-7287-49a8-9ac5-bc87e8bcf297-service-ca\") pod \"console-f9d7485db-j74ct\" (UID: \"c895d97a-7287-49a8-9ac5-bc87e8bcf297\") " pod="openshift-console/console-f9d7485db-j74ct" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.849511 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/26813cbf-0ed2-460a-b36f-f1a8895e68ec-image-import-ca\") pod \"apiserver-76f77b778f-4mm2x\" (UID: \"26813cbf-0ed2-460a-b36f-f1a8895e68ec\") " pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.849562 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/633935e5-0232-4844-8f77-e87a7d4385cd-etcd-client\") pod \"apiserver-7bbb656c7d-696rc\" (UID: \"633935e5-0232-4844-8f77-e87a7d4385cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-696rc" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.849580 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/633935e5-0232-4844-8f77-e87a7d4385cd-audit-dir\") pod \"apiserver-7bbb656c7d-696rc\" (UID: \"633935e5-0232-4844-8f77-e87a7d4385cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-696rc" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.849611 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26813cbf-0ed2-460a-b36f-f1a8895e68ec-serving-cert\") pod \"apiserver-76f77b778f-4mm2x\" 
(UID: \"26813cbf-0ed2-460a-b36f-f1a8895e68ec\") " pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.849626 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f07d2126-7037-4b5c-aa67-4d09bf873e07-config\") pod \"machine-api-operator-5694c8668f-75q5v\" (UID: \"f07d2126-7037-4b5c-aa67-4d09bf873e07\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-75q5v" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.849645 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrtsk\" (UniqueName: \"kubernetes.io/projected/53e8b265-c7b0-4b11-bcb0-225ada8332ce-kube-api-access-qrtsk\") pod \"dns-operator-744455d44c-xw899\" (UID: \"53e8b265-c7b0-4b11-bcb0-225ada8332ce\") " pod="openshift-dns-operator/dns-operator-744455d44c-xw899" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.849675 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2aba1eb4-b085-48e7-941b-403c160fb3f4-auth-proxy-config\") pod \"machine-approver-56656f9798-q2vmh\" (UID: \"2aba1eb4-b085-48e7-941b-403c160fb3f4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q2vmh" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.849691 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np54w\" (UniqueName: \"kubernetes.io/projected/563b4e02-b3d3-4f24-b571-091d77871f9b-kube-api-access-np54w\") pod \"openshift-config-operator-7777fb866f-9599k\" (UID: \"563b4e02-b3d3-4f24-b571-091d77871f9b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9599k" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.849710 4719 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc178477-fdbe-4189-80e7-5ceba0100dbd-config\") pod \"etcd-operator-b45778765-k5w5x\" (UID: \"bc178477-fdbe-4189-80e7-5ceba0100dbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k5w5x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.849725 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/633935e5-0232-4844-8f77-e87a7d4385cd-serving-cert\") pod \"apiserver-7bbb656c7d-696rc\" (UID: \"633935e5-0232-4844-8f77-e87a7d4385cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-696rc" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.849742 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/bc178477-fdbe-4189-80e7-5ceba0100dbd-etcd-service-ca\") pod \"etcd-operator-b45778765-k5w5x\" (UID: \"bc178477-fdbe-4189-80e7-5ceba0100dbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k5w5x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.849759 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/633935e5-0232-4844-8f77-e87a7d4385cd-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-696rc\" (UID: \"633935e5-0232-4844-8f77-e87a7d4385cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-696rc" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.849806 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f2pv\" (UniqueName: \"kubernetes.io/projected/4091b4ed-3afe-4bab-b41e-0bca5b6f58b0-kube-api-access-6f2pv\") pod \"downloads-7954f5f757-p7bmv\" (UID: \"4091b4ed-3afe-4bab-b41e-0bca5b6f58b0\") " 
pod="openshift-console/downloads-7954f5f757-p7bmv" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.849823 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftr8w\" (UniqueName: \"kubernetes.io/projected/bc178477-fdbe-4189-80e7-5ceba0100dbd-kube-api-access-ftr8w\") pod \"etcd-operator-b45778765-k5w5x\" (UID: \"bc178477-fdbe-4189-80e7-5ceba0100dbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k5w5x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.849838 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35e1fbe6-e210-4329-a27b-3341136d7dcd-config\") pod \"controller-manager-879f6c89f-lblwv\" (UID: \"35e1fbe6-e210-4329-a27b-3341136d7dcd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lblwv" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.852379 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-98tjn"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.852902 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.853382 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-98tjn" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.854294 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ppwzx"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.855001 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ppwzx" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.858037 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hf8sg"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.858925 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hf8sg" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.860144 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x7wsc"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.860688 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x7wsc" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.863090 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.864416 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-4mm2x"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.868321 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-p7bmv"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.868697 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-75q5v"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.870484 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-d88hv"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.871321 4719 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c2pph"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.871528 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-d88hv" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.872132 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-22bgx"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.873645 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-j74ct"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.875967 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6ql7w"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.875992 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57b4h"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.877691 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.894526 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-9599k"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.895063 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-k5w5x"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.896449 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-z2lvl"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.897182 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qk8b7"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.909155 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2d4s7"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.910054 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.910231 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ff97c"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.913160 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lblwv"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.917700 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.933728 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t7zd8"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.935097 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-xw899"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.936259 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-n6hr2"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.937201 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-n6hr2" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.937528 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.942808 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-696rc"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.945476 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333715-pgfd2"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.946593 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-db9tz"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.948467 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ncvlk"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.948952 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-98tjn"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.950576 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-fhfxg"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.950961 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/816f9c0b-05db-4dfc-8edb-ba2e0a14d43d-trusted-ca\") pod \"ingress-operator-5b745b69d9-fhfxg\" (UID: \"816f9c0b-05db-4dfc-8edb-ba2e0a14d43d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fhfxg" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.951028 4719 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74cde4fb-7e17-40dc-8537-54b5ecb898d7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8v6g2\" (UID: \"74cde4fb-7e17-40dc-8537-54b5ecb898d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8v6g2" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.951067 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4hgc\" (UniqueName: \"kubernetes.io/projected/35e1fbe6-e210-4329-a27b-3341136d7dcd-kube-api-access-x4hgc\") pod \"controller-manager-879f6c89f-lblwv\" (UID: \"35e1fbe6-e210-4329-a27b-3341136d7dcd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lblwv" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.951120 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8sb6\" (UniqueName: \"kubernetes.io/projected/2aba1eb4-b085-48e7-941b-403c160fb3f4-kube-api-access-p8sb6\") pod \"machine-approver-56656f9798-q2vmh\" (UID: \"2aba1eb4-b085-48e7-941b-403c160fb3f4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q2vmh" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.951148 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b6a94c8-9172-472f-8d40-4c259b21c6b9-config\") pod \"service-ca-operator-777779d784-qd6rx\" (UID: \"7b6a94c8-9172-472f-8d40-4c259b21c6b9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qd6rx" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.951190 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/4b4724c8-6007-4df3-b822-42d08ea33fde-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2d4s7\" (UID: \"4b4724c8-6007-4df3-b822-42d08ea33fde\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2d4s7" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.951215 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/33659bcf-6d50-402b-a0da-7610749b535c-profile-collector-cert\") pod \"catalog-operator-68c6474976-ncvlk\" (UID: \"33659bcf-6d50-402b-a0da-7610749b535c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ncvlk" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.951254 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19fada05-fea3-4a67-a138-1cf460e31a2b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-9jzcj\" (UID: \"19fada05-fea3-4a67-a138-1cf460e31a2b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9jzcj" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.951278 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z26lm\" (UniqueName: \"kubernetes.io/projected/10f4c85b-1896-4882-9ee0-8117ddf6a7a6-kube-api-access-z26lm\") pod \"openshift-apiserver-operator-796bbdcf4f-t7zd8\" (UID: \"10f4c85b-1896-4882-9ee0-8117ddf6a7a6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t7zd8" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.951303 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhjxp\" (UniqueName: \"kubernetes.io/projected/97eeda6e-c63f-4b48-a8bd-05e673d79117-kube-api-access-qhjxp\") pod 
\"machine-config-operator-74547568cd-hz62m\" (UID: \"97eeda6e-c63f-4b48-a8bd-05e673d79117\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hz62m" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.951366 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2aba1eb4-b085-48e7-941b-403c160fb3f4-config\") pod \"machine-approver-56656f9798-q2vmh\" (UID: \"2aba1eb4-b085-48e7-941b-403c160fb3f4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q2vmh" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.951392 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnf28\" (UniqueName: \"kubernetes.io/projected/cf975172-874f-418c-947b-6226e1662647-kube-api-access-rnf28\") pod \"console-operator-58897d9998-22bgx\" (UID: \"cf975172-874f-418c-947b-6226e1662647\") " pod="openshift-console-operator/console-operator-58897d9998-22bgx" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.951416 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf975172-874f-418c-947b-6226e1662647-config\") pod \"console-operator-58897d9998-22bgx\" (UID: \"cf975172-874f-418c-947b-6226e1662647\") " pod="openshift-console-operator/console-operator-58897d9998-22bgx" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.951458 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c895d97a-7287-49a8-9ac5-bc87e8bcf297-console-oauth-config\") pod \"console-f9d7485db-j74ct\" (UID: \"c895d97a-7287-49a8-9ac5-bc87e8bcf297\") " pod="openshift-console/console-f9d7485db-j74ct" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.951481 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/563b4e02-b3d3-4f24-b571-091d77871f9b-serving-cert\") pod \"openshift-config-operator-7777fb866f-9599k\" (UID: \"563b4e02-b3d3-4f24-b571-091d77871f9b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9599k" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.951502 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/633935e5-0232-4844-8f77-e87a7d4385cd-encryption-config\") pod \"apiserver-7bbb656c7d-696rc\" (UID: \"633935e5-0232-4844-8f77-e87a7d4385cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-696rc" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.951541 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/bc178477-fdbe-4189-80e7-5ceba0100dbd-etcd-ca\") pod \"etcd-operator-b45778765-k5w5x\" (UID: \"bc178477-fdbe-4189-80e7-5ceba0100dbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k5w5x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.951566 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/35e1fbe6-e210-4329-a27b-3341136d7dcd-client-ca\") pod \"controller-manager-879f6c89f-lblwv\" (UID: \"35e1fbe6-e210-4329-a27b-3341136d7dcd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lblwv" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.952265 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dwt2p"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.952817 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-ocp-branding-template\") pod 
\"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.952860 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/312693cc-d986-462d-ac50-f4e44dbc8cf1-tmpfs\") pod \"packageserver-d55dfcdfc-79ms4\" (UID: \"312693cc-d986-462d-ac50-f4e44dbc8cf1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-79ms4" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.953124 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/35e1fbe6-e210-4329-a27b-3341136d7dcd-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-lblwv\" (UID: \"35e1fbe6-e210-4329-a27b-3341136d7dcd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lblwv" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.953152 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c06c731d-cb94-49f2-8afb-899c7c6e7724-serving-cert\") pod \"route-controller-manager-6576b87f9c-rwhcb\" (UID: \"c06c731d-cb94-49f2-8afb-899c7c6e7724\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rwhcb" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.953181 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf975172-874f-418c-947b-6226e1662647-config\") pod \"console-operator-58897d9998-22bgx\" (UID: \"cf975172-874f-418c-947b-6226e1662647\") " pod="openshift-console-operator/console-operator-58897d9998-22bgx" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.953195 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.953287 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.953320 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/35e1fbe6-e210-4329-a27b-3341136d7dcd-client-ca\") pod \"controller-manager-879f6c89f-lblwv\" (UID: \"35e1fbe6-e210-4329-a27b-3341136d7dcd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lblwv" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.953335 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c895d97a-7287-49a8-9ac5-bc87e8bcf297-console-config\") pod \"console-f9d7485db-j74ct\" (UID: \"c895d97a-7287-49a8-9ac5-bc87e8bcf297\") " pod="openshift-console/console-f9d7485db-j74ct" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.953390 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/97eeda6e-c63f-4b48-a8bd-05e673d79117-proxy-tls\") pod \"machine-config-operator-74547568cd-hz62m\" (UID: \"97eeda6e-c63f-4b48-a8bd-05e673d79117\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hz62m" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.953421 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bc178477-fdbe-4189-80e7-5ceba0100dbd-etcd-client\") pod \"etcd-operator-b45778765-k5w5x\" (UID: \"bc178477-fdbe-4189-80e7-5ceba0100dbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k5w5x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.953459 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zspkw\" (UniqueName: \"kubernetes.io/projected/a643603c-791b-4f68-a320-b6a2bcabf91f-kube-api-access-zspkw\") pod \"openshift-controller-manager-operator-756b6f6bc6-c2pph\" (UID: \"a643603c-791b-4f68-a320-b6a2bcabf91f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c2pph" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.953485 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/633935e5-0232-4844-8f77-e87a7d4385cd-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-696rc\" (UID: \"633935e5-0232-4844-8f77-e87a7d4385cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-696rc" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.953512 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/26813cbf-0ed2-460a-b36f-f1a8895e68ec-image-import-ca\") pod \"apiserver-76f77b778f-4mm2x\" (UID: \"26813cbf-0ed2-460a-b36f-f1a8895e68ec\") " pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.953535 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/633935e5-0232-4844-8f77-e87a7d4385cd-etcd-client\") pod \"apiserver-7bbb656c7d-696rc\" (UID: \"633935e5-0232-4844-8f77-e87a7d4385cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-696rc" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.953563 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjsj6\" (UniqueName: \"kubernetes.io/projected/816f9c0b-05db-4dfc-8edb-ba2e0a14d43d-kube-api-access-qjsj6\") pod \"ingress-operator-5b745b69d9-fhfxg\" (UID: \"816f9c0b-05db-4dfc-8edb-ba2e0a14d43d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fhfxg" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.953588 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f07d2126-7037-4b5c-aa67-4d09bf873e07-config\") pod \"machine-api-operator-5694c8668f-75q5v\" (UID: \"f07d2126-7037-4b5c-aa67-4d09bf873e07\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-75q5v" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.953614 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f1c3d8c5-6a51-4c89-96e9-f7fb62e5685b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-57b4h\" (UID: \"f1c3d8c5-6a51-4c89-96e9-f7fb62e5685b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57b4h" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.953640 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26813cbf-0ed2-460a-b36f-f1a8895e68ec-serving-cert\") pod \"apiserver-76f77b778f-4mm2x\" (UID: \"26813cbf-0ed2-460a-b36f-f1a8895e68ec\") " pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.953665 4719 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hw7k\" (UniqueName: \"kubernetes.io/projected/f1c3d8c5-6a51-4c89-96e9-f7fb62e5685b-kube-api-access-7hw7k\") pod \"olm-operator-6b444d44fb-57b4h\" (UID: \"f1c3d8c5-6a51-4c89-96e9-f7fb62e5685b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57b4h" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.953690 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8nch\" (UniqueName: \"kubernetes.io/projected/c06c731d-cb94-49f2-8afb-899c7c6e7724-kube-api-access-x8nch\") pod \"route-controller-manager-6576b87f9c-rwhcb\" (UID: \"c06c731d-cb94-49f2-8afb-899c7c6e7724\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rwhcb" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.953718 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.953747 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np54w\" (UniqueName: \"kubernetes.io/projected/563b4e02-b3d3-4f24-b571-091d77871f9b-kube-api-access-np54w\") pod \"openshift-config-operator-7777fb866f-9599k\" (UID: \"563b4e02-b3d3-4f24-b571-091d77871f9b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9599k" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.953773 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/633935e5-0232-4844-8f77-e87a7d4385cd-serving-cert\") pod \"apiserver-7bbb656c7d-696rc\" (UID: \"633935e5-0232-4844-8f77-e87a7d4385cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-696rc" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.953916 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2a62908-86f6-4b7f-9169-cb7a9ef1ece8-service-ca-bundle\") pod \"router-default-5444994796-vdfqp\" (UID: \"d2a62908-86f6-4b7f-9169-cb7a9ef1ece8\") " pod="openshift-ingress/router-default-5444994796-vdfqp" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.953951 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79tkf\" (UniqueName: \"kubernetes.io/projected/312693cc-d986-462d-ac50-f4e44dbc8cf1-kube-api-access-79tkf\") pod \"packageserver-d55dfcdfc-79ms4\" (UID: \"312693cc-d986-462d-ac50-f4e44dbc8cf1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-79ms4" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.953998 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/33659bcf-6d50-402b-a0da-7610749b535c-srv-cert\") pod \"catalog-operator-68c6474976-ncvlk\" (UID: \"33659bcf-6d50-402b-a0da-7610749b535c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ncvlk" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.954028 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftr8w\" (UniqueName: \"kubernetes.io/projected/bc178477-fdbe-4189-80e7-5ceba0100dbd-kube-api-access-ftr8w\") pod \"etcd-operator-b45778765-k5w5x\" (UID: \"bc178477-fdbe-4189-80e7-5ceba0100dbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k5w5x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 
15:20:33.954058 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35e1fbe6-e210-4329-a27b-3341136d7dcd-config\") pod \"controller-manager-879f6c89f-lblwv\" (UID: \"35e1fbe6-e210-4329-a27b-3341136d7dcd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lblwv" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.954094 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.954117 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhjlm\" (UniqueName: \"kubernetes.io/projected/d2a62908-86f6-4b7f-9169-cb7a9ef1ece8-kube-api-access-dhjlm\") pod \"router-default-5444994796-vdfqp\" (UID: \"d2a62908-86f6-4b7f-9169-cb7a9ef1ece8\") " pod="openshift-ingress/router-default-5444994796-vdfqp" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.954145 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lpbj\" (UniqueName: \"kubernetes.io/projected/c895d97a-7287-49a8-9ac5-bc87e8bcf297-kube-api-access-4lpbj\") pod \"console-f9d7485db-j74ct\" (UID: \"c895d97a-7287-49a8-9ac5-bc87e8bcf297\") " pod="openshift-console/console-f9d7485db-j74ct" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.954169 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndb69\" (UniqueName: \"kubernetes.io/projected/7b6a94c8-9172-472f-8d40-4c259b21c6b9-kube-api-access-ndb69\") pod \"service-ca-operator-777779d784-qd6rx\" (UID: 
\"7b6a94c8-9172-472f-8d40-4c259b21c6b9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qd6rx" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.954184 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c895d97a-7287-49a8-9ac5-bc87e8bcf297-console-config\") pod \"console-f9d7485db-j74ct\" (UID: \"c895d97a-7287-49a8-9ac5-bc87e8bcf297\") " pod="openshift-console/console-f9d7485db-j74ct" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.954196 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2nzs\" (UniqueName: \"kubernetes.io/projected/f07d2126-7037-4b5c-aa67-4d09bf873e07-kube-api-access-x2nzs\") pod \"machine-api-operator-5694c8668f-75q5v\" (UID: \"f07d2126-7037-4b5c-aa67-4d09bf873e07\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-75q5v" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.954243 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/19fada05-fea3-4a67-a138-1cf460e31a2b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-9jzcj\" (UID: \"19fada05-fea3-4a67-a138-1cf460e31a2b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9jzcj" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.954272 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c06c731d-cb94-49f2-8afb-899c7c6e7724-client-ca\") pod \"route-controller-manager-6576b87f9c-rwhcb\" (UID: \"c06c731d-cb94-49f2-8afb-899c7c6e7724\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rwhcb" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.954293 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2a4743d0-e646-45ab-a225-816c0d99246a-audit-policies\") pod \"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.954332 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/312693cc-d986-462d-ac50-f4e44dbc8cf1-webhook-cert\") pod \"packageserver-d55dfcdfc-79ms4\" (UID: \"312693cc-d986-462d-ac50-f4e44dbc8cf1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-79ms4" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.954381 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a643603c-791b-4f68-a320-b6a2bcabf91f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-c2pph\" (UID: \"a643603c-791b-4f68-a320-b6a2bcabf91f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c2pph" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.954403 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c895d97a-7287-49a8-9ac5-bc87e8bcf297-oauth-serving-cert\") pod \"console-f9d7485db-j74ct\" (UID: \"c895d97a-7287-49a8-9ac5-bc87e8bcf297\") " pod="openshift-console/console-f9d7485db-j74ct" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.954423 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/26813cbf-0ed2-460a-b36f-f1a8895e68ec-audit-dir\") pod \"apiserver-76f77b778f-4mm2x\" (UID: \"26813cbf-0ed2-460a-b36f-f1a8895e68ec\") " pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.954445 4719 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.954470 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsdfm\" (UniqueName: \"kubernetes.io/projected/2a4743d0-e646-45ab-a225-816c0d99246a-kube-api-access-vsdfm\") pod \"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.954499 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24cng\" (UniqueName: \"kubernetes.io/projected/4b4724c8-6007-4df3-b822-42d08ea33fde-kube-api-access-24cng\") pod \"control-plane-machine-set-operator-78cbb6b69f-2d4s7\" (UID: \"4b4724c8-6007-4df3-b822-42d08ea33fde\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2d4s7" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.954538 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c895d97a-7287-49a8-9ac5-bc87e8bcf297-trusted-ca-bundle\") pod \"console-f9d7485db-j74ct\" (UID: \"c895d97a-7287-49a8-9ac5-bc87e8bcf297\") " pod="openshift-console/console-f9d7485db-j74ct" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.954570 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.954576 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/35e1fbe6-e210-4329-a27b-3341136d7dcd-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-lblwv\" (UID: \"35e1fbe6-e210-4329-a27b-3341136d7dcd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lblwv" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.954592 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35e1fbe6-e210-4329-a27b-3341136d7dcd-serving-cert\") pod \"controller-manager-879f6c89f-lblwv\" (UID: \"35e1fbe6-e210-4329-a27b-3341136d7dcd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lblwv" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.954735 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/633935e5-0232-4844-8f77-e87a7d4385cd-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-696rc\" (UID: \"633935e5-0232-4844-8f77-e87a7d4385cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-696rc" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.954747 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2a4743d0-e646-45ab-a225-816c0d99246a-audit-dir\") pod \"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.954793 4719 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b6a94c8-9172-472f-8d40-4c259b21c6b9-serving-cert\") pod \"service-ca-operator-777779d784-qd6rx\" (UID: \"7b6a94c8-9172-472f-8d40-4c259b21c6b9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qd6rx" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.954856 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f07d2126-7037-4b5c-aa67-4d09bf873e07-config\") pod \"machine-api-operator-5694c8668f-75q5v\" (UID: \"f07d2126-7037-4b5c-aa67-4d09bf873e07\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-75q5v" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.954914 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26813cbf-0ed2-460a-b36f-f1a8895e68ec-trusted-ca-bundle\") pod \"apiserver-76f77b778f-4mm2x\" (UID: \"26813cbf-0ed2-460a-b36f-f1a8895e68ec\") " pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.955001 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2aba1eb4-b085-48e7-941b-403c160fb3f4-config\") pod \"machine-approver-56656f9798-q2vmh\" (UID: \"2aba1eb4-b085-48e7-941b-403c160fb3f4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q2vmh" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.955046 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f07d2126-7037-4b5c-aa67-4d09bf873e07-images\") pod \"machine-api-operator-5694c8668f-75q5v\" (UID: \"f07d2126-7037-4b5c-aa67-4d09bf873e07\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-75q5v" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 
15:20:33.955076 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.955109 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/816f9c0b-05db-4dfc-8edb-ba2e0a14d43d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-fhfxg\" (UID: \"816f9c0b-05db-4dfc-8edb-ba2e0a14d43d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fhfxg" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.955136 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/26813cbf-0ed2-460a-b36f-f1a8895e68ec-etcd-client\") pod \"apiserver-76f77b778f-4mm2x\" (UID: \"26813cbf-0ed2-460a-b36f-f1a8895e68ec\") " pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.955158 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/26813cbf-0ed2-460a-b36f-f1a8895e68ec-encryption-config\") pod \"apiserver-76f77b778f-4mm2x\" (UID: \"26813cbf-0ed2-460a-b36f-f1a8895e68ec\") " pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.955178 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-db9tz\" (UID: 
\"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.955251 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf975172-874f-418c-947b-6226e1662647-serving-cert\") pod \"console-operator-58897d9998-22bgx\" (UID: \"cf975172-874f-418c-947b-6226e1662647\") " pod="openshift-console-operator/console-operator-58897d9998-22bgx" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.955272 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/26813cbf-0ed2-460a-b36f-f1a8895e68ec-etcd-serving-ca\") pod \"apiserver-76f77b778f-4mm2x\" (UID: \"26813cbf-0ed2-460a-b36f-f1a8895e68ec\") " pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.955322 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b28184c2-4cb3-4fe7-9c69-3fcf55a0d0e0-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-6ql7w\" (UID: \"b28184c2-4cb3-4fe7-9c69-3fcf55a0d0e0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6ql7w" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.955345 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f1c3d8c5-6a51-4c89-96e9-f7fb62e5685b-srv-cert\") pod \"olm-operator-6b444d44fb-57b4h\" (UID: \"f1c3d8c5-6a51-4c89-96e9-f7fb62e5685b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57b4h" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.955435 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/d2a62908-86f6-4b7f-9169-cb7a9ef1ece8-stats-auth\") pod \"router-default-5444994796-vdfqp\" (UID: \"d2a62908-86f6-4b7f-9169-cb7a9ef1ece8\") " pod="openshift-ingress/router-default-5444994796-vdfqp" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.955456 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/97eeda6e-c63f-4b48-a8bd-05e673d79117-images\") pod \"machine-config-operator-74547568cd-hz62m\" (UID: \"97eeda6e-c63f-4b48-a8bd-05e673d79117\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hz62m" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.955479 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26813cbf-0ed2-460a-b36f-f1a8895e68ec-config\") pod \"apiserver-76f77b778f-4mm2x\" (UID: \"26813cbf-0ed2-460a-b36f-f1a8895e68ec\") " pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.955501 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10f4c85b-1896-4882-9ee0-8117ddf6a7a6-config\") pod \"openshift-apiserver-operator-796bbdcf4f-t7zd8\" (UID: \"10f4c85b-1896-4882-9ee0-8117ddf6a7a6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t7zd8" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.955522 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c895d97a-7287-49a8-9ac5-bc87e8bcf297-service-ca\") pod \"console-f9d7485db-j74ct\" (UID: \"c895d97a-7287-49a8-9ac5-bc87e8bcf297\") " pod="openshift-console/console-f9d7485db-j74ct" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.955541 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/633935e5-0232-4844-8f77-e87a7d4385cd-audit-dir\") pod \"apiserver-7bbb656c7d-696rc\" (UID: \"633935e5-0232-4844-8f77-e87a7d4385cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-696rc" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.955563 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8fpx\" (UniqueName: \"kubernetes.io/projected/33659bcf-6d50-402b-a0da-7610749b535c-kube-api-access-g8fpx\") pod \"catalog-operator-68c6474976-ncvlk\" (UID: \"33659bcf-6d50-402b-a0da-7610749b535c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ncvlk" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.955627 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a643603c-791b-4f68-a320-b6a2bcabf91f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-c2pph\" (UID: \"a643603c-791b-4f68-a320-b6a2bcabf91f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c2pph" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.955643 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f07d2126-7037-4b5c-aa67-4d09bf873e07-images\") pod \"machine-api-operator-5694c8668f-75q5v\" (UID: \"f07d2126-7037-4b5c-aa67-4d09bf873e07\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-75q5v" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.955720 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-hz62m"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.956292 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/26813cbf-0ed2-460a-b36f-f1a8895e68ec-trusted-ca-bundle\") pod \"apiserver-76f77b778f-4mm2x\" (UID: \"26813cbf-0ed2-460a-b36f-f1a8895e68ec\") " pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.958020 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.958274 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/26813cbf-0ed2-460a-b36f-f1a8895e68ec-audit-dir\") pod \"apiserver-76f77b778f-4mm2x\" (UID: \"26813cbf-0ed2-460a-b36f-f1a8895e68ec\") " pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.959032 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/26813cbf-0ed2-460a-b36f-f1a8895e68ec-etcd-serving-ca\") pod \"apiserver-76f77b778f-4mm2x\" (UID: \"26813cbf-0ed2-460a-b36f-f1a8895e68ec\") " pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.959073 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/633935e5-0232-4844-8f77-e87a7d4385cd-audit-dir\") pod \"apiserver-7bbb656c7d-696rc\" (UID: \"633935e5-0232-4844-8f77-e87a7d4385cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-696rc" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.959172 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/bc178477-fdbe-4189-80e7-5ceba0100dbd-etcd-ca\") pod \"etcd-operator-b45778765-k5w5x\" (UID: \"bc178477-fdbe-4189-80e7-5ceba0100dbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k5w5x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 
15:20:33.959240 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26813cbf-0ed2-460a-b36f-f1a8895e68ec-config\") pod \"apiserver-76f77b778f-4mm2x\" (UID: \"26813cbf-0ed2-460a-b36f-f1a8895e68ec\") " pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.959328 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrtsk\" (UniqueName: \"kubernetes.io/projected/53e8b265-c7b0-4b11-bcb0-225ada8332ce-kube-api-access-qrtsk\") pod \"dns-operator-744455d44c-xw899\" (UID: \"53e8b265-c7b0-4b11-bcb0-225ada8332ce\") " pod="openshift-dns-operator/dns-operator-744455d44c-xw899" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.959493 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74cde4fb-7e17-40dc-8537-54b5ecb898d7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8v6g2\" (UID: \"74cde4fb-7e17-40dc-8537-54b5ecb898d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8v6g2" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.959626 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2aba1eb4-b085-48e7-941b-403c160fb3f4-auth-proxy-config\") pod \"machine-approver-56656f9798-q2vmh\" (UID: \"2aba1eb4-b085-48e7-941b-403c160fb3f4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q2vmh" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.959723 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10f4c85b-1896-4882-9ee0-8117ddf6a7a6-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-t7zd8\" (UID: 
\"10f4c85b-1896-4882-9ee0-8117ddf6a7a6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t7zd8" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.959834 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74cde4fb-7e17-40dc-8537-54b5ecb898d7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8v6g2\" (UID: \"74cde4fb-7e17-40dc-8537-54b5ecb898d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8v6g2" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.959930 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc178477-fdbe-4189-80e7-5ceba0100dbd-config\") pod \"etcd-operator-b45778765-k5w5x\" (UID: \"bc178477-fdbe-4189-80e7-5ceba0100dbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k5w5x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.960026 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/312693cc-d986-462d-ac50-f4e44dbc8cf1-apiservice-cert\") pod \"packageserver-d55dfcdfc-79ms4\" (UID: \"312693cc-d986-462d-ac50-f4e44dbc8cf1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-79ms4" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.960129 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/bc178477-fdbe-4189-80e7-5ceba0100dbd-etcd-service-ca\") pod \"etcd-operator-b45778765-k5w5x\" (UID: \"bc178477-fdbe-4189-80e7-5ceba0100dbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k5w5x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.960219 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/c06c731d-cb94-49f2-8afb-899c7c6e7724-config\") pod \"route-controller-manager-6576b87f9c-rwhcb\" (UID: \"c06c731d-cb94-49f2-8afb-899c7c6e7724\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rwhcb" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.960323 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/633935e5-0232-4844-8f77-e87a7d4385cd-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-696rc\" (UID: \"633935e5-0232-4844-8f77-e87a7d4385cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-696rc" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.960513 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f2pv\" (UniqueName: \"kubernetes.io/projected/4091b4ed-3afe-4bab-b41e-0bca5b6f58b0-kube-api-access-6f2pv\") pod \"downloads-7954f5f757-p7bmv\" (UID: \"4091b4ed-3afe-4bab-b41e-0bca5b6f58b0\") " pod="openshift-console/downloads-7954f5f757-p7bmv" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.960617 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/24507f61-1a02-438c-b1ca-82515867e605-secret-volume\") pod \"collect-profiles-29333715-pgfd2\" (UID: \"24507f61-1a02-438c-b1ca-82515867e605\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333715-pgfd2" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.960715 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.960816 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19fada05-fea3-4a67-a138-1cf460e31a2b-config\") pod \"kube-apiserver-operator-766d6c64bb-9jzcj\" (UID: \"19fada05-fea3-4a67-a138-1cf460e31a2b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9jzcj" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.960913 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d2a62908-86f6-4b7f-9169-cb7a9ef1ece8-default-certificate\") pod \"router-default-5444994796-vdfqp\" (UID: \"d2a62908-86f6-4b7f-9169-cb7a9ef1ece8\") " pod="openshift-ingress/router-default-5444994796-vdfqp" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.961021 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/816f9c0b-05db-4dfc-8edb-ba2e0a14d43d-metrics-tls\") pod \"ingress-operator-5b745b69d9-fhfxg\" (UID: \"816f9c0b-05db-4dfc-8edb-ba2e0a14d43d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fhfxg" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.961116 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2a62908-86f6-4b7f-9169-cb7a9ef1ece8-metrics-certs\") pod \"router-default-5444994796-vdfqp\" (UID: \"d2a62908-86f6-4b7f-9169-cb7a9ef1ece8\") " pod="openshift-ingress/router-default-5444994796-vdfqp" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.961227 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/f07d2126-7037-4b5c-aa67-4d09bf873e07-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-75q5v\" (UID: \"f07d2126-7037-4b5c-aa67-4d09bf873e07\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-75q5v" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.961315 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/53e8b265-c7b0-4b11-bcb0-225ada8332ce-metrics-tls\") pod \"dns-operator-744455d44c-xw899\" (UID: \"53e8b265-c7b0-4b11-bcb0-225ada8332ce\") " pod="openshift-dns-operator/dns-operator-744455d44c-xw899" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.961447 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/563b4e02-b3d3-4f24-b571-091d77871f9b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-9599k\" (UID: \"563b4e02-b3d3-4f24-b571-091d77871f9b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9599k" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.960572 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35e1fbe6-e210-4329-a27b-3341136d7dcd-config\") pod \"controller-manager-879f6c89f-lblwv\" (UID: \"35e1fbe6-e210-4329-a27b-3341136d7dcd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lblwv" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.959859 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9jzcj"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.960676 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc178477-fdbe-4189-80e7-5ceba0100dbd-config\") pod \"etcd-operator-b45778765-k5w5x\" (UID: 
\"bc178477-fdbe-4189-80e7-5ceba0100dbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k5w5x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.961181 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c895d97a-7287-49a8-9ac5-bc87e8bcf297-trusted-ca-bundle\") pod \"console-f9d7485db-j74ct\" (UID: \"c895d97a-7287-49a8-9ac5-bc87e8bcf297\") " pod="openshift-console/console-f9d7485db-j74ct" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.960526 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2aba1eb4-b085-48e7-941b-403c160fb3f4-auth-proxy-config\") pod \"machine-approver-56656f9798-q2vmh\" (UID: \"2aba1eb4-b085-48e7-941b-403c160fb3f4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q2vmh" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.960820 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/633935e5-0232-4844-8f77-e87a7d4385cd-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-696rc\" (UID: \"633935e5-0232-4844-8f77-e87a7d4385cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-696rc" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.959837 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c895d97a-7287-49a8-9ac5-bc87e8bcf297-service-ca\") pod \"console-f9d7485db-j74ct\" (UID: \"c895d97a-7287-49a8-9ac5-bc87e8bcf297\") " pod="openshift-console/console-f9d7485db-j74ct" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.961784 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/563b4e02-b3d3-4f24-b571-091d77871f9b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-9599k\" (UID: 
\"563b4e02-b3d3-4f24-b571-091d77871f9b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9599k" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.961938 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-79ms4"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.961554 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cf975172-874f-418c-947b-6226e1662647-trusted-ca\") pod \"console-operator-58897d9998-22bgx\" (UID: \"cf975172-874f-418c-947b-6226e1662647\") " pod="openshift-console-operator/console-operator-58897d9998-22bgx" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.962182 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd6md\" (UniqueName: \"kubernetes.io/projected/b28184c2-4cb3-4fe7-9c69-3fcf55a0d0e0-kube-api-access-jd6md\") pod \"cluster-samples-operator-665b6dd947-6ql7w\" (UID: \"b28184c2-4cb3-4fe7-9c69-3fcf55a0d0e0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6ql7w" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.962269 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2aba1eb4-b085-48e7-941b-403c160fb3f4-machine-approver-tls\") pod \"machine-approver-56656f9798-q2vmh\" (UID: \"2aba1eb4-b085-48e7-941b-403c160fb3f4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q2vmh" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.962142 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b28184c2-4cb3-4fe7-9c69-3fcf55a0d0e0-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-6ql7w\" (UID: \"b28184c2-4cb3-4fe7-9c69-3fcf55a0d0e0\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6ql7w" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.962388 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.962471 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/633935e5-0232-4844-8f77-e87a7d4385cd-audit-policies\") pod \"apiserver-7bbb656c7d-696rc\" (UID: \"633935e5-0232-4844-8f77-e87a7d4385cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-696rc" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.962497 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvpw4\" (UniqueName: \"kubernetes.io/projected/633935e5-0232-4844-8f77-e87a7d4385cd-kube-api-access-dvpw4\") pod \"apiserver-7bbb656c7d-696rc\" (UID: \"633935e5-0232-4844-8f77-e87a7d4385cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-696rc" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.962522 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c895d97a-7287-49a8-9ac5-bc87e8bcf297-console-serving-cert\") pod \"console-f9d7485db-j74ct\" (UID: \"c895d97a-7287-49a8-9ac5-bc87e8bcf297\") " pod="openshift-console/console-f9d7485db-j74ct" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.962548 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs8sl\" (UniqueName: 
\"kubernetes.io/projected/26813cbf-0ed2-460a-b36f-f1a8895e68ec-kube-api-access-hs8sl\") pod \"apiserver-76f77b778f-4mm2x\" (UID: \"26813cbf-0ed2-460a-b36f-f1a8895e68ec\") " pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.962562 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cf975172-874f-418c-947b-6226e1662647-trusted-ca\") pod \"console-operator-58897d9998-22bgx\" (UID: \"cf975172-874f-418c-947b-6226e1662647\") " pod="openshift-console-operator/console-operator-58897d9998-22bgx" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.962582 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/24507f61-1a02-438c-b1ca-82515867e605-config-volume\") pod \"collect-profiles-29333715-pgfd2\" (UID: \"24507f61-1a02-438c-b1ca-82515867e605\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333715-pgfd2" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.962617 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc178477-fdbe-4189-80e7-5ceba0100dbd-serving-cert\") pod \"etcd-operator-b45778765-k5w5x\" (UID: \"bc178477-fdbe-4189-80e7-5ceba0100dbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k5w5x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.962645 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/97eeda6e-c63f-4b48-a8bd-05e673d79117-auth-proxy-config\") pod \"machine-config-operator-74547568cd-hz62m\" (UID: \"97eeda6e-c63f-4b48-a8bd-05e673d79117\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hz62m" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.962668 4719 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69jd7\" (UniqueName: \"kubernetes.io/projected/24507f61-1a02-438c-b1ca-82515867e605-kube-api-access-69jd7\") pod \"collect-profiles-29333715-pgfd2\" (UID: \"24507f61-1a02-438c-b1ca-82515867e605\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333715-pgfd2" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.962694 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a643603c-791b-4f68-a320-b6a2bcabf91f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-c2pph\" (UID: \"a643603c-791b-4f68-a320-b6a2bcabf91f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c2pph" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.962718 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/26813cbf-0ed2-460a-b36f-f1a8895e68ec-node-pullsecrets\") pod \"apiserver-76f77b778f-4mm2x\" (UID: \"26813cbf-0ed2-460a-b36f-f1a8895e68ec\") " pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.962745 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/26813cbf-0ed2-460a-b36f-f1a8895e68ec-audit\") pod \"apiserver-76f77b778f-4mm2x\" (UID: \"26813cbf-0ed2-460a-b36f-f1a8895e68ec\") " pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.963025 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/26813cbf-0ed2-460a-b36f-f1a8895e68ec-node-pullsecrets\") pod \"apiserver-76f77b778f-4mm2x\" (UID: \"26813cbf-0ed2-460a-b36f-f1a8895e68ec\") " 
pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.963133 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/633935e5-0232-4844-8f77-e87a7d4385cd-audit-policies\") pod \"apiserver-7bbb656c7d-696rc\" (UID: \"633935e5-0232-4844-8f77-e87a7d4385cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-696rc" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.963303 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/26813cbf-0ed2-460a-b36f-f1a8895e68ec-audit\") pod \"apiserver-76f77b778f-4mm2x\" (UID: \"26813cbf-0ed2-460a-b36f-f1a8895e68ec\") " pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.963429 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hf8sg"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.963968 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/563b4e02-b3d3-4f24-b571-091d77871f9b-serving-cert\") pod \"openshift-config-operator-7777fb866f-9599k\" (UID: \"563b4e02-b3d3-4f24-b571-091d77871f9b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9599k" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.964013 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/633935e5-0232-4844-8f77-e87a7d4385cd-serving-cert\") pod \"apiserver-7bbb656c7d-696rc\" (UID: \"633935e5-0232-4844-8f77-e87a7d4385cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-696rc" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.964185 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/633935e5-0232-4844-8f77-e87a7d4385cd-encryption-config\") pod \"apiserver-7bbb656c7d-696rc\" (UID: \"633935e5-0232-4844-8f77-e87a7d4385cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-696rc" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.964249 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35e1fbe6-e210-4329-a27b-3341136d7dcd-serving-cert\") pod \"controller-manager-879f6c89f-lblwv\" (UID: \"35e1fbe6-e210-4329-a27b-3341136d7dcd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lblwv" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.964670 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c895d97a-7287-49a8-9ac5-bc87e8bcf297-console-oauth-config\") pod \"console-f9d7485db-j74ct\" (UID: \"c895d97a-7287-49a8-9ac5-bc87e8bcf297\") " pod="openshift-console/console-f9d7485db-j74ct" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.964754 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/53e8b265-c7b0-4b11-bcb0-225ada8332ce-metrics-tls\") pod \"dns-operator-744455d44c-xw899\" (UID: \"53e8b265-c7b0-4b11-bcb0-225ada8332ce\") " pod="openshift-dns-operator/dns-operator-744455d44c-xw899" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.964866 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf975172-874f-418c-947b-6226e1662647-serving-cert\") pod \"console-operator-58897d9998-22bgx\" (UID: \"cf975172-874f-418c-947b-6226e1662647\") " pod="openshift-console-operator/console-operator-58897d9998-22bgx" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.965098 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c895d97a-7287-49a8-9ac5-bc87e8bcf297-console-serving-cert\") pod \"console-f9d7485db-j74ct\" (UID: \"c895d97a-7287-49a8-9ac5-bc87e8bcf297\") " pod="openshift-console/console-f9d7485db-j74ct" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.965261 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/26813cbf-0ed2-460a-b36f-f1a8895e68ec-encryption-config\") pod \"apiserver-76f77b778f-4mm2x\" (UID: \"26813cbf-0ed2-460a-b36f-f1a8895e68ec\") " pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.965269 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/26813cbf-0ed2-460a-b36f-f1a8895e68ec-etcd-client\") pod \"apiserver-76f77b778f-4mm2x\" (UID: \"26813cbf-0ed2-460a-b36f-f1a8895e68ec\") " pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.965434 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26813cbf-0ed2-460a-b36f-f1a8895e68ec-serving-cert\") pod \"apiserver-76f77b778f-4mm2x\" (UID: \"26813cbf-0ed2-460a-b36f-f1a8895e68ec\") " pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.965577 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/633935e5-0232-4844-8f77-e87a7d4385cd-etcd-client\") pod \"apiserver-7bbb656c7d-696rc\" (UID: \"633935e5-0232-4844-8f77-e87a7d4385cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-696rc" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.965747 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/bc178477-fdbe-4189-80e7-5ceba0100dbd-etcd-service-ca\") 
pod \"etcd-operator-b45778765-k5w5x\" (UID: \"bc178477-fdbe-4189-80e7-5ceba0100dbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k5w5x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.966292 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bc178477-fdbe-4189-80e7-5ceba0100dbd-etcd-client\") pod \"etcd-operator-b45778765-k5w5x\" (UID: \"bc178477-fdbe-4189-80e7-5ceba0100dbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k5w5x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.966460 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2aba1eb4-b085-48e7-941b-403c160fb3f4-machine-approver-tls\") pod \"machine-approver-56656f9798-q2vmh\" (UID: \"2aba1eb4-b085-48e7-941b-403c160fb3f4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q2vmh" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.966773 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/26813cbf-0ed2-460a-b36f-f1a8895e68ec-image-import-ca\") pod \"apiserver-76f77b778f-4mm2x\" (UID: \"26813cbf-0ed2-460a-b36f-f1a8895e68ec\") " pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.966857 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rwhcb"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.967105 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f07d2126-7037-4b5c-aa67-4d09bf873e07-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-75q5v\" (UID: \"f07d2126-7037-4b5c-aa67-4d09bf873e07\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-75q5v" Oct 09 15:20:33 crc 
kubenswrapper[4719]: I1009 15:20:33.967451 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc178477-fdbe-4189-80e7-5ceba0100dbd-serving-cert\") pod \"etcd-operator-b45778765-k5w5x\" (UID: \"bc178477-fdbe-4189-80e7-5ceba0100dbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k5w5x" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.967462 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c895d97a-7287-49a8-9ac5-bc87e8bcf297-oauth-serving-cert\") pod \"console-f9d7485db-j74ct\" (UID: \"c895d97a-7287-49a8-9ac5-bc87e8bcf297\") " pod="openshift-console/console-f9d7485db-j74ct" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.967856 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-qd6rx"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.969128 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a643603c-791b-4f68-a320-b6a2bcabf91f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-c2pph\" (UID: \"a643603c-791b-4f68-a320-b6a2bcabf91f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c2pph" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.969177 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cqrnr"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.970219 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ppwzx"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.971236 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8v6g2"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.972181 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x6g6x"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.973449 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x7wsc"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.974457 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ffm6c"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.975659 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-762nw"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.976474 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-x8kq2"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.976651 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-762nw" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.977569 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-d88hv"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.977628 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-x8kq2" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.986118 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.987837 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-762nw"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.987891 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-x8kq2"] Oct 09 15:20:33 crc kubenswrapper[4719]: I1009 15:20:33.998903 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.017456 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.036637 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.057205 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.063663 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/312693cc-d986-462d-ac50-f4e44dbc8cf1-tmpfs\") pod \"packageserver-d55dfcdfc-79ms4\" (UID: \"312693cc-d986-462d-ac50-f4e44dbc8cf1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-79ms4" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.063692 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.063725 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c06c731d-cb94-49f2-8afb-899c7c6e7724-serving-cert\") pod \"route-controller-manager-6576b87f9c-rwhcb\" (UID: \"c06c731d-cb94-49f2-8afb-899c7c6e7724\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rwhcb" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.063741 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.063760 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.063776 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/97eeda6e-c63f-4b48-a8bd-05e673d79117-proxy-tls\") pod \"machine-config-operator-74547568cd-hz62m\" (UID: \"97eeda6e-c63f-4b48-a8bd-05e673d79117\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hz62m" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.063837 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjsj6\" (UniqueName: \"kubernetes.io/projected/816f9c0b-05db-4dfc-8edb-ba2e0a14d43d-kube-api-access-qjsj6\") pod \"ingress-operator-5b745b69d9-fhfxg\" (UID: \"816f9c0b-05db-4dfc-8edb-ba2e0a14d43d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fhfxg" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.063853 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f1c3d8c5-6a51-4c89-96e9-f7fb62e5685b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-57b4h\" (UID: \"f1c3d8c5-6a51-4c89-96e9-f7fb62e5685b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57b4h" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.063869 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hw7k\" (UniqueName: \"kubernetes.io/projected/f1c3d8c5-6a51-4c89-96e9-f7fb62e5685b-kube-api-access-7hw7k\") pod \"olm-operator-6b444d44fb-57b4h\" (UID: \"f1c3d8c5-6a51-4c89-96e9-f7fb62e5685b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57b4h" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.063886 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8nch\" (UniqueName: \"kubernetes.io/projected/c06c731d-cb94-49f2-8afb-899c7c6e7724-kube-api-access-x8nch\") pod \"route-controller-manager-6576b87f9c-rwhcb\" (UID: \"c06c731d-cb94-49f2-8afb-899c7c6e7724\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rwhcb" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.063904 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.063926 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2a62908-86f6-4b7f-9169-cb7a9ef1ece8-service-ca-bundle\") pod \"router-default-5444994796-vdfqp\" (UID: \"d2a62908-86f6-4b7f-9169-cb7a9ef1ece8\") " pod="openshift-ingress/router-default-5444994796-vdfqp" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.063942 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79tkf\" (UniqueName: \"kubernetes.io/projected/312693cc-d986-462d-ac50-f4e44dbc8cf1-kube-api-access-79tkf\") pod \"packageserver-d55dfcdfc-79ms4\" (UID: \"312693cc-d986-462d-ac50-f4e44dbc8cf1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-79ms4" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.063960 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/33659bcf-6d50-402b-a0da-7610749b535c-srv-cert\") pod \"catalog-operator-68c6474976-ncvlk\" (UID: \"33659bcf-6d50-402b-a0da-7610749b535c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ncvlk" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.063990 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:34 
crc kubenswrapper[4719]: I1009 15:20:34.064006 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhjlm\" (UniqueName: \"kubernetes.io/projected/d2a62908-86f6-4b7f-9169-cb7a9ef1ece8-kube-api-access-dhjlm\") pod \"router-default-5444994796-vdfqp\" (UID: \"d2a62908-86f6-4b7f-9169-cb7a9ef1ece8\") " pod="openshift-ingress/router-default-5444994796-vdfqp" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.064030 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndb69\" (UniqueName: \"kubernetes.io/projected/7b6a94c8-9172-472f-8d40-4c259b21c6b9-kube-api-access-ndb69\") pod \"service-ca-operator-777779d784-qd6rx\" (UID: \"7b6a94c8-9172-472f-8d40-4c259b21c6b9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qd6rx" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.064055 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/19fada05-fea3-4a67-a138-1cf460e31a2b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-9jzcj\" (UID: \"19fada05-fea3-4a67-a138-1cf460e31a2b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9jzcj" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.064070 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c06c731d-cb94-49f2-8afb-899c7c6e7724-client-ca\") pod \"route-controller-manager-6576b87f9c-rwhcb\" (UID: \"c06c731d-cb94-49f2-8afb-899c7c6e7724\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rwhcb" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.064094 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/312693cc-d986-462d-ac50-f4e44dbc8cf1-webhook-cert\") pod \"packageserver-d55dfcdfc-79ms4\" 
(UID: \"312693cc-d986-462d-ac50-f4e44dbc8cf1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-79ms4" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.064110 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2a4743d0-e646-45ab-a225-816c0d99246a-audit-policies\") pod \"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.064125 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/312693cc-d986-462d-ac50-f4e44dbc8cf1-tmpfs\") pod \"packageserver-d55dfcdfc-79ms4\" (UID: \"312693cc-d986-462d-ac50-f4e44dbc8cf1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-79ms4" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.064148 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.064177 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsdfm\" (UniqueName: \"kubernetes.io/projected/2a4743d0-e646-45ab-a225-816c0d99246a-kube-api-access-vsdfm\") pod \"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.064198 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24cng\" (UniqueName: 
\"kubernetes.io/projected/4b4724c8-6007-4df3-b822-42d08ea33fde-kube-api-access-24cng\") pod \"control-plane-machine-set-operator-78cbb6b69f-2d4s7\" (UID: \"4b4724c8-6007-4df3-b822-42d08ea33fde\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2d4s7" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.064218 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.064242 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2a4743d0-e646-45ab-a225-816c0d99246a-audit-dir\") pod \"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.064260 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b6a94c8-9172-472f-8d40-4c259b21c6b9-serving-cert\") pod \"service-ca-operator-777779d784-qd6rx\" (UID: \"7b6a94c8-9172-472f-8d40-4c259b21c6b9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qd6rx" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.064282 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:34 crc 
kubenswrapper[4719]: I1009 15:20:34.064301 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/816f9c0b-05db-4dfc-8edb-ba2e0a14d43d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-fhfxg\" (UID: \"816f9c0b-05db-4dfc-8edb-ba2e0a14d43d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fhfxg" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.064324 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.064366 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f1c3d8c5-6a51-4c89-96e9-f7fb62e5685b-srv-cert\") pod \"olm-operator-6b444d44fb-57b4h\" (UID: \"f1c3d8c5-6a51-4c89-96e9-f7fb62e5685b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57b4h" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.064723 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.064837 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d2a62908-86f6-4b7f-9169-cb7a9ef1ece8-stats-auth\") pod \"router-default-5444994796-vdfqp\" (UID: 
\"d2a62908-86f6-4b7f-9169-cb7a9ef1ece8\") " pod="openshift-ingress/router-default-5444994796-vdfqp" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.064846 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2a4743d0-e646-45ab-a225-816c0d99246a-audit-dir\") pod \"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.064873 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/97eeda6e-c63f-4b48-a8bd-05e673d79117-images\") pod \"machine-config-operator-74547568cd-hz62m\" (UID: \"97eeda6e-c63f-4b48-a8bd-05e673d79117\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hz62m" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.064897 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10f4c85b-1896-4882-9ee0-8117ddf6a7a6-config\") pod \"openshift-apiserver-operator-796bbdcf4f-t7zd8\" (UID: \"10f4c85b-1896-4882-9ee0-8117ddf6a7a6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t7zd8" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.064900 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2a4743d0-e646-45ab-a225-816c0d99246a-audit-policies\") pod \"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.064958 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8fpx\" (UniqueName: 
\"kubernetes.io/projected/33659bcf-6d50-402b-a0da-7610749b535c-kube-api-access-g8fpx\") pod \"catalog-operator-68c6474976-ncvlk\" (UID: \"33659bcf-6d50-402b-a0da-7610749b535c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ncvlk" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.064992 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74cde4fb-7e17-40dc-8537-54b5ecb898d7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8v6g2\" (UID: \"74cde4fb-7e17-40dc-8537-54b5ecb898d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8v6g2" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.065013 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10f4c85b-1896-4882-9ee0-8117ddf6a7a6-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-t7zd8\" (UID: \"10f4c85b-1896-4882-9ee0-8117ddf6a7a6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t7zd8" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.065038 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74cde4fb-7e17-40dc-8537-54b5ecb898d7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8v6g2\" (UID: \"74cde4fb-7e17-40dc-8537-54b5ecb898d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8v6g2" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.065062 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/312693cc-d986-462d-ac50-f4e44dbc8cf1-apiservice-cert\") pod \"packageserver-d55dfcdfc-79ms4\" (UID: \"312693cc-d986-462d-ac50-f4e44dbc8cf1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-79ms4" 
Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.065091 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c06c731d-cb94-49f2-8afb-899c7c6e7724-config\") pod \"route-controller-manager-6576b87f9c-rwhcb\" (UID: \"c06c731d-cb94-49f2-8afb-899c7c6e7724\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rwhcb" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.065116 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.065137 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/24507f61-1a02-438c-b1ca-82515867e605-secret-volume\") pod \"collect-profiles-29333715-pgfd2\" (UID: \"24507f61-1a02-438c-b1ca-82515867e605\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333715-pgfd2" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.065164 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19fada05-fea3-4a67-a138-1cf460e31a2b-config\") pod \"kube-apiserver-operator-766d6c64bb-9jzcj\" (UID: \"19fada05-fea3-4a67-a138-1cf460e31a2b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9jzcj" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.065189 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/816f9c0b-05db-4dfc-8edb-ba2e0a14d43d-metrics-tls\") pod 
\"ingress-operator-5b745b69d9-fhfxg\" (UID: \"816f9c0b-05db-4dfc-8edb-ba2e0a14d43d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fhfxg" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.065210 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d2a62908-86f6-4b7f-9169-cb7a9ef1ece8-default-certificate\") pod \"router-default-5444994796-vdfqp\" (UID: \"d2a62908-86f6-4b7f-9169-cb7a9ef1ece8\") " pod="openshift-ingress/router-default-5444994796-vdfqp" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.065230 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2a62908-86f6-4b7f-9169-cb7a9ef1ece8-metrics-certs\") pod \"router-default-5444994796-vdfqp\" (UID: \"d2a62908-86f6-4b7f-9169-cb7a9ef1ece8\") " pod="openshift-ingress/router-default-5444994796-vdfqp" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.065263 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.065299 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/24507f61-1a02-438c-b1ca-82515867e605-config-volume\") pod \"collect-profiles-29333715-pgfd2\" (UID: \"24507f61-1a02-438c-b1ca-82515867e605\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333715-pgfd2" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.065321 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/97eeda6e-c63f-4b48-a8bd-05e673d79117-auth-proxy-config\") pod \"machine-config-operator-74547568cd-hz62m\" (UID: \"97eeda6e-c63f-4b48-a8bd-05e673d79117\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hz62m" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.065388 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69jd7\" (UniqueName: \"kubernetes.io/projected/24507f61-1a02-438c-b1ca-82515867e605-kube-api-access-69jd7\") pod \"collect-profiles-29333715-pgfd2\" (UID: \"24507f61-1a02-438c-b1ca-82515867e605\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333715-pgfd2" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.065412 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/816f9c0b-05db-4dfc-8edb-ba2e0a14d43d-trusted-ca\") pod \"ingress-operator-5b745b69d9-fhfxg\" (UID: \"816f9c0b-05db-4dfc-8edb-ba2e0a14d43d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fhfxg" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.065449 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b6a94c8-9172-472f-8d40-4c259b21c6b9-config\") pod \"service-ca-operator-777779d784-qd6rx\" (UID: \"7b6a94c8-9172-472f-8d40-4c259b21c6b9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qd6rx" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.065470 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74cde4fb-7e17-40dc-8537-54b5ecb898d7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8v6g2\" (UID: \"74cde4fb-7e17-40dc-8537-54b5ecb898d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8v6g2" Oct 09 15:20:34 crc 
kubenswrapper[4719]: I1009 15:20:34.065492 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/4b4724c8-6007-4df3-b822-42d08ea33fde-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2d4s7\" (UID: \"4b4724c8-6007-4df3-b822-42d08ea33fde\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2d4s7" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.065517 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/33659bcf-6d50-402b-a0da-7610749b535c-profile-collector-cert\") pod \"catalog-operator-68c6474976-ncvlk\" (UID: \"33659bcf-6d50-402b-a0da-7610749b535c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ncvlk" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.065539 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19fada05-fea3-4a67-a138-1cf460e31a2b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-9jzcj\" (UID: \"19fada05-fea3-4a67-a138-1cf460e31a2b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9jzcj" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.065560 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhjxp\" (UniqueName: \"kubernetes.io/projected/97eeda6e-c63f-4b48-a8bd-05e673d79117-kube-api-access-qhjxp\") pod \"machine-config-operator-74547568cd-hz62m\" (UID: \"97eeda6e-c63f-4b48-a8bd-05e673d79117\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hz62m" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.065583 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.065591 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z26lm\" (UniqueName: \"kubernetes.io/projected/10f4c85b-1896-4882-9ee0-8117ddf6a7a6-kube-api-access-z26lm\") pod \"openshift-apiserver-operator-796bbdcf4f-t7zd8\" (UID: \"10f4c85b-1896-4882-9ee0-8117ddf6a7a6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t7zd8" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.066944 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.067245 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.068426 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f1c3d8c5-6a51-4c89-96e9-f7fb62e5685b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-57b4h\" (UID: \"f1c3d8c5-6a51-4c89-96e9-f7fb62e5685b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57b4h" Oct 
09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.068658 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.068742 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/816f9c0b-05db-4dfc-8edb-ba2e0a14d43d-trusted-ca\") pod \"ingress-operator-5b745b69d9-fhfxg\" (UID: \"816f9c0b-05db-4dfc-8edb-ba2e0a14d43d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fhfxg" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.068888 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.069460 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/97eeda6e-c63f-4b48-a8bd-05e673d79117-auth-proxy-config\") pod \"machine-config-operator-74547568cd-hz62m\" (UID: \"97eeda6e-c63f-4b48-a8bd-05e673d79117\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hz62m" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.069543 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-ocp-branding-template\") pod 
\"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.069708 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.070100 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/816f9c0b-05db-4dfc-8edb-ba2e0a14d43d-metrics-tls\") pod \"ingress-operator-5b745b69d9-fhfxg\" (UID: \"816f9c0b-05db-4dfc-8edb-ba2e0a14d43d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fhfxg" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.070130 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/33659bcf-6d50-402b-a0da-7610749b535c-srv-cert\") pod \"catalog-operator-68c6474976-ncvlk\" (UID: \"33659bcf-6d50-402b-a0da-7610749b535c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ncvlk" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.070391 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.070841 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/33659bcf-6d50-402b-a0da-7610749b535c-profile-collector-cert\") pod \"catalog-operator-68c6474976-ncvlk\" (UID: \"33659bcf-6d50-402b-a0da-7610749b535c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ncvlk" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.073654 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.076607 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.077105 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.078050 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/24507f61-1a02-438c-b1ca-82515867e605-secret-volume\") pod \"collect-profiles-29333715-pgfd2\" (UID: \"24507f61-1a02-438c-b1ca-82515867e605\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333715-pgfd2" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.088683 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f1c3d8c5-6a51-4c89-96e9-f7fb62e5685b-srv-cert\") pod 
\"olm-operator-6b444d44fb-57b4h\" (UID: \"f1c3d8c5-6a51-4c89-96e9-f7fb62e5685b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57b4h" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.098654 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.117008 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.137508 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.156908 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.183551 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.197097 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.216984 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.236617 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.257422 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.277515 4719 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.289064 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10f4c85b-1896-4882-9ee0-8117ddf6a7a6-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-t7zd8\" (UID: \"10f4c85b-1896-4882-9ee0-8117ddf6a7a6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t7zd8" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.297409 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.306328 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10f4c85b-1896-4882-9ee0-8117ddf6a7a6-config\") pod \"openshift-apiserver-operator-796bbdcf4f-t7zd8\" (UID: \"10f4c85b-1896-4882-9ee0-8117ddf6a7a6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t7zd8" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.316649 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.336838 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.346082 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/97eeda6e-c63f-4b48-a8bd-05e673d79117-images\") pod \"machine-config-operator-74547568cd-hz62m\" (UID: \"97eeda6e-c63f-4b48-a8bd-05e673d79117\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hz62m" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.356923 4719 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.377531 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.387242 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/97eeda6e-c63f-4b48-a8bd-05e673d79117-proxy-tls\") pod \"machine-config-operator-74547568cd-hz62m\" (UID: \"97eeda6e-c63f-4b48-a8bd-05e673d79117\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hz62m" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.396819 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.398060 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/24507f61-1a02-438c-b1ca-82515867e605-config-volume\") pod \"collect-profiles-29333715-pgfd2\" (UID: \"24507f61-1a02-438c-b1ca-82515867e605\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333715-pgfd2" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.416555 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.436631 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.450081 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/312693cc-d986-462d-ac50-f4e44dbc8cf1-apiservice-cert\") 
pod \"packageserver-d55dfcdfc-79ms4\" (UID: \"312693cc-d986-462d-ac50-f4e44dbc8cf1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-79ms4" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.450808 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/312693cc-d986-462d-ac50-f4e44dbc8cf1-webhook-cert\") pod \"packageserver-d55dfcdfc-79ms4\" (UID: \"312693cc-d986-462d-ac50-f4e44dbc8cf1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-79ms4" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.456973 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.477102 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.497487 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.517497 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.536970 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.557072 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.577171 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.596980 4719 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.609918 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/4b4724c8-6007-4df3-b822-42d08ea33fde-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2d4s7\" (UID: \"4b4724c8-6007-4df3-b822-42d08ea33fde\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2d4s7" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.617131 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.637169 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.656578 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.677750 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.696269 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.699811 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19fada05-fea3-4a67-a138-1cf460e31a2b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-9jzcj\" (UID: \"19fada05-fea3-4a67-a138-1cf460e31a2b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9jzcj" Oct 09 15:20:34 crc 
kubenswrapper[4719]: I1009 15:20:34.717203 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.726130 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19fada05-fea3-4a67-a138-1cf460e31a2b-config\") pod \"kube-apiserver-operator-766d6c64bb-9jzcj\" (UID: \"19fada05-fea3-4a67-a138-1cf460e31a2b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9jzcj" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.736695 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.756714 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.765843 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c06c731d-cb94-49f2-8afb-899c7c6e7724-client-ca\") pod \"route-controller-manager-6576b87f9c-rwhcb\" (UID: \"c06c731d-cb94-49f2-8afb-899c7c6e7724\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rwhcb" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.777104 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.796582 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.810213 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/74cde4fb-7e17-40dc-8537-54b5ecb898d7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8v6g2\" (UID: \"74cde4fb-7e17-40dc-8537-54b5ecb898d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8v6g2" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.817482 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.826275 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74cde4fb-7e17-40dc-8537-54b5ecb898d7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8v6g2\" (UID: \"74cde4fb-7e17-40dc-8537-54b5ecb898d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8v6g2" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.835624 4719 request.go:700] Waited for 1.002563957s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/secrets?fieldSelector=metadata.name%3Droute-controller-manager-sa-dockercfg-h2zr2&limit=500&resourceVersion=0 Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.836871 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.857251 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.867474 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c06c731d-cb94-49f2-8afb-899c7c6e7724-serving-cert\") pod \"route-controller-manager-6576b87f9c-rwhcb\" (UID: 
\"c06c731d-cb94-49f2-8afb-899c7c6e7724\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rwhcb" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.876641 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.897250 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.916402 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.928225 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c06c731d-cb94-49f2-8afb-899c7c6e7724-config\") pod \"route-controller-manager-6576b87f9c-rwhcb\" (UID: \"c06c731d-cb94-49f2-8afb-899c7c6e7724\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rwhcb" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.937274 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.957039 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.969068 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d2a62908-86f6-4b7f-9169-cb7a9ef1ece8-stats-auth\") pod \"router-default-5444994796-vdfqp\" (UID: \"d2a62908-86f6-4b7f-9169-cb7a9ef1ece8\") " pod="openshift-ingress/router-default-5444994796-vdfqp" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.979097 4719 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-metrics-certs-default" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.990415 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2a62908-86f6-4b7f-9169-cb7a9ef1ece8-metrics-certs\") pod \"router-default-5444994796-vdfqp\" (UID: \"d2a62908-86f6-4b7f-9169-cb7a9ef1ece8\") " pod="openshift-ingress/router-default-5444994796-vdfqp" Oct 09 15:20:34 crc kubenswrapper[4719]: I1009 15:20:34.996838 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.017037 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.021181 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d2a62908-86f6-4b7f-9169-cb7a9ef1ece8-default-certificate\") pod \"router-default-5444994796-vdfqp\" (UID: \"d2a62908-86f6-4b7f-9169-cb7a9ef1ece8\") " pod="openshift-ingress/router-default-5444994796-vdfqp" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.036954 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.045504 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2a62908-86f6-4b7f-9169-cb7a9ef1ece8-service-ca-bundle\") pod \"router-default-5444994796-vdfqp\" (UID: \"d2a62908-86f6-4b7f-9169-cb7a9ef1ece8\") " pod="openshift-ingress/router-default-5444994796-vdfqp" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.057234 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 09 15:20:35 crc kubenswrapper[4719]: E1009 
15:20:35.065608 4719 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Oct 09 15:20:35 crc kubenswrapper[4719]: E1009 15:20:35.065689 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b6a94c8-9172-472f-8d40-4c259b21c6b9-serving-cert podName:7b6a94c8-9172-472f-8d40-4c259b21c6b9 nodeName:}" failed. No retries permitted until 2025-10-09 15:20:35.565667506 +0000 UTC m=+141.075378791 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/7b6a94c8-9172-472f-8d40-4c259b21c6b9-serving-cert") pod "service-ca-operator-777779d784-qd6rx" (UID: "7b6a94c8-9172-472f-8d40-4c259b21c6b9") : failed to sync secret cache: timed out waiting for the condition Oct 09 15:20:35 crc kubenswrapper[4719]: E1009 15:20:35.066748 4719 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition Oct 09 15:20:35 crc kubenswrapper[4719]: E1009 15:20:35.066828 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7b6a94c8-9172-472f-8d40-4c259b21c6b9-config podName:7b6a94c8-9172-472f-8d40-4c259b21c6b9 nodeName:}" failed. No retries permitted until 2025-10-09 15:20:35.566809223 +0000 UTC m=+141.076520518 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/7b6a94c8-9172-472f-8d40-4c259b21c6b9-config") pod "service-ca-operator-777779d784-qd6rx" (UID: "7b6a94c8-9172-472f-8d40-4c259b21c6b9") : failed to sync configmap cache: timed out waiting for the condition Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.077017 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.096932 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.117282 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.136829 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.159143 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.197451 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.217137 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.237028 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.257715 4719 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.276714 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.297125 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.336862 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.357299 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.377793 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.397413 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.417596 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.437644 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.457427 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 
15:20:35.479093 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.497242 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.517342 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.537854 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.557518 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.577599 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.587037 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b6a94c8-9172-472f-8d40-4c259b21c6b9-config\") pod \"service-ca-operator-777779d784-qd6rx\" (UID: \"7b6a94c8-9172-472f-8d40-4c259b21c6b9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qd6rx" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.587329 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b6a94c8-9172-472f-8d40-4c259b21c6b9-serving-cert\") pod \"service-ca-operator-777779d784-qd6rx\" (UID: \"7b6a94c8-9172-472f-8d40-4c259b21c6b9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qd6rx" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.588539 4719 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b6a94c8-9172-472f-8d40-4c259b21c6b9-config\") pod \"service-ca-operator-777779d784-qd6rx\" (UID: \"7b6a94c8-9172-472f-8d40-4c259b21c6b9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qd6rx" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.590899 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b6a94c8-9172-472f-8d40-4c259b21c6b9-serving-cert\") pod \"service-ca-operator-777779d784-qd6rx\" (UID: \"7b6a94c8-9172-472f-8d40-4c259b21c6b9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qd6rx" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.597264 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.630330 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4hgc\" (UniqueName: \"kubernetes.io/projected/35e1fbe6-e210-4329-a27b-3341136d7dcd-kube-api-access-x4hgc\") pod \"controller-manager-879f6c89f-lblwv\" (UID: \"35e1fbe6-e210-4329-a27b-3341136d7dcd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lblwv" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.650982 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8sb6\" (UniqueName: \"kubernetes.io/projected/2aba1eb4-b085-48e7-941b-403c160fb3f4-kube-api-access-p8sb6\") pod \"machine-approver-56656f9798-q2vmh\" (UID: \"2aba1eb4-b085-48e7-941b-403c160fb3f4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q2vmh" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.671445 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnf28\" (UniqueName: 
\"kubernetes.io/projected/cf975172-874f-418c-947b-6226e1662647-kube-api-access-rnf28\") pod \"console-operator-58897d9998-22bgx\" (UID: \"cf975172-874f-418c-947b-6226e1662647\") " pod="openshift-console-operator/console-operator-58897d9998-22bgx" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.695867 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2nzs\" (UniqueName: \"kubernetes.io/projected/f07d2126-7037-4b5c-aa67-4d09bf873e07-kube-api-access-x2nzs\") pod \"machine-api-operator-5694c8668f-75q5v\" (UID: \"f07d2126-7037-4b5c-aa67-4d09bf873e07\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-75q5v" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.709939 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np54w\" (UniqueName: \"kubernetes.io/projected/563b4e02-b3d3-4f24-b571-091d77871f9b-kube-api-access-np54w\") pod \"openshift-config-operator-7777fb866f-9599k\" (UID: \"563b4e02-b3d3-4f24-b571-091d77871f9b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9599k" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.729870 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zspkw\" (UniqueName: \"kubernetes.io/projected/a643603c-791b-4f68-a320-b6a2bcabf91f-kube-api-access-zspkw\") pod \"openshift-controller-manager-operator-756b6f6bc6-c2pph\" (UID: \"a643603c-791b-4f68-a320-b6a2bcabf91f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c2pph" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.750113 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftr8w\" (UniqueName: \"kubernetes.io/projected/bc178477-fdbe-4189-80e7-5ceba0100dbd-kube-api-access-ftr8w\") pod \"etcd-operator-b45778765-k5w5x\" (UID: \"bc178477-fdbe-4189-80e7-5ceba0100dbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k5w5x" 
Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.771381 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lpbj\" (UniqueName: \"kubernetes.io/projected/c895d97a-7287-49a8-9ac5-bc87e8bcf297-kube-api-access-4lpbj\") pod \"console-f9d7485db-j74ct\" (UID: \"c895d97a-7287-49a8-9ac5-bc87e8bcf297\") " pod="openshift-console/console-f9d7485db-j74ct" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.791510 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrtsk\" (UniqueName: \"kubernetes.io/projected/53e8b265-c7b0-4b11-bcb0-225ada8332ce-kube-api-access-qrtsk\") pod \"dns-operator-744455d44c-xw899\" (UID: \"53e8b265-c7b0-4b11-bcb0-225ada8332ce\") " pod="openshift-dns-operator/dns-operator-744455d44c-xw899" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.811725 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-75q5v" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.812440 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f2pv\" (UniqueName: \"kubernetes.io/projected/4091b4ed-3afe-4bab-b41e-0bca5b6f58b0-kube-api-access-6f2pv\") pod \"downloads-7954f5f757-p7bmv\" (UID: \"4091b4ed-3afe-4bab-b41e-0bca5b6f58b0\") " pod="openshift-console/downloads-7954f5f757-p7bmv" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.831038 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd6md\" (UniqueName: \"kubernetes.io/projected/b28184c2-4cb3-4fe7-9c69-3fcf55a0d0e0-kube-api-access-jd6md\") pod \"cluster-samples-operator-665b6dd947-6ql7w\" (UID: \"b28184c2-4cb3-4fe7-9c69-3fcf55a0d0e0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6ql7w" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.835986 4719 request.go:700] Waited for 1.873240574s due to client-side throttling, not 
priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver/serviceaccounts/openshift-apiserver-sa/token Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.851069 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs8sl\" (UniqueName: \"kubernetes.io/projected/26813cbf-0ed2-460a-b36f-f1a8895e68ec-kube-api-access-hs8sl\") pod \"apiserver-76f77b778f-4mm2x\" (UID: \"26813cbf-0ed2-460a-b36f-f1a8895e68ec\") " pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.858992 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-lblwv" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.872089 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvpw4\" (UniqueName: \"kubernetes.io/projected/633935e5-0232-4844-8f77-e87a7d4385cd-kube-api-access-dvpw4\") pod \"apiserver-7bbb656c7d-696rc\" (UID: \"633935e5-0232-4844-8f77-e87a7d4385cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-696rc" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.877660 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.880344 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q2vmh" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.897749 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.917268 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.936737 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.937051 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-22bgx" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.947220 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-p7bmv" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.955995 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-j74ct" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.964815 4719 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.964828 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6ql7w" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.976815 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-696rc" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.977550 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.986875 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9599k" Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.988901 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-75q5v"] Oct 09 15:20:35 crc kubenswrapper[4719]: I1009 15:20:35.992921 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-xw899" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.008117 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c2pph" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.011071 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjsj6\" (UniqueName: \"kubernetes.io/projected/816f9c0b-05db-4dfc-8edb-ba2e0a14d43d-kube-api-access-qjsj6\") pod \"ingress-operator-5b745b69d9-fhfxg\" (UID: \"816f9c0b-05db-4dfc-8edb-ba2e0a14d43d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fhfxg" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.017639 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-k5w5x" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.047830 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lblwv"] Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.051627 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8nch\" (UniqueName: \"kubernetes.io/projected/c06c731d-cb94-49f2-8afb-899c7c6e7724-kube-api-access-x8nch\") pod \"route-controller-manager-6576b87f9c-rwhcb\" (UID: \"c06c731d-cb94-49f2-8afb-899c7c6e7724\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rwhcb" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.055475 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79tkf\" (UniqueName: \"kubernetes.io/projected/312693cc-d986-462d-ac50-f4e44dbc8cf1-kube-api-access-79tkf\") pod \"packageserver-d55dfcdfc-79ms4\" (UID: \"312693cc-d986-462d-ac50-f4e44dbc8cf1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-79ms4" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.072523 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/19fada05-fea3-4a67-a138-1cf460e31a2b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-9jzcj\" (UID: \"19fada05-fea3-4a67-a138-1cf460e31a2b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9jzcj" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.098638 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.099369 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24cng\" (UniqueName: \"kubernetes.io/projected/4b4724c8-6007-4df3-b822-42d08ea33fde-kube-api-access-24cng\") pod \"control-plane-machine-set-operator-78cbb6b69f-2d4s7\" (UID: \"4b4724c8-6007-4df3-b822-42d08ea33fde\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2d4s7" Oct 09 15:20:36 crc kubenswrapper[4719]: W1009 15:20:36.108744 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35e1fbe6_e210_4329_a27b_3341136d7dcd.slice/crio-4de81b59bd054f4d846c8ca65fb5dc7618e65c430bf7042f693f7c1d6b8d9e15 WatchSource:0}: Error finding container 4de81b59bd054f4d846c8ca65fb5dc7618e65c430bf7042f693f7c1d6b8d9e15: Status 404 returned error can't find the container with id 4de81b59bd054f4d846c8ca65fb5dc7618e65c430bf7042f693f7c1d6b8d9e15 Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.112335 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhjlm\" (UniqueName: \"kubernetes.io/projected/d2a62908-86f6-4b7f-9169-cb7a9ef1ece8-kube-api-access-dhjlm\") pod \"router-default-5444994796-vdfqp\" (UID: \"d2a62908-86f6-4b7f-9169-cb7a9ef1ece8\") " pod="openshift-ingress/router-default-5444994796-vdfqp" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.156096 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-79ms4" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.163222 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hw7k\" (UniqueName: \"kubernetes.io/projected/f1c3d8c5-6a51-4c89-96e9-f7fb62e5685b-kube-api-access-7hw7k\") pod \"olm-operator-6b444d44fb-57b4h\" (UID: \"f1c3d8c5-6a51-4c89-96e9-f7fb62e5685b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57b4h" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.173284 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndb69\" (UniqueName: \"kubernetes.io/projected/7b6a94c8-9172-472f-8d40-4c259b21c6b9-kube-api-access-ndb69\") pod \"service-ca-operator-777779d784-qd6rx\" (UID: \"7b6a94c8-9172-472f-8d40-4c259b21c6b9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qd6rx" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.182944 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/816f9c0b-05db-4dfc-8edb-ba2e0a14d43d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-fhfxg\" (UID: \"816f9c0b-05db-4dfc-8edb-ba2e0a14d43d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fhfxg" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.190063 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsdfm\" (UniqueName: \"kubernetes.io/projected/2a4743d0-e646-45ab-a225-816c0d99246a-kube-api-access-vsdfm\") pod \"oauth-openshift-558db77b4-db9tz\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.212186 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8fpx\" (UniqueName: 
\"kubernetes.io/projected/33659bcf-6d50-402b-a0da-7610749b535c-kube-api-access-g8fpx\") pod \"catalog-operator-68c6474976-ncvlk\" (UID: \"33659bcf-6d50-402b-a0da-7610749b535c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ncvlk" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.215301 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2d4s7" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.221823 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9jzcj" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.239843 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rwhcb" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.240660 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z26lm\" (UniqueName: \"kubernetes.io/projected/10f4c85b-1896-4882-9ee0-8117ddf6a7a6-kube-api-access-z26lm\") pod \"openshift-apiserver-operator-796bbdcf4f-t7zd8\" (UID: \"10f4c85b-1896-4882-9ee0-8117ddf6a7a6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t7zd8" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.244754 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-vdfqp" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.252284 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-qd6rx" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.254672 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74cde4fb-7e17-40dc-8537-54b5ecb898d7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8v6g2\" (UID: \"74cde4fb-7e17-40dc-8537-54b5ecb898d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8v6g2" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.275270 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhjxp\" (UniqueName: \"kubernetes.io/projected/97eeda6e-c63f-4b48-a8bd-05e673d79117-kube-api-access-qhjxp\") pod \"machine-config-operator-74547568cd-hz62m\" (UID: \"97eeda6e-c63f-4b48-a8bd-05e673d79117\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hz62m" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.306042 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69jd7\" (UniqueName: \"kubernetes.io/projected/24507f61-1a02-438c-b1ca-82515867e605-kube-api-access-69jd7\") pod \"collect-profiles-29333715-pgfd2\" (UID: \"24507f61-1a02-438c-b1ca-82515867e605\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333715-pgfd2" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.326509 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.402467 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fhfxg" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.403285 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dwt2p\" (UID: \"2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7\") " pod="openshift-marketplace/marketplace-operator-79b997595-dwt2p" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.403322 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dwt2p\" (UID: \"2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7\") " pod="openshift-marketplace/marketplace-operator-79b997595-dwt2p" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.403367 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/92f1494f-b7f7-4e94-90ce-132cc3a14a62-bound-sa-token\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.403390 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/92f1494f-b7f7-4e94-90ce-132cc3a14a62-trusted-ca\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.403411 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-7flcr\" (UniqueName: \"kubernetes.io/projected/2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7-kube-api-access-7flcr\") pod \"marketplace-operator-79b997595-dwt2p\" (UID: \"2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7\") " pod="openshift-marketplace/marketplace-operator-79b997595-dwt2p" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.403428 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/92f1494f-b7f7-4e94-90ce-132cc3a14a62-registry-certificates\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.403463 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7ff0f956-c475-4d8b-9ef0-8dab346c53f6-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-x6g6x\" (UID: \"7ff0f956-c475-4d8b-9ef0-8dab346c53f6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x6g6x" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.403489 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/92f1494f-b7f7-4e94-90ce-132cc3a14a62-installation-pull-secrets\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.403507 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/daa41d9a-ad1b-4c1b-8e40-07e4f0d3b3e4-config\") pod 
\"authentication-operator-69f744f599-z2lvl\" (UID: \"daa41d9a-ad1b-4c1b-8e40-07e4f0d3b3e4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z2lvl" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.403544 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/070fb955-7eaa-4ad0-a9a0-cda60314743f-signing-key\") pod \"service-ca-9c57cc56f-ffm6c\" (UID: \"070fb955-7eaa-4ad0-a9a0-cda60314743f\") " pod="openshift-service-ca/service-ca-9c57cc56f-ffm6c" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.403564 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/732da19d-17b5-4af0-b9bc-13c30ee6b5f5-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-qk8b7\" (UID: \"732da19d-17b5-4af0-b9bc-13c30ee6b5f5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qk8b7" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.403587 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/daa41d9a-ad1b-4c1b-8e40-07e4f0d3b3e4-serving-cert\") pod \"authentication-operator-69f744f599-z2lvl\" (UID: \"daa41d9a-ad1b-4c1b-8e40-07e4f0d3b3e4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z2lvl" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.403607 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/732da19d-17b5-4af0-b9bc-13c30ee6b5f5-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-qk8b7\" (UID: \"732da19d-17b5-4af0-b9bc-13c30ee6b5f5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qk8b7" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 
15:20:36.403642 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxr5n\" (UniqueName: \"kubernetes.io/projected/92f1494f-b7f7-4e94-90ce-132cc3a14a62-kube-api-access-nxr5n\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.403661 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hks5x\" (UniqueName: \"kubernetes.io/projected/070fb955-7eaa-4ad0-a9a0-cda60314743f-kube-api-access-hks5x\") pod \"service-ca-9c57cc56f-ffm6c\" (UID: \"070fb955-7eaa-4ad0-a9a0-cda60314743f\") " pod="openshift-service-ca/service-ca-9c57cc56f-ffm6c" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.403688 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/daa41d9a-ad1b-4c1b-8e40-07e4f0d3b3e4-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-z2lvl\" (UID: \"daa41d9a-ad1b-4c1b-8e40-07e4f0d3b3e4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z2lvl" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.403703 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/daa41d9a-ad1b-4c1b-8e40-07e4f0d3b3e4-service-ca-bundle\") pod \"authentication-operator-69f744f599-z2lvl\" (UID: \"daa41d9a-ad1b-4c1b-8e40-07e4f0d3b3e4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z2lvl" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.403729 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.403746 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/92f1494f-b7f7-4e94-90ce-132cc3a14a62-registry-tls\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.403762 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/92f1494f-b7f7-4e94-90ce-132cc3a14a62-ca-trust-extracted\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.403780 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppmtx\" (UniqueName: \"kubernetes.io/projected/bc529b0d-9e56-4818-b017-7d035de4f2da-kube-api-access-ppmtx\") pod \"migrator-59844c95c7-ff97c\" (UID: \"bc529b0d-9e56-4818-b017-7d035de4f2da\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ff97c" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.403867 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/070fb955-7eaa-4ad0-a9a0-cda60314743f-signing-cabundle\") pod \"service-ca-9c57cc56f-ffm6c\" (UID: \"070fb955-7eaa-4ad0-a9a0-cda60314743f\") " pod="openshift-service-ca/service-ca-9c57cc56f-ffm6c" Oct 09 15:20:36 crc 
kubenswrapper[4719]: I1009 15:20:36.404044 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/732da19d-17b5-4af0-b9bc-13c30ee6b5f5-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-qk8b7\" (UID: \"732da19d-17b5-4af0-b9bc-13c30ee6b5f5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qk8b7" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.404176 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ncvlk" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.404331 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7vj6\" (UniqueName: \"kubernetes.io/projected/7ff0f956-c475-4d8b-9ef0-8dab346c53f6-kube-api-access-k7vj6\") pod \"package-server-manager-789f6589d5-x6g6x\" (UID: \"7ff0f956-c475-4d8b-9ef0-8dab346c53f6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x6g6x" Oct 09 15:20:36 crc kubenswrapper[4719]: E1009 15:20:36.404384 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 15:20:36.904340324 +0000 UTC m=+142.414051609 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.404422 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s45p9\" (UniqueName: \"kubernetes.io/projected/daa41d9a-ad1b-4c1b-8e40-07e4f0d3b3e4-kube-api-access-s45p9\") pod \"authentication-operator-69f744f599-z2lvl\" (UID: \"daa41d9a-ad1b-4c1b-8e40-07e4f0d3b3e4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z2lvl" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.404500 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nc46\" (UniqueName: \"kubernetes.io/projected/732da19d-17b5-4af0-b9bc-13c30ee6b5f5-kube-api-access-6nc46\") pod \"cluster-image-registry-operator-dc59b4c8b-qk8b7\" (UID: \"732da19d-17b5-4af0-b9bc-13c30ee6b5f5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qk8b7" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.412025 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57b4h" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.417937 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-xw899"] Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.429693 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6ql7w"] Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.435508 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t7zd8" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.441931 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hz62m" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.449048 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333715-pgfd2" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.460145 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-22bgx"] Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.479132 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-p7bmv"] Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.505321 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:36 crc kubenswrapper[4719]: E1009 15:20:36.505514 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:20:37.005479703 +0000 UTC m=+142.515190988 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.505570 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvv9g\" (UniqueName: \"kubernetes.io/projected/8384cfeb-4188-4b7e-b371-3b9e1032781f-kube-api-access-jvv9g\") pod \"multus-admission-controller-857f4d67dd-98tjn\" (UID: \"8384cfeb-4188-4b7e-b371-3b9e1032781f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-98tjn" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.505626 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f679e74b-ce46-427e-928b-305b4579ca44-socket-dir\") pod \"csi-hostpathplugin-x8kq2\" (UID: \"f679e74b-ce46-427e-928b-305b4579ca44\") " pod="hostpath-provisioner/csi-hostpathplugin-x8kq2" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.505651 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f2f05e5-cf9a-4f30-8b2e-8a2b87f26ad4-proxy-tls\") pod \"machine-config-controller-84d6567774-hf8sg\" (UID: \"4f2f05e5-cf9a-4f30-8b2e-8a2b87f26ad4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hf8sg" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.505725 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dwt2p\" (UID: \"2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7\") " pod="openshift-marketplace/marketplace-operator-79b997595-dwt2p" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.505757 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dwt2p\" (UID: \"2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7\") " pod="openshift-marketplace/marketplace-operator-79b997595-dwt2p" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.505801 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4f7457b2-c2c6-444c-8bc9-563f02df2183-metrics-tls\") pod \"dns-default-762nw\" (UID: \"4f7457b2-c2c6-444c-8bc9-563f02df2183\") " pod="openshift-dns/dns-default-762nw" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.505877 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/92f1494f-b7f7-4e94-90ce-132cc3a14a62-bound-sa-token\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.505895 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/36c83e0c-8489-4fc6-9343-9fa35c1e75cb-cert\") pod \"ingress-canary-d88hv\" (UID: \"36c83e0c-8489-4fc6-9343-9fa35c1e75cb\") " pod="openshift-ingress-canary/ingress-canary-d88hv" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.505976 4719 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f679e74b-ce46-427e-928b-305b4579ca44-mountpoint-dir\") pod \"csi-hostpathplugin-x8kq2\" (UID: \"f679e74b-ce46-427e-928b-305b4579ca44\") " pod="hostpath-provisioner/csi-hostpathplugin-x8kq2" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.506008 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/92f1494f-b7f7-4e94-90ce-132cc3a14a62-trusted-ca\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.506036 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jhtk\" (UniqueName: \"kubernetes.io/projected/4f2f05e5-cf9a-4f30-8b2e-8a2b87f26ad4-kube-api-access-2jhtk\") pod \"machine-config-controller-84d6567774-hf8sg\" (UID: \"4f2f05e5-cf9a-4f30-8b2e-8a2b87f26ad4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hf8sg" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.506056 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f7457b2-c2c6-444c-8bc9-563f02df2183-config-volume\") pod \"dns-default-762nw\" (UID: \"4f7457b2-c2c6-444c-8bc9-563f02df2183\") " pod="openshift-dns/dns-default-762nw" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.506117 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfxxg\" (UniqueName: \"kubernetes.io/projected/78762e7d-4683-45ec-b6ac-c646dc1eb8e8-kube-api-access-sfxxg\") pod \"kube-storage-version-migrator-operator-b67b599dd-x7wsc\" (UID: \"78762e7d-4683-45ec-b6ac-c646dc1eb8e8\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x7wsc" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.506184 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/92f1494f-b7f7-4e94-90ce-132cc3a14a62-registry-certificates\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.506207 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7flcr\" (UniqueName: \"kubernetes.io/projected/2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7-kube-api-access-7flcr\") pod \"marketplace-operator-79b997595-dwt2p\" (UID: \"2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7\") " pod="openshift-marketplace/marketplace-operator-79b997595-dwt2p" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.506266 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3df7532d-cc40-4430-aa65-d2abaaa9f2a1-certs\") pod \"machine-config-server-n6hr2\" (UID: \"3df7532d-cc40-4430-aa65-d2abaaa9f2a1\") " pod="openshift-machine-config-operator/machine-config-server-n6hr2" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.506294 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7ff0f956-c475-4d8b-9ef0-8dab346c53f6-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-x6g6x\" (UID: \"7ff0f956-c475-4d8b-9ef0-8dab346c53f6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x6g6x" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.506378 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f679e74b-ce46-427e-928b-305b4579ca44-plugins-dir\") pod \"csi-hostpathplugin-x8kq2\" (UID: \"f679e74b-ce46-427e-928b-305b4579ca44\") " pod="hostpath-provisioner/csi-hostpathplugin-x8kq2" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.506410 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/92f1494f-b7f7-4e94-90ce-132cc3a14a62-installation-pull-secrets\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.506428 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/daa41d9a-ad1b-4c1b-8e40-07e4f0d3b3e4-config\") pod \"authentication-operator-69f744f599-z2lvl\" (UID: \"daa41d9a-ad1b-4c1b-8e40-07e4f0d3b3e4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z2lvl" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.506491 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e99b117c-49b1-4d38-abb8-1f0d397781c2-config\") pod \"kube-controller-manager-operator-78b949d7b-ppwzx\" (UID: \"e99b117c-49b1-4d38-abb8-1f0d397781c2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ppwzx" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.506619 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/070fb955-7eaa-4ad0-a9a0-cda60314743f-signing-key\") pod \"service-ca-9c57cc56f-ffm6c\" (UID: \"070fb955-7eaa-4ad0-a9a0-cda60314743f\") " pod="openshift-service-ca/service-ca-9c57cc56f-ffm6c" Oct 09 15:20:36 crc 
kubenswrapper[4719]: I1009 15:20:36.506644 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/732da19d-17b5-4af0-b9bc-13c30ee6b5f5-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-qk8b7\" (UID: \"732da19d-17b5-4af0-b9bc-13c30ee6b5f5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qk8b7" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.506662 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c85w6\" (UniqueName: \"kubernetes.io/projected/3df7532d-cc40-4430-aa65-d2abaaa9f2a1-kube-api-access-c85w6\") pod \"machine-config-server-n6hr2\" (UID: \"3df7532d-cc40-4430-aa65-d2abaaa9f2a1\") " pod="openshift-machine-config-operator/machine-config-server-n6hr2" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.506739 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/daa41d9a-ad1b-4c1b-8e40-07e4f0d3b3e4-serving-cert\") pod \"authentication-operator-69f744f599-z2lvl\" (UID: \"daa41d9a-ad1b-4c1b-8e40-07e4f0d3b3e4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z2lvl" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.506757 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78762e7d-4683-45ec-b6ac-c646dc1eb8e8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-x7wsc\" (UID: \"78762e7d-4683-45ec-b6ac-c646dc1eb8e8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x7wsc" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.506839 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/732da19d-17b5-4af0-b9bc-13c30ee6b5f5-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-qk8b7\" (UID: \"732da19d-17b5-4af0-b9bc-13c30ee6b5f5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qk8b7" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.506889 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmjbc\" (UniqueName: \"kubernetes.io/projected/36c83e0c-8489-4fc6-9343-9fa35c1e75cb-kube-api-access-qmjbc\") pod \"ingress-canary-d88hv\" (UID: \"36c83e0c-8489-4fc6-9343-9fa35c1e75cb\") " pod="openshift-ingress-canary/ingress-canary-d88hv" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.506971 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxr5n\" (UniqueName: \"kubernetes.io/projected/92f1494f-b7f7-4e94-90ce-132cc3a14a62-kube-api-access-nxr5n\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.506990 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hks5x\" (UniqueName: \"kubernetes.io/projected/070fb955-7eaa-4ad0-a9a0-cda60314743f-kube-api-access-hks5x\") pod \"service-ca-9c57cc56f-ffm6c\" (UID: \"070fb955-7eaa-4ad0-a9a0-cda60314743f\") " pod="openshift-service-ca/service-ca-9c57cc56f-ffm6c" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.507022 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e99b117c-49b1-4d38-abb8-1f0d397781c2-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ppwzx\" (UID: \"e99b117c-49b1-4d38-abb8-1f0d397781c2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ppwzx" 
Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.507064 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e99b117c-49b1-4d38-abb8-1f0d397781c2-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ppwzx\" (UID: \"e99b117c-49b1-4d38-abb8-1f0d397781c2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ppwzx" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.507140 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/daa41d9a-ad1b-4c1b-8e40-07e4f0d3b3e4-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-z2lvl\" (UID: \"daa41d9a-ad1b-4c1b-8e40-07e4f0d3b3e4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z2lvl" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.507209 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/daa41d9a-ad1b-4c1b-8e40-07e4f0d3b3e4-service-ca-bundle\") pod \"authentication-operator-69f744f599-z2lvl\" (UID: \"daa41d9a-ad1b-4c1b-8e40-07e4f0d3b3e4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z2lvl" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.507271 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.507304 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/92f1494f-b7f7-4e94-90ce-132cc3a14a62-ca-trust-extracted\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.507336 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/92f1494f-b7f7-4e94-90ce-132cc3a14a62-registry-tls\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.507410 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppmtx\" (UniqueName: \"kubernetes.io/projected/bc529b0d-9e56-4818-b017-7d035de4f2da-kube-api-access-ppmtx\") pod \"migrator-59844c95c7-ff97c\" (UID: \"bc529b0d-9e56-4818-b017-7d035de4f2da\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ff97c" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.507486 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/070fb955-7eaa-4ad0-a9a0-cda60314743f-signing-cabundle\") pod \"service-ca-9c57cc56f-ffm6c\" (UID: \"070fb955-7eaa-4ad0-a9a0-cda60314743f\") " pod="openshift-service-ca/service-ca-9c57cc56f-ffm6c" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.507504 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78762e7d-4683-45ec-b6ac-c646dc1eb8e8-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-x7wsc\" (UID: \"78762e7d-4683-45ec-b6ac-c646dc1eb8e8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x7wsc" Oct 09 15:20:36 crc 
kubenswrapper[4719]: I1009 15:20:36.507549 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/732da19d-17b5-4af0-b9bc-13c30ee6b5f5-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-qk8b7\" (UID: \"732da19d-17b5-4af0-b9bc-13c30ee6b5f5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qk8b7" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.507568 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3df7532d-cc40-4430-aa65-d2abaaa9f2a1-node-bootstrap-token\") pod \"machine-config-server-n6hr2\" (UID: \"3df7532d-cc40-4430-aa65-d2abaaa9f2a1\") " pod="openshift-machine-config-operator/machine-config-server-n6hr2" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.507621 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f679e74b-ce46-427e-928b-305b4579ca44-csi-data-dir\") pod \"csi-hostpathplugin-x8kq2\" (UID: \"f679e74b-ce46-427e-928b-305b4579ca44\") " pod="hostpath-provisioner/csi-hostpathplugin-x8kq2" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.507640 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xcpw\" (UniqueName: \"kubernetes.io/projected/f679e74b-ce46-427e-928b-305b4579ca44-kube-api-access-4xcpw\") pod \"csi-hostpathplugin-x8kq2\" (UID: \"f679e74b-ce46-427e-928b-305b4579ca44\") " pod="hostpath-provisioner/csi-hostpathplugin-x8kq2" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.507721 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7vj6\" (UniqueName: \"kubernetes.io/projected/7ff0f956-c475-4d8b-9ef0-8dab346c53f6-kube-api-access-k7vj6\") pod 
\"package-server-manager-789f6589d5-x6g6x\" (UID: \"7ff0f956-c475-4d8b-9ef0-8dab346c53f6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x6g6x" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.507739 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4f2f05e5-cf9a-4f30-8b2e-8a2b87f26ad4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hf8sg\" (UID: \"4f2f05e5-cf9a-4f30-8b2e-8a2b87f26ad4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hf8sg" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.513639 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/92f1494f-b7f7-4e94-90ce-132cc3a14a62-ca-trust-extracted\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.513895 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dwt2p\" (UID: \"2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7\") " pod="openshift-marketplace/marketplace-operator-79b997595-dwt2p" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.515403 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s45p9\" (UniqueName: \"kubernetes.io/projected/daa41d9a-ad1b-4c1b-8e40-07e4f0d3b3e4-kube-api-access-s45p9\") pod \"authentication-operator-69f744f599-z2lvl\" (UID: \"daa41d9a-ad1b-4c1b-8e40-07e4f0d3b3e4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z2lvl" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.515501 
4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w827b\" (UniqueName: \"kubernetes.io/projected/4f7457b2-c2c6-444c-8bc9-563f02df2183-kube-api-access-w827b\") pod \"dns-default-762nw\" (UID: \"4f7457b2-c2c6-444c-8bc9-563f02df2183\") " pod="openshift-dns/dns-default-762nw" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.515848 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nc46\" (UniqueName: \"kubernetes.io/projected/732da19d-17b5-4af0-b9bc-13c30ee6b5f5-kube-api-access-6nc46\") pod \"cluster-image-registry-operator-dc59b4c8b-qk8b7\" (UID: \"732da19d-17b5-4af0-b9bc-13c30ee6b5f5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qk8b7" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.515870 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f679e74b-ce46-427e-928b-305b4579ca44-registration-dir\") pod \"csi-hostpathplugin-x8kq2\" (UID: \"f679e74b-ce46-427e-928b-305b4579ca44\") " pod="hostpath-provisioner/csi-hostpathplugin-x8kq2" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.515908 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8384cfeb-4188-4b7e-b371-3b9e1032781f-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-98tjn\" (UID: \"8384cfeb-4188-4b7e-b371-3b9e1032781f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-98tjn" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.521137 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/070fb955-7eaa-4ad0-a9a0-cda60314743f-signing-cabundle\") pod \"service-ca-9c57cc56f-ffm6c\" (UID: \"070fb955-7eaa-4ad0-a9a0-cda60314743f\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-ffm6c" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.523787 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/daa41d9a-ad1b-4c1b-8e40-07e4f0d3b3e4-service-ca-bundle\") pod \"authentication-operator-69f744f599-z2lvl\" (UID: \"daa41d9a-ad1b-4c1b-8e40-07e4f0d3b3e4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z2lvl" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.523825 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dwt2p\" (UID: \"2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7\") " pod="openshift-marketplace/marketplace-operator-79b997595-dwt2p" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.524072 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/732da19d-17b5-4af0-b9bc-13c30ee6b5f5-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-qk8b7\" (UID: \"732da19d-17b5-4af0-b9bc-13c30ee6b5f5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qk8b7" Oct 09 15:20:36 crc kubenswrapper[4719]: E1009 15:20:36.524276 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 15:20:37.024258158 +0000 UTC m=+142.533969523 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.526603 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/daa41d9a-ad1b-4c1b-8e40-07e4f0d3b3e4-config\") pod \"authentication-operator-69f744f599-z2lvl\" (UID: \"daa41d9a-ad1b-4c1b-8e40-07e4f0d3b3e4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z2lvl" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.528223 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/daa41d9a-ad1b-4c1b-8e40-07e4f0d3b3e4-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-z2lvl\" (UID: \"daa41d9a-ad1b-4c1b-8e40-07e4f0d3b3e4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z2lvl" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.530257 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8v6g2" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.531910 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/92f1494f-b7f7-4e94-90ce-132cc3a14a62-registry-certificates\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.537918 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/070fb955-7eaa-4ad0-a9a0-cda60314743f-signing-key\") pod \"service-ca-9c57cc56f-ffm6c\" (UID: \"070fb955-7eaa-4ad0-a9a0-cda60314743f\") " pod="openshift-service-ca/service-ca-9c57cc56f-ffm6c" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.542343 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/92f1494f-b7f7-4e94-90ce-132cc3a14a62-trusted-ca\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.547087 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/92f1494f-b7f7-4e94-90ce-132cc3a14a62-installation-pull-secrets\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.549626 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-j74ct"] Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.549850 4719 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/92f1494f-b7f7-4e94-90ce-132cc3a14a62-registry-tls\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.550232 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7ff0f956-c475-4d8b-9ef0-8dab346c53f6-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-x6g6x\" (UID: \"7ff0f956-c475-4d8b-9ef0-8dab346c53f6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x6g6x" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.552604 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/732da19d-17b5-4af0-b9bc-13c30ee6b5f5-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-qk8b7\" (UID: \"732da19d-17b5-4af0-b9bc-13c30ee6b5f5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qk8b7" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.557453 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hks5x\" (UniqueName: \"kubernetes.io/projected/070fb955-7eaa-4ad0-a9a0-cda60314743f-kube-api-access-hks5x\") pod \"service-ca-9c57cc56f-ffm6c\" (UID: \"070fb955-7eaa-4ad0-a9a0-cda60314743f\") " pod="openshift-service-ca/service-ca-9c57cc56f-ffm6c" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.559857 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-696rc"] Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.564746 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/daa41d9a-ad1b-4c1b-8e40-07e4f0d3b3e4-serving-cert\") pod \"authentication-operator-69f744f599-z2lvl\" (UID: \"daa41d9a-ad1b-4c1b-8e40-07e4f0d3b3e4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z2lvl" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.581138 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppmtx\" (UniqueName: \"kubernetes.io/projected/bc529b0d-9e56-4818-b017-7d035de4f2da-kube-api-access-ppmtx\") pod \"migrator-59844c95c7-ff97c\" (UID: \"bc529b0d-9e56-4818-b017-7d035de4f2da\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ff97c" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.581250 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxr5n\" (UniqueName: \"kubernetes.io/projected/92f1494f-b7f7-4e94-90ce-132cc3a14a62-kube-api-access-nxr5n\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.591053 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-k5w5x"] Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.619980 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/732da19d-17b5-4af0-b9bc-13c30ee6b5f5-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-qk8b7\" (UID: \"732da19d-17b5-4af0-b9bc-13c30ee6b5f5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qk8b7" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.622075 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c2pph"] Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.624556 4719 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:36 crc kubenswrapper[4719]: E1009 15:20:36.625619 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:20:37.125577543 +0000 UTC m=+142.635288838 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.626983 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f679e74b-ce46-427e-928b-305b4579ca44-registration-dir\") pod \"csi-hostpathplugin-x8kq2\" (UID: \"f679e74b-ce46-427e-928b-305b4579ca44\") " pod="hostpath-provisioner/csi-hostpathplugin-x8kq2" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.627504 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f679e74b-ce46-427e-928b-305b4579ca44-registration-dir\") pod \"csi-hostpathplugin-x8kq2\" (UID: \"f679e74b-ce46-427e-928b-305b4579ca44\") " pod="hostpath-provisioner/csi-hostpathplugin-x8kq2" Oct 09 15:20:36 crc kubenswrapper[4719]: 
I1009 15:20:36.627578 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8384cfeb-4188-4b7e-b371-3b9e1032781f-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-98tjn\" (UID: \"8384cfeb-4188-4b7e-b371-3b9e1032781f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-98tjn" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.627628 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvv9g\" (UniqueName: \"kubernetes.io/projected/8384cfeb-4188-4b7e-b371-3b9e1032781f-kube-api-access-jvv9g\") pod \"multus-admission-controller-857f4d67dd-98tjn\" (UID: \"8384cfeb-4188-4b7e-b371-3b9e1032781f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-98tjn" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.627668 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f2f05e5-cf9a-4f30-8b2e-8a2b87f26ad4-proxy-tls\") pod \"machine-config-controller-84d6567774-hf8sg\" (UID: \"4f2f05e5-cf9a-4f30-8b2e-8a2b87f26ad4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hf8sg" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.627695 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f679e74b-ce46-427e-928b-305b4579ca44-socket-dir\") pod \"csi-hostpathplugin-x8kq2\" (UID: \"f679e74b-ce46-427e-928b-305b4579ca44\") " pod="hostpath-provisioner/csi-hostpathplugin-x8kq2" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.627742 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4f7457b2-c2c6-444c-8bc9-563f02df2183-metrics-tls\") pod \"dns-default-762nw\" (UID: \"4f7457b2-c2c6-444c-8bc9-563f02df2183\") " pod="openshift-dns/dns-default-762nw" 
Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.627817 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/36c83e0c-8489-4fc6-9343-9fa35c1e75cb-cert\") pod \"ingress-canary-d88hv\" (UID: \"36c83e0c-8489-4fc6-9343-9fa35c1e75cb\") " pod="openshift-ingress-canary/ingress-canary-d88hv" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.627860 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f679e74b-ce46-427e-928b-305b4579ca44-mountpoint-dir\") pod \"csi-hostpathplugin-x8kq2\" (UID: \"f679e74b-ce46-427e-928b-305b4579ca44\") " pod="hostpath-provisioner/csi-hostpathplugin-x8kq2" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.627890 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jhtk\" (UniqueName: \"kubernetes.io/projected/4f2f05e5-cf9a-4f30-8b2e-8a2b87f26ad4-kube-api-access-2jhtk\") pod \"machine-config-controller-84d6567774-hf8sg\" (UID: \"4f2f05e5-cf9a-4f30-8b2e-8a2b87f26ad4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hf8sg" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.627917 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f7457b2-c2c6-444c-8bc9-563f02df2183-config-volume\") pod \"dns-default-762nw\" (UID: \"4f7457b2-c2c6-444c-8bc9-563f02df2183\") " pod="openshift-dns/dns-default-762nw" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.627957 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfxxg\" (UniqueName: \"kubernetes.io/projected/78762e7d-4683-45ec-b6ac-c646dc1eb8e8-kube-api-access-sfxxg\") pod \"kube-storage-version-migrator-operator-b67b599dd-x7wsc\" (UID: \"78762e7d-4683-45ec-b6ac-c646dc1eb8e8\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x7wsc" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.628013 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3df7532d-cc40-4430-aa65-d2abaaa9f2a1-certs\") pod \"machine-config-server-n6hr2\" (UID: \"3df7532d-cc40-4430-aa65-d2abaaa9f2a1\") " pod="openshift-machine-config-operator/machine-config-server-n6hr2" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.628053 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f679e74b-ce46-427e-928b-305b4579ca44-plugins-dir\") pod \"csi-hostpathplugin-x8kq2\" (UID: \"f679e74b-ce46-427e-928b-305b4579ca44\") " pod="hostpath-provisioner/csi-hostpathplugin-x8kq2" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.629501 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f679e74b-ce46-427e-928b-305b4579ca44-mountpoint-dir\") pod \"csi-hostpathplugin-x8kq2\" (UID: \"f679e74b-ce46-427e-928b-305b4579ca44\") " pod="hostpath-provisioner/csi-hostpathplugin-x8kq2" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.631506 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8384cfeb-4188-4b7e-b371-3b9e1032781f-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-98tjn\" (UID: \"8384cfeb-4188-4b7e-b371-3b9e1032781f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-98tjn" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.635051 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f679e74b-ce46-427e-928b-305b4579ca44-socket-dir\") pod \"csi-hostpathplugin-x8kq2\" (UID: 
\"f679e74b-ce46-427e-928b-305b4579ca44\") " pod="hostpath-provisioner/csi-hostpathplugin-x8kq2" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.635066 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4f7457b2-c2c6-444c-8bc9-563f02df2183-metrics-tls\") pod \"dns-default-762nw\" (UID: \"4f7457b2-c2c6-444c-8bc9-563f02df2183\") " pod="openshift-dns/dns-default-762nw" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.637112 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f7457b2-c2c6-444c-8bc9-563f02df2183-config-volume\") pod \"dns-default-762nw\" (UID: \"4f7457b2-c2c6-444c-8bc9-563f02df2183\") " pod="openshift-dns/dns-default-762nw" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.638253 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f679e74b-ce46-427e-928b-305b4579ca44-plugins-dir\") pod \"csi-hostpathplugin-x8kq2\" (UID: \"f679e74b-ce46-427e-928b-305b4579ca44\") " pod="hostpath-provisioner/csi-hostpathplugin-x8kq2" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.645152 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f2f05e5-cf9a-4f30-8b2e-8a2b87f26ad4-proxy-tls\") pod \"machine-config-controller-84d6567774-hf8sg\" (UID: \"4f2f05e5-cf9a-4f30-8b2e-8a2b87f26ad4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hf8sg" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.646828 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rwhcb"] Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.652635 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e99b117c-49b1-4d38-abb8-1f0d397781c2-config\") pod \"kube-controller-manager-operator-78b949d7b-ppwzx\" (UID: \"e99b117c-49b1-4d38-abb8-1f0d397781c2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ppwzx" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.652687 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c85w6\" (UniqueName: \"kubernetes.io/projected/3df7532d-cc40-4430-aa65-d2abaaa9f2a1-kube-api-access-c85w6\") pod \"machine-config-server-n6hr2\" (UID: \"3df7532d-cc40-4430-aa65-d2abaaa9f2a1\") " pod="openshift-machine-config-operator/machine-config-server-n6hr2" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.652727 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78762e7d-4683-45ec-b6ac-c646dc1eb8e8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-x7wsc\" (UID: \"78762e7d-4683-45ec-b6ac-c646dc1eb8e8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x7wsc" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.652769 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmjbc\" (UniqueName: \"kubernetes.io/projected/36c83e0c-8489-4fc6-9343-9fa35c1e75cb-kube-api-access-qmjbc\") pod \"ingress-canary-d88hv\" (UID: \"36c83e0c-8489-4fc6-9343-9fa35c1e75cb\") " pod="openshift-ingress-canary/ingress-canary-d88hv" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.652848 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e99b117c-49b1-4d38-abb8-1f0d397781c2-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ppwzx\" (UID: \"e99b117c-49b1-4d38-abb8-1f0d397781c2\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ppwzx" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.652888 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e99b117c-49b1-4d38-abb8-1f0d397781c2-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ppwzx\" (UID: \"e99b117c-49b1-4d38-abb8-1f0d397781c2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ppwzx" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.652957 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.653017 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78762e7d-4683-45ec-b6ac-c646dc1eb8e8-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-x7wsc\" (UID: \"78762e7d-4683-45ec-b6ac-c646dc1eb8e8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x7wsc" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.653048 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3df7532d-cc40-4430-aa65-d2abaaa9f2a1-node-bootstrap-token\") pod \"machine-config-server-n6hr2\" (UID: \"3df7532d-cc40-4430-aa65-d2abaaa9f2a1\") " pod="openshift-machine-config-operator/machine-config-server-n6hr2" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.653082 4719 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f679e74b-ce46-427e-928b-305b4579ca44-csi-data-dir\") pod \"csi-hostpathplugin-x8kq2\" (UID: \"f679e74b-ce46-427e-928b-305b4579ca44\") " pod="hostpath-provisioner/csi-hostpathplugin-x8kq2" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.653113 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xcpw\" (UniqueName: \"kubernetes.io/projected/f679e74b-ce46-427e-928b-305b4579ca44-kube-api-access-4xcpw\") pod \"csi-hostpathplugin-x8kq2\" (UID: \"f679e74b-ce46-427e-928b-305b4579ca44\") " pod="hostpath-provisioner/csi-hostpathplugin-x8kq2" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.653170 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4f2f05e5-cf9a-4f30-8b2e-8a2b87f26ad4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hf8sg\" (UID: \"4f2f05e5-cf9a-4f30-8b2e-8a2b87f26ad4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hf8sg" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.653209 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w827b\" (UniqueName: \"kubernetes.io/projected/4f7457b2-c2c6-444c-8bc9-563f02df2183-kube-api-access-w827b\") pod \"dns-default-762nw\" (UID: \"4f7457b2-c2c6-444c-8bc9-563f02df2183\") " pod="openshift-dns/dns-default-762nw" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.653974 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7flcr\" (UniqueName: \"kubernetes.io/projected/2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7-kube-api-access-7flcr\") pod \"marketplace-operator-79b997595-dwt2p\" (UID: \"2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7\") " pod="openshift-marketplace/marketplace-operator-79b997595-dwt2p" Oct 09 15:20:36 crc 
kubenswrapper[4719]: I1009 15:20:36.654698 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e99b117c-49b1-4d38-abb8-1f0d397781c2-config\") pod \"kube-controller-manager-operator-78b949d7b-ppwzx\" (UID: \"e99b117c-49b1-4d38-abb8-1f0d397781c2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ppwzx" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.656864 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78762e7d-4683-45ec-b6ac-c646dc1eb8e8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-x7wsc\" (UID: \"78762e7d-4683-45ec-b6ac-c646dc1eb8e8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x7wsc" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.657583 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/36c83e0c-8489-4fc6-9343-9fa35c1e75cb-cert\") pod \"ingress-canary-d88hv\" (UID: \"36c83e0c-8489-4fc6-9343-9fa35c1e75cb\") " pod="openshift-ingress-canary/ingress-canary-d88hv" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.658191 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f679e74b-ce46-427e-928b-305b4579ca44-csi-data-dir\") pod \"csi-hostpathplugin-x8kq2\" (UID: \"f679e74b-ce46-427e-928b-305b4579ca44\") " pod="hostpath-provisioner/csi-hostpathplugin-x8kq2" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.658923 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4f2f05e5-cf9a-4f30-8b2e-8a2b87f26ad4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hf8sg\" (UID: \"4f2f05e5-cf9a-4f30-8b2e-8a2b87f26ad4\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hf8sg" Oct 09 15:20:36 crc kubenswrapper[4719]: E1009 15:20:36.659572 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 15:20:37.159555289 +0000 UTC m=+142.669266574 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.668888 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-4mm2x"] Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.673272 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7vj6\" (UniqueName: \"kubernetes.io/projected/7ff0f956-c475-4d8b-9ef0-8dab346c53f6-kube-api-access-k7vj6\") pod \"package-server-manager-789f6589d5-x6g6x\" (UID: \"7ff0f956-c475-4d8b-9ef0-8dab346c53f6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x6g6x" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.673985 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e99b117c-49b1-4d38-abb8-1f0d397781c2-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ppwzx\" (UID: \"e99b117c-49b1-4d38-abb8-1f0d397781c2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ppwzx" Oct 09 15:20:36 crc 
kubenswrapper[4719]: I1009 15:20:36.675544 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3df7532d-cc40-4430-aa65-d2abaaa9f2a1-node-bootstrap-token\") pod \"machine-config-server-n6hr2\" (UID: \"3df7532d-cc40-4430-aa65-d2abaaa9f2a1\") " pod="openshift-machine-config-operator/machine-config-server-n6hr2" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.676650 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3df7532d-cc40-4430-aa65-d2abaaa9f2a1-certs\") pod \"machine-config-server-n6hr2\" (UID: \"3df7532d-cc40-4430-aa65-d2abaaa9f2a1\") " pod="openshift-machine-config-operator/machine-config-server-n6hr2" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.685314 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s45p9\" (UniqueName: \"kubernetes.io/projected/daa41d9a-ad1b-4c1b-8e40-07e4f0d3b3e4-kube-api-access-s45p9\") pod \"authentication-operator-69f744f599-z2lvl\" (UID: \"daa41d9a-ad1b-4c1b-8e40-07e4f0d3b3e4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z2lvl" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.691627 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78762e7d-4683-45ec-b6ac-c646dc1eb8e8-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-x7wsc\" (UID: \"78762e7d-4683-45ec-b6ac-c646dc1eb8e8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x7wsc" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.696843 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-9599k"] Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.696893 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9jzcj"] Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.702446 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/92f1494f-b7f7-4e94-90ce-132cc3a14a62-bound-sa-token\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.719099 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dwt2p" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.724507 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-qd6rx"] Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.724738 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x6g6x" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.735406 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nc46\" (UniqueName: \"kubernetes.io/projected/732da19d-17b5-4af0-b9bc-13c30ee6b5f5-kube-api-access-6nc46\") pod \"cluster-image-registry-operator-dc59b4c8b-qk8b7\" (UID: \"732da19d-17b5-4af0-b9bc-13c30ee6b5f5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qk8b7" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.746203 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2d4s7"] Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.753139 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-79ms4"] Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 
15:20:36.754646 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:36 crc kubenswrapper[4719]: E1009 15:20:36.754824 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:20:37.254795618 +0000 UTC m=+142.764506903 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.756176 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:36 crc kubenswrapper[4719]: E1009 15:20:36.756576 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-09 15:20:37.256560284 +0000 UTC m=+142.766271569 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.765808 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvv9g\" (UniqueName: \"kubernetes.io/projected/8384cfeb-4188-4b7e-b371-3b9e1032781f-kube-api-access-jvv9g\") pod \"multus-admission-controller-857f4d67dd-98tjn\" (UID: \"8384cfeb-4188-4b7e-b371-3b9e1032781f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-98tjn" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.766583 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ff97c" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.783598 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jhtk\" (UniqueName: \"kubernetes.io/projected/4f2f05e5-cf9a-4f30-8b2e-8a2b87f26ad4-kube-api-access-2jhtk\") pod \"machine-config-controller-84d6567774-hf8sg\" (UID: \"4f2f05e5-cf9a-4f30-8b2e-8a2b87f26ad4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hf8sg" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.796328 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfxxg\" (UniqueName: \"kubernetes.io/projected/78762e7d-4683-45ec-b6ac-c646dc1eb8e8-kube-api-access-sfxxg\") pod \"kube-storage-version-migrator-operator-b67b599dd-x7wsc\" (UID: \"78762e7d-4683-45ec-b6ac-c646dc1eb8e8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x7wsc" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.804643 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-db9tz"] Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.808205 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-ffm6c" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.822244 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w827b\" (UniqueName: \"kubernetes.io/projected/4f7457b2-c2c6-444c-8bc9-563f02df2183-kube-api-access-w827b\") pod \"dns-default-762nw\" (UID: \"4f7457b2-c2c6-444c-8bc9-563f02df2183\") " pod="openshift-dns/dns-default-762nw" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.829670 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-79ms4" event={"ID":"312693cc-d986-462d-ac50-f4e44dbc8cf1","Type":"ContainerStarted","Data":"b32c1c2d1e2964f17e20ab82365fd3c72858b2761e6918bc43e1bd25bd16b212"} Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.833488 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9jzcj" event={"ID":"19fada05-fea3-4a67-a138-1cf460e31a2b","Type":"ContainerStarted","Data":"414bf1e374ba3a7659fa1f3d06eaeebd7d84c0eb61cbb114ad852fa9ecd9c494"} Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.840441 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6ql7w" event={"ID":"b28184c2-4cb3-4fe7-9c69-3fcf55a0d0e0","Type":"ContainerStarted","Data":"010af1dcf4bd6123174ecffd33a2333d83b6831bd3dd4c425e2c665196383f47"} Oct 09 15:20:36 crc kubenswrapper[4719]: W1009 15:20:36.840562 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b6a94c8_9172_472f_8d40_4c259b21c6b9.slice/crio-179aa9734ed975b4c80ae6aa0aa55cca32fb01b3df979a9604cb61ba57cdbfe8 WatchSource:0}: Error finding container 179aa9734ed975b4c80ae6aa0aa55cca32fb01b3df979a9604cb61ba57cdbfe8: Status 404 returned error can't find the container with id 
179aa9734ed975b4c80ae6aa0aa55cca32fb01b3df979a9604cb61ba57cdbfe8 Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.841644 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c85w6\" (UniqueName: \"kubernetes.io/projected/3df7532d-cc40-4430-aa65-d2abaaa9f2a1-kube-api-access-c85w6\") pod \"machine-config-server-n6hr2\" (UID: \"3df7532d-cc40-4430-aa65-d2abaaa9f2a1\") " pod="openshift-machine-config-operator/machine-config-server-n6hr2" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.850852 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-j74ct" event={"ID":"c895d97a-7287-49a8-9ac5-bc87e8bcf297","Type":"ContainerStarted","Data":"4cfe9df5f6dbdd6ffea299620b878b3dd4a1e62cb037248a1a19f5e51acb1db0"} Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.853529 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-k5w5x" event={"ID":"bc178477-fdbe-4189-80e7-5ceba0100dbd","Type":"ContainerStarted","Data":"97957a07b223e8812686eb568094e9b26a85e3e383e87566629b352553dcf41d"} Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.856749 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:36 crc kubenswrapper[4719]: E1009 15:20:36.857522 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:20:37.357450836 +0000 UTC m=+142.867162121 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.861703 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c2pph" event={"ID":"a643603c-791b-4f68-a320-b6a2bcabf91f","Type":"ContainerStarted","Data":"406c23451be01135fce7fa59ce94a4f2397f2f675b0fcd267c01f851565080aa"} Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.863410 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-98tjn" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.863927 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2d4s7" event={"ID":"4b4724c8-6007-4df3-b822-42d08ea33fde","Type":"ContainerStarted","Data":"dd43b58e509b8bf5b85adcdf87c5855067cf1e9caece6831c4c65643a460b4f0"} Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.864576 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmjbc\" (UniqueName: \"kubernetes.io/projected/36c83e0c-8489-4fc6-9343-9fa35c1e75cb-kube-api-access-qmjbc\") pod \"ingress-canary-d88hv\" (UID: \"36c83e0c-8489-4fc6-9343-9fa35c1e75cb\") " pod="openshift-ingress-canary/ingress-canary-d88hv" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.866115 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q2vmh" 
event={"ID":"2aba1eb4-b085-48e7-941b-403c160fb3f4","Type":"ContainerStarted","Data":"b71137d1bf349c329585993177375dd73d3622c612835eddbd32b0c8d4950590"} Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.866141 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q2vmh" event={"ID":"2aba1eb4-b085-48e7-941b-403c160fb3f4","Type":"ContainerStarted","Data":"7eb162744ab21c1897b605d78efe98556810505f025b182165dd6d1207c5a6ac"} Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.867238 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-p7bmv" event={"ID":"4091b4ed-3afe-4bab-b41e-0bca5b6f58b0","Type":"ContainerStarted","Data":"870b3f6b87f97199e966fad0278caabc637943f324dd9655e77773b93e85d583"} Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.873769 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-vdfqp" event={"ID":"d2a62908-86f6-4b7f-9169-cb7a9ef1ece8","Type":"ContainerStarted","Data":"09a9e615cd556fbb49b3ace2fe21fb1e0640b93fed7ba288e4643a30472ad2db"} Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.873814 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-vdfqp" event={"ID":"d2a62908-86f6-4b7f-9169-cb7a9ef1ece8","Type":"ContainerStarted","Data":"204ea1a9c6f37691f74deafb019777fb781a50969d59348954d490d36da08bb4"} Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.874314 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xcpw\" (UniqueName: \"kubernetes.io/projected/f679e74b-ce46-427e-928b-305b4579ca44-kube-api-access-4xcpw\") pod \"csi-hostpathplugin-x8kq2\" (UID: \"f679e74b-ce46-427e-928b-305b4579ca44\") " pod="hostpath-provisioner/csi-hostpathplugin-x8kq2" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.877914 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hf8sg" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.878992 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rwhcb" event={"ID":"c06c731d-cb94-49f2-8afb-899c7c6e7724","Type":"ContainerStarted","Data":"35c40da366289aeeac5eaa006789c2e45207c7f7b4fc3a562482ddd68a22d2b8"} Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.881488 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-xw899" event={"ID":"53e8b265-c7b0-4b11-bcb0-225ada8332ce","Type":"ContainerStarted","Data":"34b07e2f3fe071e3886f92377c7b1363f36f2e68ccf064114098dd4f508d7890"} Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.886627 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x7wsc" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.894545 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-d88hv" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.898772 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-75q5v" event={"ID":"f07d2126-7037-4b5c-aa67-4d09bf873e07","Type":"ContainerStarted","Data":"a3789358d1ae2bd770aebc82f14352b22fb01793f4c971c0f4f62c1928595a03"} Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.898804 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-75q5v" event={"ID":"f07d2126-7037-4b5c-aa67-4d09bf873e07","Type":"ContainerStarted","Data":"04b53d5f5b0e95c456484b6214e084376313d96956d2416940682ec62b7c443b"} Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.901388 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-n6hr2" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.906704 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e99b117c-49b1-4d38-abb8-1f0d397781c2-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ppwzx\" (UID: \"e99b117c-49b1-4d38-abb8-1f0d397781c2\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ppwzx" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.909649 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-762nw" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.931705 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9599k" event={"ID":"563b4e02-b3d3-4f24-b571-091d77871f9b","Type":"ContainerStarted","Data":"a2821625bb2844a1ed46dfb8751602cededa16ba53d7cf07ddb403088053d4e4"} Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.934636 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qk8b7" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.934980 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-x8kq2" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.935495 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-696rc" event={"ID":"633935e5-0232-4844-8f77-e87a7d4385cd","Type":"ContainerStarted","Data":"a61e97dbd0f4d0fcc9800c7cab7c6386089290b1ee8c617bd6569c745f6e36d9"} Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.950415 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333715-pgfd2"] Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.952246 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-z2lvl" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.958666 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:36 crc kubenswrapper[4719]: E1009 15:20:36.959850 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 15:20:37.459831355 +0000 UTC m=+142.969542710 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.967822 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-lblwv" event={"ID":"35e1fbe6-e210-4329-a27b-3341136d7dcd","Type":"ContainerStarted","Data":"970b351ff620528e1ba8050c7be20f902b2ed86d83bdf2734e36a85ec2becd21"} Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.967863 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-lblwv" 
event={"ID":"35e1fbe6-e210-4329-a27b-3341136d7dcd","Type":"ContainerStarted","Data":"4de81b59bd054f4d846c8ca65fb5dc7618e65c430bf7042f693f7c1d6b8d9e15"} Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.968097 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-lblwv" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.969113 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-22bgx" event={"ID":"cf975172-874f-418c-947b-6226e1662647","Type":"ContainerStarted","Data":"7448ea46573a19890740a8ca78f32bcfcfeab99085cf4bc3292b27f3c5e36491"} Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.970538 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57b4h"] Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.973381 4719 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-lblwv container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.973421 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-lblwv" podUID="35e1fbe6-e210-4329-a27b-3341136d7dcd" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.980199 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 
09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.980235 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 15:20:36 crc kubenswrapper[4719]: I1009 15:20:36.980955 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" event={"ID":"26813cbf-0ed2-460a-b36f-f1a8895e68ec","Type":"ContainerStarted","Data":"c7bdda87b6be0b91edb6312819400b08f4c510d8f48670a42254ff7f9a25f3f1"} Oct 09 15:20:37 crc kubenswrapper[4719]: I1009 15:20:37.036373 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ncvlk"] Oct 09 15:20:37 crc kubenswrapper[4719]: I1009 15:20:37.062987 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:37 crc kubenswrapper[4719]: E1009 15:20:37.064916 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:20:37.564899451 +0000 UTC m=+143.074610736 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:37 crc kubenswrapper[4719]: I1009 15:20:37.111419 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-fhfxg"] Oct 09 15:20:37 crc kubenswrapper[4719]: I1009 15:20:37.166079 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:37 crc kubenswrapper[4719]: I1009 15:20:37.166988 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ppwzx" Oct 09 15:20:37 crc kubenswrapper[4719]: E1009 15:20:37.167388 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 15:20:37.667341562 +0000 UTC m=+143.177052847 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:37 crc kubenswrapper[4719]: I1009 15:20:37.228904 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-hz62m"] Oct 09 15:20:37 crc kubenswrapper[4719]: I1009 15:20:37.246466 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-vdfqp" Oct 09 15:20:37 crc kubenswrapper[4719]: I1009 15:20:37.263573 4719 patch_prober.go:28] interesting pod/router-default-5444994796-vdfqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 15:20:37 crc kubenswrapper[4719]: [-]has-synced failed: reason withheld Oct 09 15:20:37 crc kubenswrapper[4719]: [+]process-running ok Oct 09 15:20:37 crc kubenswrapper[4719]: healthz check failed Oct 09 15:20:37 crc kubenswrapper[4719]: I1009 15:20:37.263679 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vdfqp" podUID="d2a62908-86f6-4b7f-9169-cb7a9ef1ece8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 15:20:37 crc kubenswrapper[4719]: I1009 15:20:37.267621 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:37 crc kubenswrapper[4719]: E1009 15:20:37.268294 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:20:37.768262574 +0000 UTC m=+143.277973859 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:37 crc kubenswrapper[4719]: I1009 15:20:37.369532 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:37 crc kubenswrapper[4719]: E1009 15:20:37.369803 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 15:20:37.869791146 +0000 UTC m=+143.379502421 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:37 crc kubenswrapper[4719]: I1009 15:20:37.470635 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:37 crc kubenswrapper[4719]: E1009 15:20:37.471404 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:20:37.97137781 +0000 UTC m=+143.481089095 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:37 crc kubenswrapper[4719]: I1009 15:20:37.564971 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dwt2p"] Oct 09 15:20:37 crc kubenswrapper[4719]: I1009 15:20:37.571777 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t7zd8"] Oct 09 15:20:37 crc kubenswrapper[4719]: I1009 15:20:37.574521 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:37 crc kubenswrapper[4719]: E1009 15:20:37.574860 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 15:20:38.074846384 +0000 UTC m=+143.584557669 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:37 crc kubenswrapper[4719]: I1009 15:20:37.622490 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ffm6c"] Oct 09 15:20:37 crc kubenswrapper[4719]: I1009 15:20:37.649571 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8v6g2"] Oct 09 15:20:37 crc kubenswrapper[4719]: I1009 15:20:37.651617 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x7wsc"] Oct 09 15:20:37 crc kubenswrapper[4719]: I1009 15:20:37.660650 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ff97c"] Oct 09 15:20:37 crc kubenswrapper[4719]: I1009 15:20:37.675995 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:37 crc kubenswrapper[4719]: E1009 15:20:37.676449 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-09 15:20:38.176431677 +0000 UTC m=+143.686142962 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:37 crc kubenswrapper[4719]: W1009 15:20:37.694976 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b1a8c7a_d66f_45fe_b870_5c0f38b38fc7.slice/crio-9040c5f0f2fbdaf36169f27fb223bbd93aae6a616f4c52e7363530c3e8860be0 WatchSource:0}: Error finding container 9040c5f0f2fbdaf36169f27fb223bbd93aae6a616f4c52e7363530c3e8860be0: Status 404 returned error can't find the container with id 9040c5f0f2fbdaf36169f27fb223bbd93aae6a616f4c52e7363530c3e8860be0 Oct 09 15:20:37 crc kubenswrapper[4719]: I1009 15:20:37.703093 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-762nw"] Oct 09 15:20:37 crc kubenswrapper[4719]: I1009 15:20:37.704823 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x6g6x"] Oct 09 15:20:37 crc kubenswrapper[4719]: W1009 15:20:37.727810 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod070fb955_7eaa_4ad0_a9a0_cda60314743f.slice/crio-e20f06b011696873b35d9515d3c1bd839ba412edd131b3cdcb0ac19ad3555253 WatchSource:0}: Error finding container e20f06b011696873b35d9515d3c1bd839ba412edd131b3cdcb0ac19ad3555253: Status 404 returned error can't find the container with id e20f06b011696873b35d9515d3c1bd839ba412edd131b3cdcb0ac19ad3555253 
Oct 09 15:20:37 crc kubenswrapper[4719]: W1009 15:20:37.730278 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10f4c85b_1896_4882_9ee0_8117ddf6a7a6.slice/crio-4c7d8a94b8dfc1324fadfa433eaeb0e78178baef086e55228f3a5c2ae3cdf5d0 WatchSource:0}: Error finding container 4c7d8a94b8dfc1324fadfa433eaeb0e78178baef086e55228f3a5c2ae3cdf5d0: Status 404 returned error can't find the container with id 4c7d8a94b8dfc1324fadfa433eaeb0e78178baef086e55228f3a5c2ae3cdf5d0 Oct 09 15:20:37 crc kubenswrapper[4719]: W1009 15:20:37.735753 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74cde4fb_7e17_40dc_8537_54b5ecb898d7.slice/crio-7131f0460a2ff186af35e7a3e4d57891ab7800f2d8590925eb3378e813e3907c WatchSource:0}: Error finding container 7131f0460a2ff186af35e7a3e4d57891ab7800f2d8590925eb3378e813e3907c: Status 404 returned error can't find the container with id 7131f0460a2ff186af35e7a3e4d57891ab7800f2d8590925eb3378e813e3907c Oct 09 15:20:37 crc kubenswrapper[4719]: I1009 15:20:37.751248 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-d88hv"] Oct 09 15:20:37 crc kubenswrapper[4719]: I1009 15:20:37.774541 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hf8sg"] Oct 09 15:20:37 crc kubenswrapper[4719]: I1009 15:20:37.775334 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-98tjn"] Oct 09 15:20:37 crc kubenswrapper[4719]: I1009 15:20:37.805004 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: 
\"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:37 crc kubenswrapper[4719]: E1009 15:20:37.805472 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 15:20:38.305459645 +0000 UTC m=+143.815170930 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:37 crc kubenswrapper[4719]: I1009 15:20:37.828596 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-z2lvl"] Oct 09 15:20:37 crc kubenswrapper[4719]: I1009 15:20:37.911281 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:37 crc kubenswrapper[4719]: E1009 15:20:37.911795 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:20:38.411778601 +0000 UTC m=+143.921489886 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.007775 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qk8b7"] Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.013178 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:38 crc kubenswrapper[4719]: E1009 15:20:38.013570 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 15:20:38.513555061 +0000 UTC m=+144.023266346 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:38 crc kubenswrapper[4719]: W1009 15:20:38.017265 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddaa41d9a_ad1b_4c1b_8e40_07e4f0d3b3e4.slice/crio-17dfad8f354f7501748f4118b4bd99db7386975d27701474f2be6f4e6cbe4e24 WatchSource:0}: Error finding container 17dfad8f354f7501748f4118b4bd99db7386975d27701474f2be6f4e6cbe4e24: Status 404 returned error can't find the container with id 17dfad8f354f7501748f4118b4bd99db7386975d27701474f2be6f4e6cbe4e24 Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.029128 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6ql7w" event={"ID":"b28184c2-4cb3-4fe7-9c69-3fcf55a0d0e0","Type":"ContainerStarted","Data":"a4e8afd9318bd8fb7fbb2b0f500bc57b227c6ed67210cbeaf827e8ca3888fcbe"} Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.062226 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8v6g2" event={"ID":"74cde4fb-7e17-40dc-8537-54b5ecb898d7","Type":"ContainerStarted","Data":"7131f0460a2ff186af35e7a3e4d57891ab7800f2d8590925eb3378e813e3907c"} Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.098151 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-xw899" 
event={"ID":"53e8b265-c7b0-4b11-bcb0-225ada8332ce","Type":"ContainerStarted","Data":"a61022fa05ab65c88886fdaff1d948ae8d214062bfabed613c77fd2b46c1f4b9"} Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.102227 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-qd6rx" event={"ID":"7b6a94c8-9172-472f-8d40-4c259b21c6b9","Type":"ContainerStarted","Data":"bb017fe40050349bbab18bb543d468b645832df8af8a9da74c1c6d2fec8b54e7"} Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.102260 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-qd6rx" event={"ID":"7b6a94c8-9172-472f-8d40-4c259b21c6b9","Type":"ContainerStarted","Data":"179aa9734ed975b4c80ae6aa0aa55cca32fb01b3df979a9604cb61ba57cdbfe8"} Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.107689 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-p7bmv" event={"ID":"4091b4ed-3afe-4bab-b41e-0bca5b6f58b0","Type":"ContainerStarted","Data":"40ba870a6e182c9e772779aff7b5e4078103a37b81d3db99a598593ce52eccc2"} Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.108548 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-p7bmv" Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.110506 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-x8kq2"] Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.110624 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57b4h" event={"ID":"f1c3d8c5-6a51-4c89-96e9-f7fb62e5685b","Type":"ContainerStarted","Data":"91538e1bd90f9f5bedafc214180dedb93d6b05d3374939a3751597450cbc20a1"} Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.114809 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:38 crc kubenswrapper[4719]: E1009 15:20:38.115134 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:20:38.615119424 +0000 UTC m=+144.124830709 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.116903 4719 patch_prober.go:28] interesting pod/downloads-7954f5f757-p7bmv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.116947 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-p7bmv" podUID="4091b4ed-3afe-4bab-b41e-0bca5b6f58b0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.125134 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-ffm6c" 
event={"ID":"070fb955-7eaa-4ad0-a9a0-cda60314743f","Type":"ContainerStarted","Data":"e20f06b011696873b35d9515d3c1bd839ba412edd131b3cdcb0ac19ad3555253"} Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.133821 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c2pph" event={"ID":"a643603c-791b-4f68-a320-b6a2bcabf91f","Type":"ContainerStarted","Data":"6da1bc66a8f7a7042ed59a909c73756b77a84746d2d639c6e2de719070f48657"} Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.143550 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333715-pgfd2" event={"ID":"24507f61-1a02-438c-b1ca-82515867e605","Type":"ContainerStarted","Data":"8dbe68ee06f635859c0f96ae0629d94ea2c2d8db1acb53020adb461514eb0de4"} Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.159360 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-j74ct" event={"ID":"c895d97a-7287-49a8-9ac5-bc87e8bcf297","Type":"ContainerStarted","Data":"51924a57cc9932369b7f474ecdb0e3d189eaaa35d8c93809d09ba0a50706fc03"} Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.168594 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q2vmh" event={"ID":"2aba1eb4-b085-48e7-941b-403c160fb3f4","Type":"ContainerStarted","Data":"04ec078a761d19fd98d99f59d5876546e5e3d767db937a748833347f50e5e439"} Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.180413 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-79ms4" event={"ID":"312693cc-d986-462d-ac50-f4e44dbc8cf1","Type":"ContainerStarted","Data":"da63ecbcdc152da1e3768a40ade10b946d5e58f2b1d74b3d2ac656f7aa1db1fc"} Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.181435 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-79ms4" Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.184344 4719 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-79ms4 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:5443/healthz\": dial tcp 10.217.0.26:5443: connect: connection refused" start-of-body= Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.184402 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-79ms4" podUID="312693cc-d986-462d-ac50-f4e44dbc8cf1" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.26:5443/healthz\": dial tcp 10.217.0.26:5443: connect: connection refused" Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.185594 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ff97c" event={"ID":"bc529b0d-9e56-4818-b017-7d035de4f2da","Type":"ContainerStarted","Data":"3a9ce373028ed03c4d857982c998e4dfd737340038ae709812708a4e41249433"} Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.189377 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dwt2p" event={"ID":"2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7","Type":"ContainerStarted","Data":"9040c5f0f2fbdaf36169f27fb223bbd93aae6a616f4c52e7363530c3e8860be0"} Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.192993 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-75q5v" event={"ID":"f07d2126-7037-4b5c-aa67-4d09bf873e07","Type":"ContainerStarted","Data":"cdc0259a95f36fb7ec08786455207d572904dc6cc99e30382d53d952ab9c77a7"} Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.195559 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2d4s7" event={"ID":"4b4724c8-6007-4df3-b822-42d08ea33fde","Type":"ContainerStarted","Data":"c1290f777ca499da2a74115797a2cc7f92baf7a5e4f09d2992fe5acdb638fed9"} Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.204423 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-k5w5x" event={"ID":"bc178477-fdbe-4189-80e7-5ceba0100dbd","Type":"ContainerStarted","Data":"3e89998b10a583e1596f1508f7a8f71e7ce929d424313f4f07a43876f2f43f0c"} Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.216851 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:38 crc kubenswrapper[4719]: E1009 15:20:38.219762 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 15:20:38.719750185 +0000 UTC m=+144.229461470 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.230759 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-762nw" event={"ID":"4f7457b2-c2c6-444c-8bc9-563f02df2183","Type":"ContainerStarted","Data":"73a9537357437de425ddc8d9d714e4efb65250257f5b352f5e7d36dcc3f77d75"} Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.233511 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ppwzx"] Oct 09 15:20:38 crc kubenswrapper[4719]: W1009 15:20:38.234729 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf679e74b_ce46_427e_928b_305b4579ca44.slice/crio-d4895f9cb3dcb6de7f235919cd64b233c0d0b8e2cbdc2e11fdaa8fd94a3b91d1 WatchSource:0}: Error finding container d4895f9cb3dcb6de7f235919cd64b233c0d0b8e2cbdc2e11fdaa8fd94a3b91d1: Status 404 returned error can't find the container with id d4895f9cb3dcb6de7f235919cd64b233c0d0b8e2cbdc2e11fdaa8fd94a3b91d1 Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.237709 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ncvlk" event={"ID":"33659bcf-6d50-402b-a0da-7610749b535c","Type":"ContainerStarted","Data":"2abfc1fa4bae23f5703b5c0874aa4f61afa0e72cce9a93bb6c4afe7a09818bbe"} Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.245997 4719 generic.go:334] "Generic (PLEG): container finished" 
podID="26813cbf-0ed2-460a-b36f-f1a8895e68ec" containerID="f401497341742a19481b5fb37ad5fbd4638636dd324fa5539868ea9cf5a1dc03" exitCode=0 Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.246060 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" event={"ID":"26813cbf-0ed2-460a-b36f-f1a8895e68ec","Type":"ContainerDied","Data":"f401497341742a19481b5fb37ad5fbd4638636dd324fa5539868ea9cf5a1dc03"} Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.252021 4719 patch_prober.go:28] interesting pod/router-default-5444994796-vdfqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 15:20:38 crc kubenswrapper[4719]: [-]has-synced failed: reason withheld Oct 09 15:20:38 crc kubenswrapper[4719]: [+]process-running ok Oct 09 15:20:38 crc kubenswrapper[4719]: healthz check failed Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.256476 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vdfqp" podUID="d2a62908-86f6-4b7f-9169-cb7a9ef1ece8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.255668 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x7wsc" event={"ID":"78762e7d-4683-45ec-b6ac-c646dc1eb8e8","Type":"ContainerStarted","Data":"3666ccc93b1225372ce917d5f4b698c233ee8ee71d1bf51515e992c5f7c663cc"} Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.281609 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t7zd8" 
event={"ID":"10f4c85b-1896-4882-9ee0-8117ddf6a7a6","Type":"ContainerStarted","Data":"4c7d8a94b8dfc1324fadfa433eaeb0e78178baef086e55228f3a5c2ae3cdf5d0"} Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.289223 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9jzcj" event={"ID":"19fada05-fea3-4a67-a138-1cf460e31a2b","Type":"ContainerStarted","Data":"df9e9a639ab669260220d1fde15f5c3d4fd5d6d39b292a0ece36c8b913a695e8"} Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.296860 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" event={"ID":"2a4743d0-e646-45ab-a225-816c0d99246a","Type":"ContainerStarted","Data":"0d62ffa5d2b292d56e53efa5265a5a666660a9a2c862e3a12faf8ee2f5e7331c"} Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.309065 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-98tjn" event={"ID":"8384cfeb-4188-4b7e-b371-3b9e1032781f","Type":"ContainerStarted","Data":"90e9c97635118cb12c123df6909d90e15387484c28e26467dc6a27ca50aa7009"} Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.318268 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:38 crc kubenswrapper[4719]: E1009 15:20:38.337805 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:20:38.837785549 +0000 UTC m=+144.347496834 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.345293 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-22bgx" event={"ID":"cf975172-874f-418c-947b-6226e1662647","Type":"ContainerStarted","Data":"ae5bc5e24d2a2b9a0ddf2655261b7a34f27e9f13563bc8f329b61c3bfa210dcc"} Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.347216 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-22bgx" Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.355782 4719 patch_prober.go:28] interesting pod/console-operator-58897d9998-22bgx container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.355858 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-22bgx" podUID="cf975172-874f-418c-947b-6226e1662647" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.382286 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-lblwv" podStartSLOduration=124.382260142 
podStartE2EDuration="2m4.382260142s" podCreationTimestamp="2025-10-09 15:18:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:38.360813641 +0000 UTC m=+143.870524956" watchObservedRunningTime="2025-10-09 15:20:38.382260142 +0000 UTC m=+143.891971437" Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.382980 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9599k" event={"ID":"563b4e02-b3d3-4f24-b571-091d77871f9b","Type":"ContainerStarted","Data":"fe987262f5233b04b0ecd15fad65acb63f5d899875f64dcdabd2f27b7bb583ce"} Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.403840 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rwhcb" event={"ID":"c06c731d-cb94-49f2-8afb-899c7c6e7724","Type":"ContainerStarted","Data":"61d007c207535c18c24bb726f5a501553f4dfc85b9e0e1b2812a8247af687744"} Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.404465 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rwhcb" Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.412513 4719 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-rwhcb container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.412578 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rwhcb" podUID="c06c731d-cb94-49f2-8afb-899c7c6e7724" containerName="route-controller-manager" probeResult="failure" output="Get 
\"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.414333 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hf8sg" event={"ID":"4f2f05e5-cf9a-4f30-8b2e-8a2b87f26ad4","Type":"ContainerStarted","Data":"391ed06b185dae824c3d3411d65dde3ae63f593ef63584343df21585f95adec0"} Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.420102 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.422338 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-n6hr2" event={"ID":"3df7532d-cc40-4430-aa65-d2abaaa9f2a1","Type":"ContainerStarted","Data":"8446994e7b38e6e2d106322440f9d41cde7205a112a5bd9be95cf9c74e8fd81b"} Oct 09 15:20:38 crc kubenswrapper[4719]: E1009 15:20:38.425168 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 15:20:38.925149014 +0000 UTC m=+144.434860489 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.428310 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hz62m" event={"ID":"97eeda6e-c63f-4b48-a8bd-05e673d79117","Type":"ContainerStarted","Data":"f4be172e3d191e1690d08ea3a09b48a913124a8ec111cd39dd79c8528b02d10b"} Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.434068 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-d88hv" event={"ID":"36c83e0c-8489-4fc6-9343-9fa35c1e75cb","Type":"ContainerStarted","Data":"6cdc082b878432a72dee285c8a677f6a98c9d9d04f6f1eb851e1f90b968eedcb"} Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.437210 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x6g6x" event={"ID":"7ff0f956-c475-4d8b-9ef0-8dab346c53f6","Type":"ContainerStarted","Data":"f15fb8ce179252fee54779e43e369380796efc10edb5a9fcf3d3b0d9316f28b1"} Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.441267 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fhfxg" event={"ID":"816f9c0b-05db-4dfc-8edb-ba2e0a14d43d","Type":"ContainerStarted","Data":"f94e53a86b69d379f52a6c4589dd89b3b1d5d0ae04fa354fd6b5f27c1e1e8b67"} Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.475882 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-879f6c89f-lblwv" Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.520908 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:38 crc kubenswrapper[4719]: E1009 15:20:38.522087 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:20:39.022065947 +0000 UTC m=+144.531777232 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.622117 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:38 crc kubenswrapper[4719]: E1009 15:20:38.622484 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-10-09 15:20:39.122456633 +0000 UTC m=+144.632167918 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.723803 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:38 crc kubenswrapper[4719]: E1009 15:20:38.724106 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:20:39.224080907 +0000 UTC m=+144.733792182 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.724290 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:38 crc kubenswrapper[4719]: E1009 15:20:38.724959 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 15:20:39.224951175 +0000 UTC m=+144.734662460 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.731142 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-vdfqp" podStartSLOduration=123.731122184 podStartE2EDuration="2m3.731122184s" podCreationTimestamp="2025-10-09 15:18:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:38.727564579 +0000 UTC m=+144.237275864" watchObservedRunningTime="2025-10-09 15:20:38.731122184 +0000 UTC m=+144.240833469" Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.828300 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:38 crc kubenswrapper[4719]: E1009 15:20:38.828850 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:20:39.328828203 +0000 UTC m=+144.838539498 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.828888 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:38 crc kubenswrapper[4719]: E1009 15:20:38.832389 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 15:20:39.332375147 +0000 UTC m=+144.842086422 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.934587 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:38 crc kubenswrapper[4719]: E1009 15:20:38.935251 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:20:39.435231851 +0000 UTC m=+144.944943136 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:38 crc kubenswrapper[4719]: I1009 15:20:38.968330 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-k5w5x" podStartSLOduration=124.968307467 podStartE2EDuration="2m4.968307467s" podCreationTimestamp="2025-10-09 15:18:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:38.964964479 +0000 UTC m=+144.474675784" watchObservedRunningTime="2025-10-09 15:20:38.968307467 +0000 UTC m=+144.478018752" Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.024805 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-79ms4" podStartSLOduration=124.024783557 podStartE2EDuration="2m4.024783557s" podCreationTimestamp="2025-10-09 15:18:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:39.023861257 +0000 UTC m=+144.533572542" watchObservedRunningTime="2025-10-09 15:20:39.024783557 +0000 UTC m=+144.534494842" Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.026558 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-22bgx" podStartSLOduration=125.026532043 podStartE2EDuration="2m5.026532043s" podCreationTimestamp="2025-10-09 15:18:34 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:39.002623463 +0000 UTC m=+144.512334748" watchObservedRunningTime="2025-10-09 15:20:39.026532043 +0000 UTC m=+144.536243328" Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.041159 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:39 crc kubenswrapper[4719]: E1009 15:20:39.041449 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 15:20:39.541438204 +0000 UTC m=+145.051149479 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.081061 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2d4s7" podStartSLOduration=124.08104248 podStartE2EDuration="2m4.08104248s" podCreationTimestamp="2025-10-09 15:18:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:39.064322051 +0000 UTC m=+144.574033336" watchObservedRunningTime="2025-10-09 15:20:39.08104248 +0000 UTC m=+144.590753795" Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.096261 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-75q5v" podStartSLOduration=124.096242639 podStartE2EDuration="2m4.096242639s" podCreationTimestamp="2025-10-09 15:18:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:39.094752792 +0000 UTC m=+144.604464077" watchObservedRunningTime="2025-10-09 15:20:39.096242639 +0000 UTC m=+144.605953924" Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.135742 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-qd6rx" podStartSLOduration=124.135720742 podStartE2EDuration="2m4.135720742s" podCreationTimestamp="2025-10-09 15:18:35 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:39.133139769 +0000 UTC m=+144.642851074" watchObservedRunningTime="2025-10-09 15:20:39.135720742 +0000 UTC m=+144.645432037" Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.147938 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:39 crc kubenswrapper[4719]: E1009 15:20:39.148132 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:20:39.648105261 +0000 UTC m=+145.157816536 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.148247 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:39 crc kubenswrapper[4719]: E1009 15:20:39.148655 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 15:20:39.648640099 +0000 UTC m=+145.158351394 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.263018 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:39 crc kubenswrapper[4719]: E1009 15:20:39.263490 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:20:39.763471219 +0000 UTC m=+145.273182514 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.276735 4719 patch_prober.go:28] interesting pod/router-default-5444994796-vdfqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 15:20:39 crc kubenswrapper[4719]: [-]has-synced failed: reason withheld Oct 09 15:20:39 crc kubenswrapper[4719]: [+]process-running ok Oct 09 15:20:39 crc kubenswrapper[4719]: healthz check failed Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.276784 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vdfqp" podUID="d2a62908-86f6-4b7f-9169-cb7a9ef1ece8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.285412 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-p7bmv" podStartSLOduration=125.285389825 podStartE2EDuration="2m5.285389825s" podCreationTimestamp="2025-10-09 15:18:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:39.282623736 +0000 UTC m=+144.792335041" watchObservedRunningTime="2025-10-09 15:20:39.285389825 +0000 UTC m=+144.795101120" Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.366084 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:39 crc kubenswrapper[4719]: E1009 15:20:39.366726 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 15:20:39.866714106 +0000 UTC m=+145.376425391 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.408771 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-j74ct" podStartSLOduration=125.40875294 podStartE2EDuration="2m5.40875294s" podCreationTimestamp="2025-10-09 15:18:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:39.344434748 +0000 UTC m=+144.854146043" watchObservedRunningTime="2025-10-09 15:20:39.40875294 +0000 UTC m=+144.918464225" Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.467058 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c2pph" podStartSLOduration=125.467038429 
podStartE2EDuration="2m5.467038429s" podCreationTimestamp="2025-10-09 15:18:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:39.444924806 +0000 UTC m=+144.954636101" watchObservedRunningTime="2025-10-09 15:20:39.467038429 +0000 UTC m=+144.976749714" Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.479324 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:39 crc kubenswrapper[4719]: E1009 15:20:39.479888 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:20:39.979873462 +0000 UTC m=+145.489584747 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.507524 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q2vmh" podStartSLOduration=125.507508592 podStartE2EDuration="2m5.507508592s" podCreationTimestamp="2025-10-09 15:18:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:39.507032217 +0000 UTC m=+145.016743502" watchObservedRunningTime="2025-10-09 15:20:39.507508592 +0000 UTC m=+145.017219877" Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.511368 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t7zd8" event={"ID":"10f4c85b-1896-4882-9ee0-8117ddf6a7a6","Type":"ContainerStarted","Data":"9fe1f82680206a5ccc69a4584832c0419effb032337a46c3e1dba15c93c454d6"} Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.527016 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ppwzx" event={"ID":"e99b117c-49b1-4d38-abb8-1f0d397781c2","Type":"ContainerStarted","Data":"90fa13cfc09a9cf2625f70bc6497d6c351c195758f72c30645aef7d30e5ca579"} Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.580713 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:39 crc kubenswrapper[4719]: E1009 15:20:39.581219 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 15:20:40.081205027 +0000 UTC m=+145.590916312 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.581990 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dwt2p" event={"ID":"2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7","Type":"ContainerStarted","Data":"06db45c647b85950356f4849b194448ab307dd055ace46838c4317b0ed9d9479"} Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.584686 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-dwt2p" Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.585117 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9jzcj" podStartSLOduration=124.585099693 podStartE2EDuration="2m4.585099693s" podCreationTimestamp="2025-10-09 15:18:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:39.582810859 +0000 UTC m=+145.092522164" watchObservedRunningTime="2025-10-09 15:20:39.585099693 +0000 UTC m=+145.094810978" Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.585963 4719 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-dwt2p container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.586046 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-dwt2p" podUID="2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.611737 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-z2lvl" event={"ID":"daa41d9a-ad1b-4c1b-8e40-07e4f0d3b3e4","Type":"ContainerStarted","Data":"17dfad8f354f7501748f4118b4bd99db7386975d27701474f2be6f4e6cbe4e24"} Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.637240 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rwhcb" podStartSLOduration=124.637224582 podStartE2EDuration="2m4.637224582s" podCreationTimestamp="2025-10-09 15:18:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:39.623010414 +0000 UTC m=+145.132721709" watchObservedRunningTime="2025-10-09 15:20:39.637224582 +0000 UTC m=+145.146935867" Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.657478 4719 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-ffm6c" event={"ID":"070fb955-7eaa-4ad0-a9a0-cda60314743f","Type":"ContainerStarted","Data":"50bef77a437a41bc4571b6e0af6865f279e8aca2d1e25c6dc447090457a28b5b"} Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.665539 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fhfxg" event={"ID":"816f9c0b-05db-4dfc-8edb-ba2e0a14d43d","Type":"ContainerStarted","Data":"711c4163ff44a0b3374cb086c4b9a713bd9d2fe68003e47c9fe2eb6fd224e072"} Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.671197 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-dwt2p" podStartSLOduration=124.671178246 podStartE2EDuration="2m4.671178246s" podCreationTimestamp="2025-10-09 15:18:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:39.662661762 +0000 UTC m=+145.172373047" watchObservedRunningTime="2025-10-09 15:20:39.671178246 +0000 UTC m=+145.180889531" Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.683855 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:39 crc kubenswrapper[4719]: E1009 15:20:39.685580 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:20:40.18556364 +0000 UTC m=+145.695274925 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.706618 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hf8sg" event={"ID":"4f2f05e5-cf9a-4f30-8b2e-8a2b87f26ad4","Type":"ContainerStarted","Data":"3f7df55caeb65c410de0d89ce3ec749b07723e54aff1740a7bbe279bdb6f26fc"} Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.708612 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" event={"ID":"2a4743d0-e646-45ab-a225-816c0d99246a","Type":"ContainerStarted","Data":"368651fe1e7a4e823d7ef7fc1036b74d1ed08a186e6c6f7bbee4c752bb869142"} Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.709744 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.710640 4719 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-db9tz container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.21:6443/healthz\": dial tcp 10.217.0.21:6443: connect: connection refused" start-of-body= Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.710730 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" podUID="2a4743d0-e646-45ab-a225-816c0d99246a" containerName="oauth-openshift" probeResult="failure" output="Get 
\"https://10.217.0.21:6443/healthz\": dial tcp 10.217.0.21:6443: connect: connection refused" Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.722205 4719 generic.go:334] "Generic (PLEG): container finished" podID="563b4e02-b3d3-4f24-b571-091d77871f9b" containerID="fe987262f5233b04b0ecd15fad65acb63f5d899875f64dcdabd2f27b7bb583ce" exitCode=0 Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.722298 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9599k" event={"ID":"563b4e02-b3d3-4f24-b571-091d77871f9b","Type":"ContainerDied","Data":"fe987262f5233b04b0ecd15fad65acb63f5d899875f64dcdabd2f27b7bb583ce"} Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.735412 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" podStartSLOduration=125.735398306 podStartE2EDuration="2m5.735398306s" podCreationTimestamp="2025-10-09 15:18:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:39.734892069 +0000 UTC m=+145.244603354" watchObservedRunningTime="2025-10-09 15:20:39.735398306 +0000 UTC m=+145.245109591" Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.736438 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-ffm6c" podStartSLOduration=124.73643339 podStartE2EDuration="2m4.73643339s" podCreationTimestamp="2025-10-09 15:18:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:39.694089445 +0000 UTC m=+145.203800740" watchObservedRunningTime="2025-10-09 15:20:39.73643339 +0000 UTC m=+145.246144675" Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.754833 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-canary/ingress-canary-d88hv" event={"ID":"36c83e0c-8489-4fc6-9343-9fa35c1e75cb","Type":"ContainerStarted","Data":"36eb89b7aae11fdd31ada9a436336b6cd0197b063d661ebed470064fa72ce041"} Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.760540 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-xw899" event={"ID":"53e8b265-c7b0-4b11-bcb0-225ada8332ce","Type":"ContainerStarted","Data":"31b9e2d083c4e53910e7d35b0546638b104259a1cbf034b0e7b0f2b96c5f9027"} Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.767390 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x6g6x" event={"ID":"7ff0f956-c475-4d8b-9ef0-8dab346c53f6","Type":"ContainerStarted","Data":"7af91852369a7895617d8d152a368054ea57c34883927f54cdfed3aab9777cb2"} Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.769902 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-n6hr2" event={"ID":"3df7532d-cc40-4430-aa65-d2abaaa9f2a1","Type":"ContainerStarted","Data":"bd13eea333c3b276f83a0f1acd9492a94d7cfba02bf1ef44da8309fa1ca29e66"} Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.774178 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qk8b7" event={"ID":"732da19d-17b5-4af0-b9bc-13c30ee6b5f5","Type":"ContainerStarted","Data":"76b7f85d2a10371775394a82fa8a5bab79aaa6c9bce1ef88af89db125d5cfbad"} Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.786158 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:39 crc kubenswrapper[4719]: E1009 15:20:39.789689 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 15:20:40.289676105 +0000 UTC m=+145.799387390 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.800575 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6ql7w" event={"ID":"b28184c2-4cb3-4fe7-9c69-3fcf55a0d0e0","Type":"ContainerStarted","Data":"88fe3dc1218719d000e577a382d0e438cfe7f63bc986e72e7709ae1173e5ed2e"} Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.823973 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-xw899" podStartSLOduration=125.823952329 podStartE2EDuration="2m5.823952329s" podCreationTimestamp="2025-10-09 15:18:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:39.804928416 +0000 UTC m=+145.314639701" watchObservedRunningTime="2025-10-09 15:20:39.823952329 +0000 UTC m=+145.333663604" Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.832756 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ff97c" event={"ID":"bc529b0d-9e56-4818-b017-7d035de4f2da","Type":"ContainerStarted","Data":"285e5f13d79e5d65f8fd856a99a54ec25832402e027960882cdaf8048767a975"} Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.833448 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-d88hv" podStartSLOduration=6.833438685 podStartE2EDuration="6.833438685s" podCreationTimestamp="2025-10-09 15:20:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:39.783807856 +0000 UTC m=+145.293519141" watchObservedRunningTime="2025-10-09 15:20:39.833438685 +0000 UTC m=+145.343149970" Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.835241 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qk8b7" podStartSLOduration=124.835233423 podStartE2EDuration="2m4.835233423s" podCreationTimestamp="2025-10-09 15:18:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:39.832049681 +0000 UTC m=+145.341760976" watchObservedRunningTime="2025-10-09 15:20:39.835233423 +0000 UTC m=+145.344944748" Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.866627 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x7wsc" event={"ID":"78762e7d-4683-45ec-b6ac-c646dc1eb8e8","Type":"ContainerStarted","Data":"7cd3120e5cd2d34e004f1323f9268e27a02073843d9db2744f1e8413415f2197"} Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.879300 4719 generic.go:334] "Generic (PLEG): container finished" podID="633935e5-0232-4844-8f77-e87a7d4385cd" 
containerID="af575cf514989b9066a78dd765098185dcfb5204539a187192352c26d97f8605" exitCode=0 Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.879411 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-696rc" event={"ID":"633935e5-0232-4844-8f77-e87a7d4385cd","Type":"ContainerDied","Data":"af575cf514989b9066a78dd765098185dcfb5204539a187192352c26d97f8605"} Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.880185 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-n6hr2" podStartSLOduration=6.880177141 podStartE2EDuration="6.880177141s" podCreationTimestamp="2025-10-09 15:20:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:39.88015945 +0000 UTC m=+145.389870735" watchObservedRunningTime="2025-10-09 15:20:39.880177141 +0000 UTC m=+145.389888426" Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.889395 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:39 crc kubenswrapper[4719]: E1009 15:20:39.892096 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:20:40.392076195 +0000 UTC m=+145.901787480 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.893975 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333715-pgfd2" event={"ID":"24507f61-1a02-438c-b1ca-82515867e605","Type":"ContainerStarted","Data":"8de3d9cf280680a9dc7deeace9f3fd771589dbc230018023ab1c30b8f7b9f35f"} Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.923097 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57b4h" event={"ID":"f1c3d8c5-6a51-4c89-96e9-f7fb62e5685b","Type":"ContainerStarted","Data":"27d3bee4ac4b24c6e952c883c85dd40b5a5205df18a51e2df6b31fb60466c96a"} Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.924043 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57b4h" Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.925650 4719 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-57b4h container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.925709 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57b4h" podUID="f1c3d8c5-6a51-4c89-96e9-f7fb62e5685b" containerName="olm-operator" probeResult="failure" 
output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.964720 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hz62m" event={"ID":"97eeda6e-c63f-4b48-a8bd-05e673d79117","Type":"ContainerStarted","Data":"445314c194b8c04ba92ac17acbc5fa05b9b4fe1dfb74222cc8f2d97d81ba24ab"} Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.965630 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x7wsc" podStartSLOduration=124.965611835 podStartE2EDuration="2m4.965611835s" podCreationTimestamp="2025-10-09 15:18:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:39.907678407 +0000 UTC m=+145.417389712" watchObservedRunningTime="2025-10-09 15:20:39.965611835 +0000 UTC m=+145.475323120" Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.966731 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-x8kq2" event={"ID":"f679e74b-ce46-427e-928b-305b4579ca44","Type":"ContainerStarted","Data":"d4895f9cb3dcb6de7f235919cd64b233c0d0b8e2cbdc2e11fdaa8fd94a3b91d1"} Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.971901 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ncvlk" event={"ID":"33659bcf-6d50-402b-a0da-7610749b535c","Type":"ContainerStarted","Data":"57fa209ea2f1fa2f878f97b8cd5f52d631d27c8e57f3e22709342aaa6a320807"} Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.977444 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ncvlk" Oct 09 15:20:39 crc 
kubenswrapper[4719]: I1009 15:20:39.978427 4719 patch_prober.go:28] interesting pod/downloads-7954f5f757-p7bmv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.978480 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-p7bmv" podUID="4091b4ed-3afe-4bab-b41e-0bca5b6f58b0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.978522 4719 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-ncvlk container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.978548 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ncvlk" podUID="33659bcf-6d50-402b-a0da-7610749b535c" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" Oct 09 15:20:39 crc kubenswrapper[4719]: I1009 15:20:39.992368 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:39 crc kubenswrapper[4719]: E1009 15:20:39.996129 4719 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 15:20:40.496111977 +0000 UTC m=+146.005823262 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:40 crc kubenswrapper[4719]: I1009 15:20:40.007100 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6ql7w" podStartSLOduration=126.007080991 podStartE2EDuration="2m6.007080991s" podCreationTimestamp="2025-10-09 15:18:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:40.002001367 +0000 UTC m=+145.511712662" watchObservedRunningTime="2025-10-09 15:20:40.007080991 +0000 UTC m=+145.516792276" Oct 09 15:20:40 crc kubenswrapper[4719]: I1009 15:20:40.033300 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29333715-pgfd2" podStartSLOduration=125.033281625 podStartE2EDuration="2m5.033281625s" podCreationTimestamp="2025-10-09 15:18:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:40.031625521 +0000 UTC m=+145.541336806" watchObservedRunningTime="2025-10-09 15:20:40.033281625 +0000 UTC m=+145.542992910" Oct 09 15:20:40 crc kubenswrapper[4719]: I1009 15:20:40.043245 4719 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rwhcb" Oct 09 15:20:40 crc kubenswrapper[4719]: I1009 15:20:40.075328 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ncvlk" podStartSLOduration=125.07531494 podStartE2EDuration="2m5.07531494s" podCreationTimestamp="2025-10-09 15:18:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:40.071953311 +0000 UTC m=+145.581664596" watchObservedRunningTime="2025-10-09 15:20:40.07531494 +0000 UTC m=+145.585026225" Oct 09 15:20:40 crc kubenswrapper[4719]: I1009 15:20:40.094067 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57b4h" podStartSLOduration=125.094052633 podStartE2EDuration="2m5.094052633s" podCreationTimestamp="2025-10-09 15:18:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:40.091873563 +0000 UTC m=+145.601584848" watchObservedRunningTime="2025-10-09 15:20:40.094052633 +0000 UTC m=+145.603763908" Oct 09 15:20:40 crc kubenswrapper[4719]: I1009 15:20:40.095415 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:40 crc kubenswrapper[4719]: E1009 15:20:40.095581 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:20:40.595552911 +0000 UTC m=+146.105264196 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:40 crc kubenswrapper[4719]: I1009 15:20:40.095974 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:40 crc kubenswrapper[4719]: E1009 15:20:40.106836 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 15:20:40.606822374 +0000 UTC m=+146.116533659 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:40 crc kubenswrapper[4719]: I1009 15:20:40.199968 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:40 crc kubenswrapper[4719]: E1009 15:20:40.200431 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:20:40.700416081 +0000 UTC m=+146.210127366 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:40 crc kubenswrapper[4719]: I1009 15:20:40.255393 4719 patch_prober.go:28] interesting pod/router-default-5444994796-vdfqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 15:20:40 crc kubenswrapper[4719]: [-]has-synced failed: reason withheld Oct 09 15:20:40 crc kubenswrapper[4719]: [+]process-running ok Oct 09 15:20:40 crc kubenswrapper[4719]: healthz check failed Oct 09 15:20:40 crc kubenswrapper[4719]: I1009 15:20:40.255747 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vdfqp" podUID="d2a62908-86f6-4b7f-9169-cb7a9ef1ece8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 15:20:40 crc kubenswrapper[4719]: I1009 15:20:40.310413 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:40 crc kubenswrapper[4719]: E1009 15:20:40.310710 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-09 15:20:40.810700085 +0000 UTC m=+146.320411370 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:40 crc kubenswrapper[4719]: I1009 15:20:40.337939 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-22bgx" Oct 09 15:20:40 crc kubenswrapper[4719]: I1009 15:20:40.411888 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:40 crc kubenswrapper[4719]: E1009 15:20:40.412228 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:20:40.912214046 +0000 UTC m=+146.421925331 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:40 crc kubenswrapper[4719]: I1009 15:20:40.515217 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:40 crc kubenswrapper[4719]: E1009 15:20:40.515790 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 15:20:41.015778323 +0000 UTC m=+146.525489598 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:40 crc kubenswrapper[4719]: I1009 15:20:40.619853 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:40 crc kubenswrapper[4719]: E1009 15:20:40.620208 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:20:41.120192399 +0000 UTC m=+146.629903684 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:40 crc kubenswrapper[4719]: I1009 15:20:40.721865 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:40 crc kubenswrapper[4719]: E1009 15:20:40.722264 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 15:20:41.222253248 +0000 UTC m=+146.731964533 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:40 crc kubenswrapper[4719]: I1009 15:20:40.823230 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:40 crc kubenswrapper[4719]: E1009 15:20:40.823434 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:20:41.323403847 +0000 UTC m=+146.833115132 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:40 crc kubenswrapper[4719]: I1009 15:20:40.823875 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:40 crc kubenswrapper[4719]: E1009 15:20:40.824198 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 15:20:41.324186152 +0000 UTC m=+146.833897437 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:40 crc kubenswrapper[4719]: I1009 15:20:40.924860 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:40 crc kubenswrapper[4719]: E1009 15:20:40.925094 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:20:41.425044142 +0000 UTC m=+146.934755427 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:40 crc kubenswrapper[4719]: I1009 15:20:40.969657 4719 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-79ms4 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 09 15:20:40 crc kubenswrapper[4719]: I1009 15:20:40.970051 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-79ms4" podUID="312693cc-d986-462d-ac50-f4e44dbc8cf1" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.26:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 09 15:20:40 crc kubenswrapper[4719]: I1009 15:20:40.980037 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hz62m" event={"ID":"97eeda6e-c63f-4b48-a8bd-05e673d79117","Type":"ContainerStarted","Data":"e71fa6fb9f2997d879af573fac24de41a2ce494e2a424e1f61bd958cac70b469"} Oct 09 15:20:40 crc kubenswrapper[4719]: I1009 15:20:40.992875 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-x8kq2" event={"ID":"f679e74b-ce46-427e-928b-305b4579ca44","Type":"ContainerStarted","Data":"e828e3e323f4ba40906b7091ae21cbe8a006c2f46f005e3721048b536f71596c"} Oct 09 15:20:41 crc 
kubenswrapper[4719]: I1009 15:20:41.002094 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ppwzx" event={"ID":"e99b117c-49b1-4d38-abb8-1f0d397781c2","Type":"ContainerStarted","Data":"9442171e5e7d6a74d9e4a9f315aaf1b62d4b5a3d64c49518b8b5383687b667a2"} Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.005141 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" event={"ID":"26813cbf-0ed2-460a-b36f-f1a8895e68ec","Type":"ContainerStarted","Data":"743096d21b414680cf77c53f8d6279e957e4ac1059dbb6806def8fa3e3b408dc"} Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.015849 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-z2lvl" event={"ID":"daa41d9a-ad1b-4c1b-8e40-07e4f0d3b3e4","Type":"ContainerStarted","Data":"1aaef59f3fef10bbdededf76361dc73fa4297b1e938e44bf36e11b34fe9628ae"} Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.020788 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x6g6x" event={"ID":"7ff0f956-c475-4d8b-9ef0-8dab346c53f6","Type":"ContainerStarted","Data":"82bea310dac87bf2ac51c3a079b6d7407992edfd0d0c10b218bff967adfbc78e"} Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.020930 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x6g6x" Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.025990 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:41 crc kubenswrapper[4719]: E1009 15:20:41.026340 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 15:20:41.526328146 +0000 UTC m=+147.036039431 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.026526 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fhfxg" event={"ID":"816f9c0b-05db-4dfc-8edb-ba2e0a14d43d","Type":"ContainerStarted","Data":"d934b3bbc2e62f9269cccdf3debb7bbfb9ba7e589543d6a36f2973d793fc357f"} Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.030700 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-762nw" event={"ID":"4f7457b2-c2c6-444c-8bc9-563f02df2183","Type":"ContainerStarted","Data":"fdb3718dc8d164a67f93b473641f3d4af17fe30d5dc48dd47b8f4123f2974202"} Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.030742 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-762nw" event={"ID":"4f7457b2-c2c6-444c-8bc9-563f02df2183","Type":"ContainerStarted","Data":"396ba55f342280a0979254a2591e3c5406e1147d4a21c5a786d4e07ad231f122"} Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.052885 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-dns/dns-default-762nw" Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.066555 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-qk8b7" event={"ID":"732da19d-17b5-4af0-b9bc-13c30ee6b5f5","Type":"ContainerStarted","Data":"d9e62cb6357a4b1b911320c9f060f286b7bff1451ef5af03b1347a980dabffa2"} Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.088072 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8v6g2" event={"ID":"74cde4fb-7e17-40dc-8537-54b5ecb898d7","Type":"ContainerStarted","Data":"51bdb13b8b1daa04f0e4a337bd27748fa5c35247f98e8fd02a1a9cd815a8586c"} Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.118384 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hz62m" podStartSLOduration=126.118368322 podStartE2EDuration="2m6.118368322s" podCreationTimestamp="2025-10-09 15:18:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:41.052289303 +0000 UTC m=+146.562000588" watchObservedRunningTime="2025-10-09 15:20:41.118368322 +0000 UTC m=+146.628079607" Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.118859 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ppwzx" podStartSLOduration=126.118851738 podStartE2EDuration="2m6.118851738s" podCreationTimestamp="2025-10-09 15:18:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:41.118718333 +0000 UTC m=+146.628429628" watchObservedRunningTime="2025-10-09 15:20:41.118851738 +0000 UTC m=+146.628563023" Oct 09 15:20:41 
crc kubenswrapper[4719]: I1009 15:20:41.125852 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9599k" event={"ID":"563b4e02-b3d3-4f24-b571-091d77871f9b","Type":"ContainerStarted","Data":"2502d106522763a8ba19b762edfdffa6ecf8fda5fbe664afcf0d0b98b554cb10"} Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.126460 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9599k" Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.127720 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:41 crc kubenswrapper[4719]: E1009 15:20:41.129335 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:20:41.629311975 +0000 UTC m=+147.139023260 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.185985 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-696rc" event={"ID":"633935e5-0232-4844-8f77-e87a7d4385cd","Type":"ContainerStarted","Data":"3e9aeaa8ab874aee1c9cf471228d8ca309ddd7e5b9c3d2067596a5a79a303458"} Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.186021 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hf8sg" event={"ID":"4f2f05e5-cf9a-4f30-8b2e-8a2b87f26ad4","Type":"ContainerStarted","Data":"015a33a7f165b8b2bbc60739430747ee24c6000c808b8fac07475e218bde2078"} Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.205497 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-98tjn" event={"ID":"8384cfeb-4188-4b7e-b371-3b9e1032781f","Type":"ContainerStarted","Data":"c2d0030089574967844a21375f6334ae5319e138d2ff49fe937dbd3741f489d7"} Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.228341 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x6g6x" podStartSLOduration=126.228327066 podStartE2EDuration="2m6.228327066s" podCreationTimestamp="2025-10-09 15:18:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:41.227744697 +0000 UTC m=+146.737455992" 
watchObservedRunningTime="2025-10-09 15:20:41.228327066 +0000 UTC m=+146.738038351" Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.228519 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-z2lvl" podStartSLOduration=127.228514212 podStartE2EDuration="2m7.228514212s" podCreationTimestamp="2025-10-09 15:18:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:41.191733876 +0000 UTC m=+146.701445171" watchObservedRunningTime="2025-10-09 15:20:41.228514212 +0000 UTC m=+146.738225497" Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.231777 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ff97c" event={"ID":"bc529b0d-9e56-4818-b017-7d035de4f2da","Type":"ContainerStarted","Data":"ef2e290a876c9569af3520fb9b790fd99796cec97934847d5986f94d7c0b8341"} Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.233277 4719 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-dwt2p container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.233329 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-dwt2p" podUID="2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.234599 4719 patch_prober.go:28] interesting pod/downloads-7954f5f757-p7bmv container/download-server namespace/openshift-console: Readiness probe 
status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.234639 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-p7bmv" podUID="4091b4ed-3afe-4bab-b41e-0bca5b6f58b0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.234967 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:41 crc kubenswrapper[4719]: E1009 15:20:41.236522 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 15:20:41.736506669 +0000 UTC m=+147.246217964 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.249843 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-79ms4" Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.250300 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ncvlk" Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.255679 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57b4h" Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.255982 4719 patch_prober.go:28] interesting pod/router-default-5444994796-vdfqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 15:20:41 crc kubenswrapper[4719]: [-]has-synced failed: reason withheld Oct 09 15:20:41 crc kubenswrapper[4719]: [+]process-running ok Oct 09 15:20:41 crc kubenswrapper[4719]: healthz check failed Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.256018 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vdfqp" podUID="d2a62908-86f6-4b7f-9169-cb7a9ef1ece8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.262141 4719 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fhfxg" podStartSLOduration=126.262109474 podStartE2EDuration="2m6.262109474s" podCreationTimestamp="2025-10-09 15:18:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:41.260876594 +0000 UTC m=+146.770587879" watchObservedRunningTime="2025-10-09 15:20:41.262109474 +0000 UTC m=+146.771820769" Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.304436 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-762nw" podStartSLOduration=8.304418428 podStartE2EDuration="8.304418428s" podCreationTimestamp="2025-10-09 15:20:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:41.30262282 +0000 UTC m=+146.812334105" watchObservedRunningTime="2025-10-09 15:20:41.304418428 +0000 UTC m=+146.814129703" Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.337029 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:41 crc kubenswrapper[4719]: E1009 15:20:41.338832 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:20:41.838816456 +0000 UTC m=+147.348527741 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.383957 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9599k" podStartSLOduration=127.38393823 podStartE2EDuration="2m7.38393823s" podCreationTimestamp="2025-10-09 15:18:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:41.352635931 +0000 UTC m=+146.862347216" watchObservedRunningTime="2025-10-09 15:20:41.38393823 +0000 UTC m=+146.893649515" Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.422394 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-696rc" podStartSLOduration=126.422377988 podStartE2EDuration="2m6.422377988s" podCreationTimestamp="2025-10-09 15:18:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:41.421312364 +0000 UTC m=+146.931023649" watchObservedRunningTime="2025-10-09 15:20:41.422377988 +0000 UTC m=+146.932089273" Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.439328 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: 
\"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:41 crc kubenswrapper[4719]: E1009 15:20:41.439718 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 15:20:41.939705827 +0000 UTC m=+147.449417112 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.448624 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8v6g2" podStartSLOduration=126.448579233 podStartE2EDuration="2m6.448579233s" podCreationTimestamp="2025-10-09 15:18:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:41.448362476 +0000 UTC m=+146.958073781" watchObservedRunningTime="2025-10-09 15:20:41.448579233 +0000 UTC m=+146.958290518" Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.514842 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.534026 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t7zd8" podStartSLOduration=126.534010016 
podStartE2EDuration="2m6.534010016s" podCreationTimestamp="2025-10-09 15:18:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:41.514594291 +0000 UTC m=+147.024305616" watchObservedRunningTime="2025-10-09 15:20:41.534010016 +0000 UTC m=+147.043721301" Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.535876 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ff97c" podStartSLOduration=126.535870286 podStartE2EDuration="2m6.535870286s" podCreationTimestamp="2025-10-09 15:18:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:41.531464693 +0000 UTC m=+147.041175988" watchObservedRunningTime="2025-10-09 15:20:41.535870286 +0000 UTC m=+147.045581571" Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.542008 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:41 crc kubenswrapper[4719]: E1009 15:20:41.542655 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:20:42.042637104 +0000 UTC m=+147.552348389 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.563914 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hf8sg" podStartSLOduration=126.563901129 podStartE2EDuration="2m6.563901129s" podCreationTimestamp="2025-10-09 15:18:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:41.561667638 +0000 UTC m=+147.071378923" watchObservedRunningTime="2025-10-09 15:20:41.563901129 +0000 UTC m=+147.073612414" Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.646985 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:41 crc kubenswrapper[4719]: E1009 15:20:41.647623 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 15:20:42.147608337 +0000 UTC m=+147.657319622 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.750170 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:41 crc kubenswrapper[4719]: E1009 15:20:41.750665 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:20:42.250637927 +0000 UTC m=+147.760349212 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.851624 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:41 crc kubenswrapper[4719]: E1009 15:20:41.851940 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 15:20:42.351928701 +0000 UTC m=+147.861639986 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.952500 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:41 crc kubenswrapper[4719]: E1009 15:20:41.952677 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:20:42.452651936 +0000 UTC m=+147.962363221 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:41 crc kubenswrapper[4719]: I1009 15:20:41.952902 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:41 crc kubenswrapper[4719]: E1009 15:20:41.953185 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 15:20:42.453177174 +0000 UTC m=+147.962888459 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.053984 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:42 crc kubenswrapper[4719]: E1009 15:20:42.054255 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:20:42.55421301 +0000 UTC m=+148.063924295 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.054462 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:42 crc kubenswrapper[4719]: E1009 15:20:42.054783 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 15:20:42.554768318 +0000 UTC m=+148.064479603 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.134146 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8726g"] Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.135092 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8726g" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.139842 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.148871 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8726g"] Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.155316 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:42 crc kubenswrapper[4719]: E1009 15:20:42.155867 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:20:42.655849834 +0000 UTC m=+148.165561119 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.237661 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-98tjn" event={"ID":"8384cfeb-4188-4b7e-b371-3b9e1032781f","Type":"ContainerStarted","Data":"2d8f5cede423b203b27bf35b8d01dfade05c5c0fb423472f2310bd42e4705831"} Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.240769 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" event={"ID":"26813cbf-0ed2-460a-b36f-f1a8895e68ec","Type":"ContainerStarted","Data":"f86027dfb695bd91e6c2a37e26b8fb010aac7351f4cb29864b53ee57c52bab80"} Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.242993 4719 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-dwt2p container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.243038 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-dwt2p" podUID="2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.250600 4719 patch_prober.go:28] interesting pod/router-default-5444994796-vdfqp 
container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 15:20:42 crc kubenswrapper[4719]: [-]has-synced failed: reason withheld Oct 09 15:20:42 crc kubenswrapper[4719]: [+]process-running ok Oct 09 15:20:42 crc kubenswrapper[4719]: healthz check failed Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.250659 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vdfqp" podUID="d2a62908-86f6-4b7f-9169-cb7a9ef1ece8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.257142 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkmtq\" (UniqueName: \"kubernetes.io/projected/ed97f513-40b6-4273-b6a5-9f9f5150e4cd-kube-api-access-fkmtq\") pod \"community-operators-8726g\" (UID: \"ed97f513-40b6-4273-b6a5-9f9f5150e4cd\") " pod="openshift-marketplace/community-operators-8726g" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.257210 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.257243 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed97f513-40b6-4273-b6a5-9f9f5150e4cd-utilities\") pod \"community-operators-8726g\" (UID: \"ed97f513-40b6-4273-b6a5-9f9f5150e4cd\") " pod="openshift-marketplace/community-operators-8726g" Oct 09 15:20:42 crc 
kubenswrapper[4719]: I1009 15:20:42.257282 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed97f513-40b6-4273-b6a5-9f9f5150e4cd-catalog-content\") pod \"community-operators-8726g\" (UID: \"ed97f513-40b6-4273-b6a5-9f9f5150e4cd\") " pod="openshift-marketplace/community-operators-8726g" Oct 09 15:20:42 crc kubenswrapper[4719]: E1009 15:20:42.257603 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 15:20:42.757591833 +0000 UTC m=+148.267303118 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.272717 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-98tjn" podStartSLOduration=127.27270181 podStartE2EDuration="2m7.27270181s" podCreationTimestamp="2025-10-09 15:18:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:42.271801101 +0000 UTC m=+147.781512386" watchObservedRunningTime="2025-10-09 15:20:42.27270181 +0000 UTC m=+147.782413095" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.311613 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" 
podStartSLOduration=128.311594123 podStartE2EDuration="2m8.311594123s" podCreationTimestamp="2025-10-09 15:18:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:42.309770334 +0000 UTC m=+147.819481629" watchObservedRunningTime="2025-10-09 15:20:42.311594123 +0000 UTC m=+147.821305408" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.326446 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-458dz"] Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.327335 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-458dz" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.330114 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.352201 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-458dz"] Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.359072 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:42 crc kubenswrapper[4719]: E1009 15:20:42.359278 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:20:42.859247569 +0000 UTC m=+148.368958854 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.359424 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkmtq\" (UniqueName: \"kubernetes.io/projected/ed97f513-40b6-4273-b6a5-9f9f5150e4cd-kube-api-access-fkmtq\") pod \"community-operators-8726g\" (UID: \"ed97f513-40b6-4273-b6a5-9f9f5150e4cd\") " pod="openshift-marketplace/community-operators-8726g" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.359609 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.359709 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed97f513-40b6-4273-b6a5-9f9f5150e4cd-utilities\") pod \"community-operators-8726g\" (UID: \"ed97f513-40b6-4273-b6a5-9f9f5150e4cd\") " pod="openshift-marketplace/community-operators-8726g" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.359984 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed97f513-40b6-4273-b6a5-9f9f5150e4cd-catalog-content\") pod \"community-operators-8726g\" (UID: 
\"ed97f513-40b6-4273-b6a5-9f9f5150e4cd\") " pod="openshift-marketplace/community-operators-8726g" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.362596 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed97f513-40b6-4273-b6a5-9f9f5150e4cd-utilities\") pod \"community-operators-8726g\" (UID: \"ed97f513-40b6-4273-b6a5-9f9f5150e4cd\") " pod="openshift-marketplace/community-operators-8726g" Oct 09 15:20:42 crc kubenswrapper[4719]: E1009 15:20:42.363708 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 15:20:42.863688102 +0000 UTC m=+148.373399387 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.364417 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed97f513-40b6-4273-b6a5-9f9f5150e4cd-catalog-content\") pod \"community-operators-8726g\" (UID: \"ed97f513-40b6-4273-b6a5-9f9f5150e4cd\") " pod="openshift-marketplace/community-operators-8726g" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.414501 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkmtq\" (UniqueName: \"kubernetes.io/projected/ed97f513-40b6-4273-b6a5-9f9f5150e4cd-kube-api-access-fkmtq\") pod \"community-operators-8726g\" (UID: 
\"ed97f513-40b6-4273-b6a5-9f9f5150e4cd\") " pod="openshift-marketplace/community-operators-8726g" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.442483 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9599k" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.462405 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:42 crc kubenswrapper[4719]: E1009 15:20:42.462595 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:20:42.962577349 +0000 UTC m=+148.472288634 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.462763 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz6xj\" (UniqueName: \"kubernetes.io/projected/08d47ff5-80a6-4395-8481-2e7f1c2c1409-kube-api-access-fz6xj\") pod \"certified-operators-458dz\" (UID: \"08d47ff5-80a6-4395-8481-2e7f1c2c1409\") " pod="openshift-marketplace/certified-operators-458dz" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.463078 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08d47ff5-80a6-4395-8481-2e7f1c2c1409-utilities\") pod \"certified-operators-458dz\" (UID: \"08d47ff5-80a6-4395-8481-2e7f1c2c1409\") " pod="openshift-marketplace/certified-operators-458dz" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.463234 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08d47ff5-80a6-4395-8481-2e7f1c2c1409-catalog-content\") pod \"certified-operators-458dz\" (UID: \"08d47ff5-80a6-4395-8481-2e7f1c2c1409\") " pod="openshift-marketplace/certified-operators-458dz" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.463425 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:42 crc kubenswrapper[4719]: E1009 15:20:42.463715 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 15:20:42.963705504 +0000 UTC m=+148.473416789 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.495293 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8726g" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.525570 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jr2j6"] Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.526521 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jr2j6" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.550118 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jr2j6"] Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.567033 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.567435 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz6xj\" (UniqueName: \"kubernetes.io/projected/08d47ff5-80a6-4395-8481-2e7f1c2c1409-kube-api-access-fz6xj\") pod \"certified-operators-458dz\" (UID: \"08d47ff5-80a6-4395-8481-2e7f1c2c1409\") " pod="openshift-marketplace/certified-operators-458dz" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.567483 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08d47ff5-80a6-4395-8481-2e7f1c2c1409-utilities\") pod \"certified-operators-458dz\" (UID: \"08d47ff5-80a6-4395-8481-2e7f1c2c1409\") " pod="openshift-marketplace/certified-operators-458dz" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.567528 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08d47ff5-80a6-4395-8481-2e7f1c2c1409-catalog-content\") pod \"certified-operators-458dz\" (UID: \"08d47ff5-80a6-4395-8481-2e7f1c2c1409\") " pod="openshift-marketplace/certified-operators-458dz" Oct 09 15:20:42 crc kubenswrapper[4719]: E1009 15:20:42.569445 4719 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:20:43.069419532 +0000 UTC m=+148.579130817 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.569624 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08d47ff5-80a6-4395-8481-2e7f1c2c1409-utilities\") pod \"certified-operators-458dz\" (UID: \"08d47ff5-80a6-4395-8481-2e7f1c2c1409\") " pod="openshift-marketplace/certified-operators-458dz" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.570255 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08d47ff5-80a6-4395-8481-2e7f1c2c1409-catalog-content\") pod \"certified-operators-458dz\" (UID: \"08d47ff5-80a6-4395-8481-2e7f1c2c1409\") " pod="openshift-marketplace/certified-operators-458dz" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.611156 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz6xj\" (UniqueName: \"kubernetes.io/projected/08d47ff5-80a6-4395-8481-2e7f1c2c1409-kube-api-access-fz6xj\") pod \"certified-operators-458dz\" (UID: \"08d47ff5-80a6-4395-8481-2e7f1c2c1409\") " pod="openshift-marketplace/certified-operators-458dz" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.644646 4719 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-458dz" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.668463 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c3c0e14-9adc-47ec-af46-d0779a6c6e1d-catalog-content\") pod \"community-operators-jr2j6\" (UID: \"7c3c0e14-9adc-47ec-af46-d0779a6c6e1d\") " pod="openshift-marketplace/community-operators-jr2j6" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.668508 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c3c0e14-9adc-47ec-af46-d0779a6c6e1d-utilities\") pod \"community-operators-jr2j6\" (UID: \"7c3c0e14-9adc-47ec-af46-d0779a6c6e1d\") " pod="openshift-marketplace/community-operators-jr2j6" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.668576 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.668615 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72pjn\" (UniqueName: \"kubernetes.io/projected/7c3c0e14-9adc-47ec-af46-d0779a6c6e1d-kube-api-access-72pjn\") pod \"community-operators-jr2j6\" (UID: \"7c3c0e14-9adc-47ec-af46-d0779a6c6e1d\") " pod="openshift-marketplace/community-operators-jr2j6" Oct 09 15:20:42 crc kubenswrapper[4719]: E1009 15:20:42.668924 4719 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 15:20:43.168911087 +0000 UTC m=+148.678622372 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.755596 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wb65c"] Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.756694 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wb65c" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.771803 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.772047 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72pjn\" (UniqueName: \"kubernetes.io/projected/7c3c0e14-9adc-47ec-af46-d0779a6c6e1d-kube-api-access-72pjn\") pod \"community-operators-jr2j6\" (UID: \"7c3c0e14-9adc-47ec-af46-d0779a6c6e1d\") " pod="openshift-marketplace/community-operators-jr2j6" Oct 09 15:20:42 crc kubenswrapper[4719]: E1009 15:20:42.772108 4719 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:20:43.272084032 +0000 UTC m=+148.781795317 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.772312 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c3c0e14-9adc-47ec-af46-d0779a6c6e1d-catalog-content\") pod \"community-operators-jr2j6\" (UID: \"7c3c0e14-9adc-47ec-af46-d0779a6c6e1d\") " pod="openshift-marketplace/community-operators-jr2j6" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.772387 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c3c0e14-9adc-47ec-af46-d0779a6c6e1d-utilities\") pod \"community-operators-jr2j6\" (UID: \"7c3c0e14-9adc-47ec-af46-d0779a6c6e1d\") " pod="openshift-marketplace/community-operators-jr2j6" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.773626 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c3c0e14-9adc-47ec-af46-d0779a6c6e1d-utilities\") pod \"community-operators-jr2j6\" (UID: \"7c3c0e14-9adc-47ec-af46-d0779a6c6e1d\") " pod="openshift-marketplace/community-operators-jr2j6" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.775292 4719 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c3c0e14-9adc-47ec-af46-d0779a6c6e1d-catalog-content\") pod \"community-operators-jr2j6\" (UID: \"7c3c0e14-9adc-47ec-af46-d0779a6c6e1d\") " pod="openshift-marketplace/community-operators-jr2j6" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.801892 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72pjn\" (UniqueName: \"kubernetes.io/projected/7c3c0e14-9adc-47ec-af46-d0779a6c6e1d-kube-api-access-72pjn\") pod \"community-operators-jr2j6\" (UID: \"7c3c0e14-9adc-47ec-af46-d0779a6c6e1d\") " pod="openshift-marketplace/community-operators-jr2j6" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.836266 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wb65c"] Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.875059 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.875116 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6e7a7e4-2eff-468f-b764-b1b73c4285f4-catalog-content\") pod \"certified-operators-wb65c\" (UID: \"b6e7a7e4-2eff-468f-b764-b1b73c4285f4\") " pod="openshift-marketplace/certified-operators-wb65c" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.875171 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrv4p\" (UniqueName: 
\"kubernetes.io/projected/b6e7a7e4-2eff-468f-b764-b1b73c4285f4-kube-api-access-nrv4p\") pod \"certified-operators-wb65c\" (UID: \"b6e7a7e4-2eff-468f-b764-b1b73c4285f4\") " pod="openshift-marketplace/certified-operators-wb65c" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.875234 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6e7a7e4-2eff-468f-b764-b1b73c4285f4-utilities\") pod \"certified-operators-wb65c\" (UID: \"b6e7a7e4-2eff-468f-b764-b1b73c4285f4\") " pod="openshift-marketplace/certified-operators-wb65c" Oct 09 15:20:42 crc kubenswrapper[4719]: E1009 15:20:42.875526 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 15:20:43.375512915 +0000 UTC m=+148.885224200 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.876397 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jr2j6" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.978912 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:42 crc kubenswrapper[4719]: E1009 15:20:42.979107 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:20:43.479078042 +0000 UTC m=+148.988789347 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.979179 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrv4p\" (UniqueName: \"kubernetes.io/projected/b6e7a7e4-2eff-468f-b764-b1b73c4285f4-kube-api-access-nrv4p\") pod \"certified-operators-wb65c\" (UID: \"b6e7a7e4-2eff-468f-b764-b1b73c4285f4\") " pod="openshift-marketplace/certified-operators-wb65c" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.979233 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.979276 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6e7a7e4-2eff-468f-b764-b1b73c4285f4-utilities\") pod \"certified-operators-wb65c\" (UID: \"b6e7a7e4-2eff-468f-b764-b1b73c4285f4\") " pod="openshift-marketplace/certified-operators-wb65c" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.979299 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.979319 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.979335 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6e7a7e4-2eff-468f-b764-b1b73c4285f4-catalog-content\") pod \"certified-operators-wb65c\" (UID: \"b6e7a7e4-2eff-468f-b764-b1b73c4285f4\") " pod="openshift-marketplace/certified-operators-wb65c" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.979715 4719 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6e7a7e4-2eff-468f-b764-b1b73c4285f4-catalog-content\") pod \"certified-operators-wb65c\" (UID: \"b6e7a7e4-2eff-468f-b764-b1b73c4285f4\") " pod="openshift-marketplace/certified-operators-wb65c" Oct 09 15:20:42 crc kubenswrapper[4719]: E1009 15:20:42.980888 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 15:20:43.48086889 +0000 UTC m=+148.990580175 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.981207 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6e7a7e4-2eff-468f-b764-b1b73c4285f4-utilities\") pod \"certified-operators-wb65c\" (UID: \"b6e7a7e4-2eff-468f-b764-b1b73c4285f4\") " pod="openshift-marketplace/certified-operators-wb65c" Oct 09 15:20:42 crc kubenswrapper[4719]: I1009 15:20:42.987290 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:20:42 crc 
kubenswrapper[4719]: I1009 15:20:42.993560 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:20:43 crc kubenswrapper[4719]: I1009 15:20:43.008999 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrv4p\" (UniqueName: \"kubernetes.io/projected/b6e7a7e4-2eff-468f-b764-b1b73c4285f4-kube-api-access-nrv4p\") pod \"certified-operators-wb65c\" (UID: \"b6e7a7e4-2eff-468f-b764-b1b73c4285f4\") " pod="openshift-marketplace/certified-operators-wb65c" Oct 09 15:20:43 crc kubenswrapper[4719]: I1009 15:20:43.038558 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8726g"] Oct 09 15:20:43 crc kubenswrapper[4719]: I1009 15:20:43.082416 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:43 crc kubenswrapper[4719]: E1009 15:20:43.082569 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:20:43.582544986 +0000 UTC m=+149.092256271 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:43 crc kubenswrapper[4719]: I1009 15:20:43.082632 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:20:43 crc kubenswrapper[4719]: I1009 15:20:43.082677 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:43 crc kubenswrapper[4719]: I1009 15:20:43.082701 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:20:43 crc kubenswrapper[4719]: E1009 15:20:43.082978 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-10-09 15:20:43.58296746 +0000 UTC m=+149.092678745 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:43 crc kubenswrapper[4719]: I1009 15:20:43.090088 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 15:20:43 crc kubenswrapper[4719]: I1009 15:20:43.090744 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wb65c" Oct 09 15:20:43 crc kubenswrapper[4719]: I1009 15:20:43.091312 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:20:43 crc kubenswrapper[4719]: I1009 15:20:43.092527 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:20:43 crc kubenswrapper[4719]: I1009 15:20:43.145083 4719 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" 
path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 09 15:20:43 crc kubenswrapper[4719]: I1009 15:20:43.184973 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:43 crc kubenswrapper[4719]: E1009 15:20:43.185416 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 15:20:43.685395041 +0000 UTC m=+149.195106336 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:43 crc kubenswrapper[4719]: I1009 15:20:43.257057 4719 patch_prober.go:28] interesting pod/router-default-5444994796-vdfqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 15:20:43 crc kubenswrapper[4719]: [-]has-synced failed: reason withheld Oct 09 15:20:43 crc kubenswrapper[4719]: [+]process-running ok Oct 09 15:20:43 crc kubenswrapper[4719]: healthz check failed Oct 09 15:20:43 crc kubenswrapper[4719]: I1009 15:20:43.257108 4719 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-vdfqp" podUID="d2a62908-86f6-4b7f-9169-cb7a9ef1ece8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 15:20:43 crc kubenswrapper[4719]: I1009 15:20:43.274038 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-x8kq2" event={"ID":"f679e74b-ce46-427e-928b-305b4579ca44","Type":"ContainerStarted","Data":"b82452ecc69fbecc5073a01f292d2c4c4347b8534292ae8b5813074c695bc502"} Oct 09 15:20:43 crc kubenswrapper[4719]: I1009 15:20:43.279119 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8726g" event={"ID":"ed97f513-40b6-4273-b6a5-9f9f5150e4cd","Type":"ContainerStarted","Data":"d9225dff3af8bc39d3903a75602bf4f012ea5889485a9b723131af923dff3e47"} Oct 09 15:20:43 crc kubenswrapper[4719]: I1009 15:20:43.280633 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jr2j6"] Oct 09 15:20:43 crc kubenswrapper[4719]: I1009 15:20:43.286311 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:43 crc kubenswrapper[4719]: E1009 15:20:43.286685 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 15:20:43.786672745 +0000 UTC m=+149.296384030 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cqrnr" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 15:20:43 crc kubenswrapper[4719]: I1009 15:20:43.353731 4719 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-09T15:20:43.145106212Z","Handler":null,"Name":""} Oct 09 15:20:43 crc kubenswrapper[4719]: I1009 15:20:43.359892 4719 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 09 15:20:43 crc kubenswrapper[4719]: I1009 15:20:43.359919 4719 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 09 15:20:43 crc kubenswrapper[4719]: I1009 15:20:43.380636 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:20:43 crc kubenswrapper[4719]: I1009 15:20:43.386990 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 15:20:43 crc kubenswrapper[4719]: I1009 15:20:43.389991 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-458dz"] Oct 09 15:20:43 crc kubenswrapper[4719]: I1009 15:20:43.392165 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 09 15:20:43 crc kubenswrapper[4719]: I1009 15:20:43.394097 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 15:20:43 crc kubenswrapper[4719]: I1009 15:20:43.490728 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:43 crc kubenswrapper[4719]: I1009 15:20:43.515001 4719 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. 
Skipping MountDevice... Oct 09 15:20:43 crc kubenswrapper[4719]: I1009 15:20:43.515276 4719 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:43 crc kubenswrapper[4719]: I1009 15:20:43.555717 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cqrnr\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:43 crc kubenswrapper[4719]: I1009 15:20:43.573936 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wb65c"] Oct 09 15:20:43 crc kubenswrapper[4719]: I1009 15:20:43.847455 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:43 crc kubenswrapper[4719]: W1009 15:20:43.923166 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-ceab5bea5978d361ca79714c7c8d84cdffe6ab0167d1691801226522707ec7c0 WatchSource:0}: Error finding container ceab5bea5978d361ca79714c7c8d84cdffe6ab0167d1691801226522707ec7c0: Status 404 returned error can't find the container with id ceab5bea5978d361ca79714c7c8d84cdffe6ab0167d1691801226522707ec7c0 Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.120640 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nzwhh"] Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.121989 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nzwhh" Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.125105 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.142508 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nzwhh"] Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.156924 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cqrnr"] Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.207336 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84e55955-c37c-4897-ab18-f71812f3ccff-utilities\") pod \"redhat-marketplace-nzwhh\" (UID: \"84e55955-c37c-4897-ab18-f71812f3ccff\") " pod="openshift-marketplace/redhat-marketplace-nzwhh" Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.207426 
4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84e55955-c37c-4897-ab18-f71812f3ccff-catalog-content\") pod \"redhat-marketplace-nzwhh\" (UID: \"84e55955-c37c-4897-ab18-f71812f3ccff\") " pod="openshift-marketplace/redhat-marketplace-nzwhh" Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.207456 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zhbb\" (UniqueName: \"kubernetes.io/projected/84e55955-c37c-4897-ab18-f71812f3ccff-kube-api-access-6zhbb\") pod \"redhat-marketplace-nzwhh\" (UID: \"84e55955-c37c-4897-ab18-f71812f3ccff\") " pod="openshift-marketplace/redhat-marketplace-nzwhh" Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.250070 4719 patch_prober.go:28] interesting pod/router-default-5444994796-vdfqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 15:20:44 crc kubenswrapper[4719]: [-]has-synced failed: reason withheld Oct 09 15:20:44 crc kubenswrapper[4719]: [+]process-running ok Oct 09 15:20:44 crc kubenswrapper[4719]: healthz check failed Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.250154 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vdfqp" podUID="d2a62908-86f6-4b7f-9169-cb7a9ef1ece8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.287218 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b4cd7c1d73b8671a89a66eb865ba4daa66e24aa93f3af08cf5b613de94d04468"} Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 
15:20:44.287300 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ceab5bea5978d361ca79714c7c8d84cdffe6ab0167d1691801226522707ec7c0"} Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.289103 4719 generic.go:334] "Generic (PLEG): container finished" podID="7c3c0e14-9adc-47ec-af46-d0779a6c6e1d" containerID="951425d60be53159e62068ddf26998b12dadc7c9dba4747729e3d9c09a357faf" exitCode=0 Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.289147 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jr2j6" event={"ID":"7c3c0e14-9adc-47ec-af46-d0779a6c6e1d","Type":"ContainerDied","Data":"951425d60be53159e62068ddf26998b12dadc7c9dba4747729e3d9c09a357faf"} Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.289532 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jr2j6" event={"ID":"7c3c0e14-9adc-47ec-af46-d0779a6c6e1d","Type":"ContainerStarted","Data":"12ed7ebd8cf2758f73be552d9e34b09f140cb28f7e283d91595ff06451197635"} Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.291589 4719 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.292211 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" event={"ID":"92f1494f-b7f7-4e94-90ce-132cc3a14a62","Type":"ContainerStarted","Data":"ffdb6ed3d78a73e8b850ec7354ead2b7cc4ffbff6a4b837f4ce909549944a38f"} Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.294014 4719 generic.go:334] "Generic (PLEG): container finished" podID="b6e7a7e4-2eff-468f-b764-b1b73c4285f4" containerID="202fcfe5e5b21b7f8e8b23374809fb626392a26b3ee94912a17717b3cb138518" exitCode=0 Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 
15:20:44.294061 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wb65c" event={"ID":"b6e7a7e4-2eff-468f-b764-b1b73c4285f4","Type":"ContainerDied","Data":"202fcfe5e5b21b7f8e8b23374809fb626392a26b3ee94912a17717b3cb138518"} Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.294109 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wb65c" event={"ID":"b6e7a7e4-2eff-468f-b764-b1b73c4285f4","Type":"ContainerStarted","Data":"b0b4e21f6ba6656a3000556c4340e99265dc16bfd49348c38df76e423dcf3441"} Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.295545 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"50efdc0f1a0d222589becd41e9e16f5d6a3bb5cc816290c759815945033618fb"} Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.295566 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"7de4edbd539cabebe3af429708d098ea11998c313869e2df0c2d29005a4881f5"} Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.295878 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.301512 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-x8kq2" event={"ID":"f679e74b-ce46-427e-928b-305b4579ca44","Type":"ContainerStarted","Data":"aa977ffbdc7e2dd7f109347bc88eafd0944244aa127d18eb63813e0122036d92"} Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.301593 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-x8kq2" 
event={"ID":"f679e74b-ce46-427e-928b-305b4579ca44","Type":"ContainerStarted","Data":"f3ee4e6cf2028dbc96986637c6265c2a56a1d5c86864e216bdb3947f7f737b2c"} Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.303739 4719 generic.go:334] "Generic (PLEG): container finished" podID="ed97f513-40b6-4273-b6a5-9f9f5150e4cd" containerID="5c19562fc43a92244a4db189ca1ec2c82fc57f3b907c12c9e60537ce625f3e12" exitCode=0 Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.303798 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8726g" event={"ID":"ed97f513-40b6-4273-b6a5-9f9f5150e4cd","Type":"ContainerDied","Data":"5c19562fc43a92244a4db189ca1ec2c82fc57f3b907c12c9e60537ce625f3e12"} Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.305266 4719 generic.go:334] "Generic (PLEG): container finished" podID="08d47ff5-80a6-4395-8481-2e7f1c2c1409" containerID="77437c01f6a56cdf406d96b76e103b566616312482bd0c803ef940280a4659a8" exitCode=0 Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.305358 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-458dz" event={"ID":"08d47ff5-80a6-4395-8481-2e7f1c2c1409","Type":"ContainerDied","Data":"77437c01f6a56cdf406d96b76e103b566616312482bd0c803ef940280a4659a8"} Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.305393 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-458dz" event={"ID":"08d47ff5-80a6-4395-8481-2e7f1c2c1409","Type":"ContainerStarted","Data":"ca09dd2c6d40653c165ba28617d1792e3e5a247980ce7e48efd1620b05ba2f0d"} Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.308018 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84e55955-c37c-4897-ab18-f71812f3ccff-catalog-content\") pod \"redhat-marketplace-nzwhh\" (UID: \"84e55955-c37c-4897-ab18-f71812f3ccff\") " 
pod="openshift-marketplace/redhat-marketplace-nzwhh" Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.308059 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zhbb\" (UniqueName: \"kubernetes.io/projected/84e55955-c37c-4897-ab18-f71812f3ccff-kube-api-access-6zhbb\") pod \"redhat-marketplace-nzwhh\" (UID: \"84e55955-c37c-4897-ab18-f71812f3ccff\") " pod="openshift-marketplace/redhat-marketplace-nzwhh" Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.308161 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84e55955-c37c-4897-ab18-f71812f3ccff-utilities\") pod \"redhat-marketplace-nzwhh\" (UID: \"84e55955-c37c-4897-ab18-f71812f3ccff\") " pod="openshift-marketplace/redhat-marketplace-nzwhh" Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.308496 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84e55955-c37c-4897-ab18-f71812f3ccff-catalog-content\") pod \"redhat-marketplace-nzwhh\" (UID: \"84e55955-c37c-4897-ab18-f71812f3ccff\") " pod="openshift-marketplace/redhat-marketplace-nzwhh" Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.308575 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84e55955-c37c-4897-ab18-f71812f3ccff-utilities\") pod \"redhat-marketplace-nzwhh\" (UID: \"84e55955-c37c-4897-ab18-f71812f3ccff\") " pod="openshift-marketplace/redhat-marketplace-nzwhh" Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.309016 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"1668e78facd0163b62232b858cd7ea5a1094c64845931e72e2744d10d72bc07f"} Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 
15:20:44.309063 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"fde4b38b91b0c07bdde1012335b48401394d8c457067edf702388c3ece83c547"} Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.335183 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zhbb\" (UniqueName: \"kubernetes.io/projected/84e55955-c37c-4897-ab18-f71812f3ccff-kube-api-access-6zhbb\") pod \"redhat-marketplace-nzwhh\" (UID: \"84e55955-c37c-4897-ab18-f71812f3ccff\") " pod="openshift-marketplace/redhat-marketplace-nzwhh" Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.367071 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-x8kq2" podStartSLOduration=11.367049709 podStartE2EDuration="11.367049709s" podCreationTimestamp="2025-10-09 15:20:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:44.36647996 +0000 UTC m=+149.876191245" watchObservedRunningTime="2025-10-09 15:20:44.367049709 +0000 UTC m=+149.876761004" Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.436749 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nzwhh" Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.531913 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bfn6d"] Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.533094 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bfn6d" Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.608617 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bfn6d"] Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.624501 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b211e254-37fb-4252-a758-4d9e781ad03d-utilities\") pod \"redhat-marketplace-bfn6d\" (UID: \"b211e254-37fb-4252-a758-4d9e781ad03d\") " pod="openshift-marketplace/redhat-marketplace-bfn6d" Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.624619 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f69vv\" (UniqueName: \"kubernetes.io/projected/b211e254-37fb-4252-a758-4d9e781ad03d-kube-api-access-f69vv\") pod \"redhat-marketplace-bfn6d\" (UID: \"b211e254-37fb-4252-a758-4d9e781ad03d\") " pod="openshift-marketplace/redhat-marketplace-bfn6d" Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.624759 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b211e254-37fb-4252-a758-4d9e781ad03d-catalog-content\") pod \"redhat-marketplace-bfn6d\" (UID: \"b211e254-37fb-4252-a758-4d9e781ad03d\") " pod="openshift-marketplace/redhat-marketplace-bfn6d" Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.728972 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b211e254-37fb-4252-a758-4d9e781ad03d-catalog-content\") pod \"redhat-marketplace-bfn6d\" (UID: \"b211e254-37fb-4252-a758-4d9e781ad03d\") " pod="openshift-marketplace/redhat-marketplace-bfn6d" Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.729050 4719 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b211e254-37fb-4252-a758-4d9e781ad03d-utilities\") pod \"redhat-marketplace-bfn6d\" (UID: \"b211e254-37fb-4252-a758-4d9e781ad03d\") " pod="openshift-marketplace/redhat-marketplace-bfn6d" Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.729080 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f69vv\" (UniqueName: \"kubernetes.io/projected/b211e254-37fb-4252-a758-4d9e781ad03d-kube-api-access-f69vv\") pod \"redhat-marketplace-bfn6d\" (UID: \"b211e254-37fb-4252-a758-4d9e781ad03d\") " pod="openshift-marketplace/redhat-marketplace-bfn6d" Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.729722 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b211e254-37fb-4252-a758-4d9e781ad03d-utilities\") pod \"redhat-marketplace-bfn6d\" (UID: \"b211e254-37fb-4252-a758-4d9e781ad03d\") " pod="openshift-marketplace/redhat-marketplace-bfn6d" Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.729956 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b211e254-37fb-4252-a758-4d9e781ad03d-catalog-content\") pod \"redhat-marketplace-bfn6d\" (UID: \"b211e254-37fb-4252-a758-4d9e781ad03d\") " pod="openshift-marketplace/redhat-marketplace-bfn6d" Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.741766 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nzwhh"] Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.755843 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f69vv\" (UniqueName: \"kubernetes.io/projected/b211e254-37fb-4252-a758-4d9e781ad03d-kube-api-access-f69vv\") pod \"redhat-marketplace-bfn6d\" (UID: \"b211e254-37fb-4252-a758-4d9e781ad03d\") " 
pod="openshift-marketplace/redhat-marketplace-bfn6d" Oct 09 15:20:44 crc kubenswrapper[4719]: I1009 15:20:44.847702 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bfn6d" Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.067765 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bfn6d"] Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.173685 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.250227 4719 patch_prober.go:28] interesting pod/router-default-5444994796-vdfqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 15:20:45 crc kubenswrapper[4719]: [-]has-synced failed: reason withheld Oct 09 15:20:45 crc kubenswrapper[4719]: [+]process-running ok Oct 09 15:20:45 crc kubenswrapper[4719]: healthz check failed Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.250278 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vdfqp" podUID="d2a62908-86f6-4b7f-9169-cb7a9ef1ece8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.323162 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jrqj4"] Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.327484 4719 generic.go:334] "Generic (PLEG): container finished" podID="24507f61-1a02-438c-b1ca-82515867e605" containerID="8de3d9cf280680a9dc7deeace9f3fd771589dbc230018023ab1c30b8f7b9f35f" exitCode=0 Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.330432 4719 
generic.go:334] "Generic (PLEG): container finished" podID="b211e254-37fb-4252-a758-4d9e781ad03d" containerID="cc0667b7909440c57773bb7b878d9e0e0c5ecead750c07918fdf3f77cd05b7f9" exitCode=0 Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.386265 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jrqj4" Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.387038 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jrqj4"] Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.387081 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333715-pgfd2" event={"ID":"24507f61-1a02-438c-b1ca-82515867e605","Type":"ContainerDied","Data":"8de3d9cf280680a9dc7deeace9f3fd771589dbc230018023ab1c30b8f7b9f35f"} Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.387119 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.387130 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bfn6d" event={"ID":"b211e254-37fb-4252-a758-4d9e781ad03d","Type":"ContainerDied","Data":"cc0667b7909440c57773bb7b878d9e0e0c5ecead750c07918fdf3f77cd05b7f9"} Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.387139 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bfn6d" event={"ID":"b211e254-37fb-4252-a758-4d9e781ad03d","Type":"ContainerStarted","Data":"196ed975c6618f7b995124c26cd7280349fdff5e91afcd03715906d0c6283d35"} Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.387148 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" 
event={"ID":"92f1494f-b7f7-4e94-90ce-132cc3a14a62","Type":"ContainerStarted","Data":"73a707c6d3843d9bd942924f9d2ef2134ef47e4dfd97c0dbc3520eb26069c5fa"} Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.388015 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.403689 4719 generic.go:334] "Generic (PLEG): container finished" podID="84e55955-c37c-4897-ab18-f71812f3ccff" containerID="abe54057dd1d529463de4a92df26291e1dd56cb065499bdc584985a25b277443" exitCode=0 Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.404650 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nzwhh" event={"ID":"84e55955-c37c-4897-ab18-f71812f3ccff","Type":"ContainerDied","Data":"abe54057dd1d529463de4a92df26291e1dd56cb065499bdc584985a25b277443"} Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.404673 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nzwhh" event={"ID":"84e55955-c37c-4897-ab18-f71812f3ccff","Type":"ContainerStarted","Data":"0b93971f8539e1eba0fe3f949371b233f893a119298cbad93f62e993d56eeac1"} Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.424705 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" podStartSLOduration=130.42468828 podStartE2EDuration="2m10.42468828s" podCreationTimestamp="2025-10-09 15:18:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:45.422009444 +0000 UTC m=+150.931720749" watchObservedRunningTime="2025-10-09 15:20:45.42468828 +0000 UTC m=+150.934399565" Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.553425 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-j8lwd\" (UniqueName: \"kubernetes.io/projected/f662f8ad-fe9b-40c9-845e-8f2749a6482d-kube-api-access-j8lwd\") pod \"redhat-operators-jrqj4\" (UID: \"f662f8ad-fe9b-40c9-845e-8f2749a6482d\") " pod="openshift-marketplace/redhat-operators-jrqj4" Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.553491 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f662f8ad-fe9b-40c9-845e-8f2749a6482d-utilities\") pod \"redhat-operators-jrqj4\" (UID: \"f662f8ad-fe9b-40c9-845e-8f2749a6482d\") " pod="openshift-marketplace/redhat-operators-jrqj4" Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.553528 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f662f8ad-fe9b-40c9-845e-8f2749a6482d-catalog-content\") pod \"redhat-operators-jrqj4\" (UID: \"f662f8ad-fe9b-40c9-845e-8f2749a6482d\") " pod="openshift-marketplace/redhat-operators-jrqj4" Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.654502 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8lwd\" (UniqueName: \"kubernetes.io/projected/f662f8ad-fe9b-40c9-845e-8f2749a6482d-kube-api-access-j8lwd\") pod \"redhat-operators-jrqj4\" (UID: \"f662f8ad-fe9b-40c9-845e-8f2749a6482d\") " pod="openshift-marketplace/redhat-operators-jrqj4" Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.654829 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f662f8ad-fe9b-40c9-845e-8f2749a6482d-utilities\") pod \"redhat-operators-jrqj4\" (UID: \"f662f8ad-fe9b-40c9-845e-8f2749a6482d\") " pod="openshift-marketplace/redhat-operators-jrqj4" Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.654855 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f662f8ad-fe9b-40c9-845e-8f2749a6482d-catalog-content\") pod \"redhat-operators-jrqj4\" (UID: \"f662f8ad-fe9b-40c9-845e-8f2749a6482d\") " pod="openshift-marketplace/redhat-operators-jrqj4" Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.655284 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f662f8ad-fe9b-40c9-845e-8f2749a6482d-catalog-content\") pod \"redhat-operators-jrqj4\" (UID: \"f662f8ad-fe9b-40c9-845e-8f2749a6482d\") " pod="openshift-marketplace/redhat-operators-jrqj4" Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.655421 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f662f8ad-fe9b-40c9-845e-8f2749a6482d-utilities\") pod \"redhat-operators-jrqj4\" (UID: \"f662f8ad-fe9b-40c9-845e-8f2749a6482d\") " pod="openshift-marketplace/redhat-operators-jrqj4" Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.677556 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8lwd\" (UniqueName: \"kubernetes.io/projected/f662f8ad-fe9b-40c9-845e-8f2749a6482d-kube-api-access-j8lwd\") pod \"redhat-operators-jrqj4\" (UID: \"f662f8ad-fe9b-40c9-845e-8f2749a6482d\") " pod="openshift-marketplace/redhat-operators-jrqj4" Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.718295 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2pqrf"] Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.719205 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2pqrf" Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.727414 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jrqj4" Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.730679 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2pqrf"] Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.860996 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9351b8b6-df12-4eee-9e51-ebc70c9da6bd-utilities\") pod \"redhat-operators-2pqrf\" (UID: \"9351b8b6-df12-4eee-9e51-ebc70c9da6bd\") " pod="openshift-marketplace/redhat-operators-2pqrf" Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.861069 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9351b8b6-df12-4eee-9e51-ebc70c9da6bd-catalog-content\") pod \"redhat-operators-2pqrf\" (UID: \"9351b8b6-df12-4eee-9e51-ebc70c9da6bd\") " pod="openshift-marketplace/redhat-operators-2pqrf" Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.861087 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6kct\" (UniqueName: \"kubernetes.io/projected/9351b8b6-df12-4eee-9e51-ebc70c9da6bd-kube-api-access-r6kct\") pod \"redhat-operators-2pqrf\" (UID: \"9351b8b6-df12-4eee-9e51-ebc70c9da6bd\") " pod="openshift-marketplace/redhat-operators-2pqrf" Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.878879 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.880148 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.897904 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.901561 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.907827 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.949322 4719 patch_prober.go:28] interesting pod/downloads-7954f5f757-p7bmv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.949396 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-p7bmv" podUID="4091b4ed-3afe-4bab-b41e-0bca5b6f58b0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.949799 4719 patch_prober.go:28] interesting pod/downloads-7954f5f757-p7bmv container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.949828 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-p7bmv" podUID="4091b4ed-3afe-4bab-b41e-0bca5b6f58b0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: 
connection refused" Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.958013 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-j74ct" Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.958051 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-j74ct" Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.964854 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9351b8b6-df12-4eee-9e51-ebc70c9da6bd-catalog-content\") pod \"redhat-operators-2pqrf\" (UID: \"9351b8b6-df12-4eee-9e51-ebc70c9da6bd\") " pod="openshift-marketplace/redhat-operators-2pqrf" Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.964905 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6kct\" (UniqueName: \"kubernetes.io/projected/9351b8b6-df12-4eee-9e51-ebc70c9da6bd-kube-api-access-r6kct\") pod \"redhat-operators-2pqrf\" (UID: \"9351b8b6-df12-4eee-9e51-ebc70c9da6bd\") " pod="openshift-marketplace/redhat-operators-2pqrf" Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.964958 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/34716fe0-1567-461c-9844-db7bce6942f6-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"34716fe0-1567-461c-9844-db7bce6942f6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.965026 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9351b8b6-df12-4eee-9e51-ebc70c9da6bd-utilities\") pod \"redhat-operators-2pqrf\" (UID: \"9351b8b6-df12-4eee-9e51-ebc70c9da6bd\") " pod="openshift-marketplace/redhat-operators-2pqrf" Oct 09 15:20:45 crc 
kubenswrapper[4719]: I1009 15:20:45.965061 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/34716fe0-1567-461c-9844-db7bce6942f6-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"34716fe0-1567-461c-9844-db7bce6942f6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.965578 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9351b8b6-df12-4eee-9e51-ebc70c9da6bd-catalog-content\") pod \"redhat-operators-2pqrf\" (UID: \"9351b8b6-df12-4eee-9e51-ebc70c9da6bd\") " pod="openshift-marketplace/redhat-operators-2pqrf" Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.966106 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9351b8b6-df12-4eee-9e51-ebc70c9da6bd-utilities\") pod \"redhat-operators-2pqrf\" (UID: \"9351b8b6-df12-4eee-9e51-ebc70c9da6bd\") " pod="openshift-marketplace/redhat-operators-2pqrf" Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.972525 4719 patch_prober.go:28] interesting pod/console-f9d7485db-j74ct container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.972588 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-j74ct" podUID="c895d97a-7287-49a8-9ac5-bc87e8bcf297" containerName="console" probeResult="failure" output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.978188 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-696rc" Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.978968 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-696rc" Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.987280 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6kct\" (UniqueName: \"kubernetes.io/projected/9351b8b6-df12-4eee-9e51-ebc70c9da6bd-kube-api-access-r6kct\") pod \"redhat-operators-2pqrf\" (UID: \"9351b8b6-df12-4eee-9e51-ebc70c9da6bd\") " pod="openshift-marketplace/redhat-operators-2pqrf" Oct 09 15:20:45 crc kubenswrapper[4719]: I1009 15:20:45.997561 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-696rc" Oct 09 15:20:46 crc kubenswrapper[4719]: I1009 15:20:46.037478 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2pqrf" Oct 09 15:20:46 crc kubenswrapper[4719]: I1009 15:20:46.065903 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/34716fe0-1567-461c-9844-db7bce6942f6-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"34716fe0-1567-461c-9844-db7bce6942f6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 15:20:46 crc kubenswrapper[4719]: I1009 15:20:46.066068 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/34716fe0-1567-461c-9844-db7bce6942f6-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"34716fe0-1567-461c-9844-db7bce6942f6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 15:20:46 crc kubenswrapper[4719]: I1009 15:20:46.068129 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/34716fe0-1567-461c-9844-db7bce6942f6-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"34716fe0-1567-461c-9844-db7bce6942f6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 15:20:46 crc kubenswrapper[4719]: I1009 15:20:46.088097 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/34716fe0-1567-461c-9844-db7bce6942f6-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"34716fe0-1567-461c-9844-db7bce6942f6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 15:20:46 crc kubenswrapper[4719]: I1009 15:20:46.101723 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" Oct 09 15:20:46 crc kubenswrapper[4719]: I1009 15:20:46.101763 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" Oct 09 15:20:46 crc kubenswrapper[4719]: I1009 15:20:46.109006 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" Oct 09 15:20:46 crc kubenswrapper[4719]: I1009 15:20:46.227826 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 15:20:46 crc kubenswrapper[4719]: I1009 15:20:46.246538 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-vdfqp" Oct 09 15:20:46 crc kubenswrapper[4719]: I1009 15:20:46.256547 4719 patch_prober.go:28] interesting pod/router-default-5444994796-vdfqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 15:20:46 crc kubenswrapper[4719]: [-]has-synced failed: reason withheld Oct 09 15:20:46 crc kubenswrapper[4719]: [+]process-running ok Oct 09 15:20:46 crc kubenswrapper[4719]: healthz check failed Oct 09 15:20:46 crc kubenswrapper[4719]: I1009 15:20:46.256600 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vdfqp" podUID="d2a62908-86f6-4b7f-9169-cb7a9ef1ece8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 15:20:46 crc kubenswrapper[4719]: I1009 15:20:46.333315 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jrqj4"] Oct 09 15:20:46 crc kubenswrapper[4719]: I1009 15:20:46.344681 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 09 15:20:46 crc kubenswrapper[4719]: I1009 15:20:46.345728 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 15:20:46 crc kubenswrapper[4719]: I1009 15:20:46.349099 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 09 15:20:46 crc kubenswrapper[4719]: I1009 15:20:46.351536 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 09 15:20:46 crc kubenswrapper[4719]: I1009 15:20:46.377370 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2pqrf"] Oct 09 15:20:46 crc kubenswrapper[4719]: W1009 15:20:46.403310 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9351b8b6_df12_4eee_9e51_ebc70c9da6bd.slice/crio-67894cc0aae515b82a40c2ad85b686bbd54b0cdf6713349858bae47133775a5f WatchSource:0}: Error finding container 67894cc0aae515b82a40c2ad85b686bbd54b0cdf6713349858bae47133775a5f: Status 404 returned error can't find the container with id 67894cc0aae515b82a40c2ad85b686bbd54b0cdf6713349858bae47133775a5f Oct 09 15:20:46 crc kubenswrapper[4719]: I1009 15:20:46.423692 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2pqrf" event={"ID":"9351b8b6-df12-4eee-9e51-ebc70c9da6bd","Type":"ContainerStarted","Data":"67894cc0aae515b82a40c2ad85b686bbd54b0cdf6713349858bae47133775a5f"} Oct 09 15:20:46 crc kubenswrapper[4719]: I1009 15:20:46.430162 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jrqj4" event={"ID":"f662f8ad-fe9b-40c9-845e-8f2749a6482d","Type":"ContainerStarted","Data":"eab3c28380de162670e59acc1e072b0af4d19ffdb2d47fa91c0f14eded8f95bb"} Oct 09 15:20:46 crc kubenswrapper[4719]: I1009 15:20:46.438542 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-apiserver/apiserver-76f77b778f-4mm2x" Oct 09 15:20:46 crc kubenswrapper[4719]: I1009 15:20:46.438661 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-696rc" Oct 09 15:20:46 crc kubenswrapper[4719]: I1009 15:20:46.478116 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54705293-788b-4c76-851e-2ff0563877ca-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"54705293-788b-4c76-851e-2ff0563877ca\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 15:20:46 crc kubenswrapper[4719]: I1009 15:20:46.478153 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/54705293-788b-4c76-851e-2ff0563877ca-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"54705293-788b-4c76-851e-2ff0563877ca\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 15:20:46 crc kubenswrapper[4719]: I1009 15:20:46.557238 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 09 15:20:46 crc kubenswrapper[4719]: I1009 15:20:46.579158 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54705293-788b-4c76-851e-2ff0563877ca-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"54705293-788b-4c76-851e-2ff0563877ca\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 15:20:46 crc kubenswrapper[4719]: I1009 15:20:46.579194 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/54705293-788b-4c76-851e-2ff0563877ca-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"54705293-788b-4c76-851e-2ff0563877ca\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 15:20:46 crc kubenswrapper[4719]: I1009 15:20:46.580589 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/54705293-788b-4c76-851e-2ff0563877ca-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"54705293-788b-4c76-851e-2ff0563877ca\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 15:20:46 crc kubenswrapper[4719]: I1009 15:20:46.603023 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 09 15:20:46 crc kubenswrapper[4719]: I1009 15:20:46.614143 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54705293-788b-4c76-851e-2ff0563877ca-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"54705293-788b-4c76-851e-2ff0563877ca\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 15:20:46 crc kubenswrapper[4719]: I1009 15:20:46.677926 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 15:20:46 crc kubenswrapper[4719]: I1009 15:20:46.735852 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-dwt2p" Oct 09 15:20:46 crc kubenswrapper[4719]: I1009 15:20:46.907211 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333715-pgfd2" Oct 09 15:20:46 crc kubenswrapper[4719]: I1009 15:20:46.983251 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69jd7\" (UniqueName: \"kubernetes.io/projected/24507f61-1a02-438c-b1ca-82515867e605-kube-api-access-69jd7\") pod \"24507f61-1a02-438c-b1ca-82515867e605\" (UID: \"24507f61-1a02-438c-b1ca-82515867e605\") " Oct 09 15:20:46 crc kubenswrapper[4719]: I1009 15:20:46.983374 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/24507f61-1a02-438c-b1ca-82515867e605-secret-volume\") pod \"24507f61-1a02-438c-b1ca-82515867e605\" (UID: \"24507f61-1a02-438c-b1ca-82515867e605\") " Oct 09 15:20:46 crc kubenswrapper[4719]: I1009 15:20:46.984762 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/24507f61-1a02-438c-b1ca-82515867e605-config-volume\") pod \"24507f61-1a02-438c-b1ca-82515867e605\" (UID: \"24507f61-1a02-438c-b1ca-82515867e605\") " Oct 09 15:20:46 crc kubenswrapper[4719]: I1009 15:20:46.985540 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24507f61-1a02-438c-b1ca-82515867e605-config-volume" (OuterVolumeSpecName: "config-volume") pod "24507f61-1a02-438c-b1ca-82515867e605" (UID: "24507f61-1a02-438c-b1ca-82515867e605"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:20:46 crc kubenswrapper[4719]: I1009 15:20:46.991311 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24507f61-1a02-438c-b1ca-82515867e605-kube-api-access-69jd7" (OuterVolumeSpecName: "kube-api-access-69jd7") pod "24507f61-1a02-438c-b1ca-82515867e605" (UID: "24507f61-1a02-438c-b1ca-82515867e605"). 
InnerVolumeSpecName "kube-api-access-69jd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:20:46 crc kubenswrapper[4719]: I1009 15:20:46.991502 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24507f61-1a02-438c-b1ca-82515867e605-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "24507f61-1a02-438c-b1ca-82515867e605" (UID: "24507f61-1a02-438c-b1ca-82515867e605"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:20:47 crc kubenswrapper[4719]: I1009 15:20:47.087074 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69jd7\" (UniqueName: \"kubernetes.io/projected/24507f61-1a02-438c-b1ca-82515867e605-kube-api-access-69jd7\") on node \"crc\" DevicePath \"\"" Oct 09 15:20:47 crc kubenswrapper[4719]: I1009 15:20:47.087114 4719 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/24507f61-1a02-438c-b1ca-82515867e605-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 09 15:20:47 crc kubenswrapper[4719]: I1009 15:20:47.087124 4719 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/24507f61-1a02-438c-b1ca-82515867e605-config-volume\") on node \"crc\" DevicePath \"\"" Oct 09 15:20:47 crc kubenswrapper[4719]: I1009 15:20:47.207682 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 09 15:20:47 crc kubenswrapper[4719]: W1009 15:20:47.227988 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod54705293_788b_4c76_851e_2ff0563877ca.slice/crio-f3f5a5f5adc26ca893efcee9a52e23bf95654f85e2b2f6ca3be8b51037f5b261 WatchSource:0}: Error finding container f3f5a5f5adc26ca893efcee9a52e23bf95654f85e2b2f6ca3be8b51037f5b261: Status 404 returned error can't find the container with id 
f3f5a5f5adc26ca893efcee9a52e23bf95654f85e2b2f6ca3be8b51037f5b261 Oct 09 15:20:47 crc kubenswrapper[4719]: I1009 15:20:47.251366 4719 patch_prober.go:28] interesting pod/router-default-5444994796-vdfqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 15:20:47 crc kubenswrapper[4719]: [-]has-synced failed: reason withheld Oct 09 15:20:47 crc kubenswrapper[4719]: [+]process-running ok Oct 09 15:20:47 crc kubenswrapper[4719]: healthz check failed Oct 09 15:20:47 crc kubenswrapper[4719]: I1009 15:20:47.251415 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vdfqp" podUID="d2a62908-86f6-4b7f-9169-cb7a9ef1ece8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 15:20:47 crc kubenswrapper[4719]: I1009 15:20:47.455201 4719 generic.go:334] "Generic (PLEG): container finished" podID="9351b8b6-df12-4eee-9e51-ebc70c9da6bd" containerID="d8f8e2982b9ceaf6a35f6b707b8f942ccf3e5e38aeb6c03b7c89d29899a9131c" exitCode=0 Oct 09 15:20:47 crc kubenswrapper[4719]: I1009 15:20:47.455259 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2pqrf" event={"ID":"9351b8b6-df12-4eee-9e51-ebc70c9da6bd","Type":"ContainerDied","Data":"d8f8e2982b9ceaf6a35f6b707b8f942ccf3e5e38aeb6c03b7c89d29899a9131c"} Oct 09 15:20:47 crc kubenswrapper[4719]: I1009 15:20:47.458234 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333715-pgfd2" Oct 09 15:20:47 crc kubenswrapper[4719]: I1009 15:20:47.459254 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333715-pgfd2" event={"ID":"24507f61-1a02-438c-b1ca-82515867e605","Type":"ContainerDied","Data":"8dbe68ee06f635859c0f96ae0629d94ea2c2d8db1acb53020adb461514eb0de4"} Oct 09 15:20:47 crc kubenswrapper[4719]: I1009 15:20:47.459277 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8dbe68ee06f635859c0f96ae0629d94ea2c2d8db1acb53020adb461514eb0de4" Oct 09 15:20:47 crc kubenswrapper[4719]: I1009 15:20:47.476990 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"54705293-788b-4c76-851e-2ff0563877ca","Type":"ContainerStarted","Data":"f3f5a5f5adc26ca893efcee9a52e23bf95654f85e2b2f6ca3be8b51037f5b261"} Oct 09 15:20:47 crc kubenswrapper[4719]: I1009 15:20:47.516063 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"34716fe0-1567-461c-9844-db7bce6942f6","Type":"ContainerStarted","Data":"248276e8ccb481fa86d9e4e63f7bd732c0b3c2cf41177fae15f14f210afe2ba4"} Oct 09 15:20:47 crc kubenswrapper[4719]: I1009 15:20:47.521446 4719 generic.go:334] "Generic (PLEG): container finished" podID="f662f8ad-fe9b-40c9-845e-8f2749a6482d" containerID="5e6dcdee156e16599ba7119a35ec28ad5884c765620c4dc15220447b0524863f" exitCode=0 Oct 09 15:20:47 crc kubenswrapper[4719]: I1009 15:20:47.521519 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jrqj4" event={"ID":"f662f8ad-fe9b-40c9-845e-8f2749a6482d","Type":"ContainerDied","Data":"5e6dcdee156e16599ba7119a35ec28ad5884c765620c4dc15220447b0524863f"} Oct 09 15:20:48 crc kubenswrapper[4719]: I1009 15:20:48.249426 4719 patch_prober.go:28] interesting 
pod/router-default-5444994796-vdfqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 15:20:48 crc kubenswrapper[4719]: [-]has-synced failed: reason withheld Oct 09 15:20:48 crc kubenswrapper[4719]: [+]process-running ok Oct 09 15:20:48 crc kubenswrapper[4719]: healthz check failed Oct 09 15:20:48 crc kubenswrapper[4719]: I1009 15:20:48.249813 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vdfqp" podUID="d2a62908-86f6-4b7f-9169-cb7a9ef1ece8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 15:20:48 crc kubenswrapper[4719]: I1009 15:20:48.533593 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"54705293-788b-4c76-851e-2ff0563877ca","Type":"ContainerStarted","Data":"7429deecbb13c77e9fc78ec8ca041405c0e78c24112aee1d6e7933c00e04fb52"} Oct 09 15:20:48 crc kubenswrapper[4719]: I1009 15:20:48.539971 4719 generic.go:334] "Generic (PLEG): container finished" podID="34716fe0-1567-461c-9844-db7bce6942f6" containerID="ed41521c335461f39a52e84ff4fcf0bf7756906977fddee271eabd228a00e18f" exitCode=0 Oct 09 15:20:48 crc kubenswrapper[4719]: I1009 15:20:48.540033 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"34716fe0-1567-461c-9844-db7bce6942f6","Type":"ContainerDied","Data":"ed41521c335461f39a52e84ff4fcf0bf7756906977fddee271eabd228a00e18f"} Oct 09 15:20:48 crc kubenswrapper[4719]: I1009 15:20:48.547499 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.547463558 podStartE2EDuration="2.547463558s" podCreationTimestamp="2025-10-09 15:20:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:20:48.547001784 +0000 UTC m=+154.056713079" watchObservedRunningTime="2025-10-09 15:20:48.547463558 +0000 UTC m=+154.057174843" Oct 09 15:20:48 crc kubenswrapper[4719]: I1009 15:20:48.591170 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:20:49 crc kubenswrapper[4719]: I1009 15:20:49.249095 4719 patch_prober.go:28] interesting pod/router-default-5444994796-vdfqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 15:20:49 crc kubenswrapper[4719]: [-]has-synced failed: reason withheld Oct 09 15:20:49 crc kubenswrapper[4719]: [+]process-running ok Oct 09 15:20:49 crc kubenswrapper[4719]: healthz check failed Oct 09 15:20:49 crc kubenswrapper[4719]: I1009 15:20:49.249148 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vdfqp" podUID="d2a62908-86f6-4b7f-9169-cb7a9ef1ece8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 15:20:49 crc kubenswrapper[4719]: I1009 15:20:49.594469 4719 generic.go:334] "Generic (PLEG): container finished" podID="54705293-788b-4c76-851e-2ff0563877ca" containerID="7429deecbb13c77e9fc78ec8ca041405c0e78c24112aee1d6e7933c00e04fb52" exitCode=0 Oct 09 15:20:49 crc kubenswrapper[4719]: I1009 15:20:49.594555 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"54705293-788b-4c76-851e-2ff0563877ca","Type":"ContainerDied","Data":"7429deecbb13c77e9fc78ec8ca041405c0e78c24112aee1d6e7933c00e04fb52"} Oct 09 15:20:49 crc kubenswrapper[4719]: I1009 15:20:49.946883 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 15:20:50 crc kubenswrapper[4719]: I1009 15:20:50.048922 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/34716fe0-1567-461c-9844-db7bce6942f6-kubelet-dir\") pod \"34716fe0-1567-461c-9844-db7bce6942f6\" (UID: \"34716fe0-1567-461c-9844-db7bce6942f6\") " Oct 09 15:20:50 crc kubenswrapper[4719]: I1009 15:20:50.049078 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/34716fe0-1567-461c-9844-db7bce6942f6-kube-api-access\") pod \"34716fe0-1567-461c-9844-db7bce6942f6\" (UID: \"34716fe0-1567-461c-9844-db7bce6942f6\") " Oct 09 15:20:50 crc kubenswrapper[4719]: I1009 15:20:50.049123 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/34716fe0-1567-461c-9844-db7bce6942f6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "34716fe0-1567-461c-9844-db7bce6942f6" (UID: "34716fe0-1567-461c-9844-db7bce6942f6"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 15:20:50 crc kubenswrapper[4719]: I1009 15:20:50.049395 4719 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/34716fe0-1567-461c-9844-db7bce6942f6-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 09 15:20:50 crc kubenswrapper[4719]: I1009 15:20:50.054795 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34716fe0-1567-461c-9844-db7bce6942f6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "34716fe0-1567-461c-9844-db7bce6942f6" (UID: "34716fe0-1567-461c-9844-db7bce6942f6"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:20:50 crc kubenswrapper[4719]: I1009 15:20:50.150518 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/34716fe0-1567-461c-9844-db7bce6942f6-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 09 15:20:50 crc kubenswrapper[4719]: I1009 15:20:50.248306 4719 patch_prober.go:28] interesting pod/router-default-5444994796-vdfqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 15:20:50 crc kubenswrapper[4719]: [-]has-synced failed: reason withheld Oct 09 15:20:50 crc kubenswrapper[4719]: [+]process-running ok Oct 09 15:20:50 crc kubenswrapper[4719]: healthz check failed Oct 09 15:20:50 crc kubenswrapper[4719]: I1009 15:20:50.248375 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vdfqp" podUID="d2a62908-86f6-4b7f-9169-cb7a9ef1ece8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 15:20:50 crc kubenswrapper[4719]: I1009 15:20:50.603948 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"34716fe0-1567-461c-9844-db7bce6942f6","Type":"ContainerDied","Data":"248276e8ccb481fa86d9e4e63f7bd732c0b3c2cf41177fae15f14f210afe2ba4"} Oct 09 15:20:50 crc kubenswrapper[4719]: I1009 15:20:50.604016 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="248276e8ccb481fa86d9e4e63f7bd732c0b3c2cf41177fae15f14f210afe2ba4" Oct 09 15:20:50 crc kubenswrapper[4719]: I1009 15:20:50.603983 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 15:20:51 crc kubenswrapper[4719]: I1009 15:20:51.248419 4719 patch_prober.go:28] interesting pod/router-default-5444994796-vdfqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 15:20:51 crc kubenswrapper[4719]: [-]has-synced failed: reason withheld Oct 09 15:20:51 crc kubenswrapper[4719]: [+]process-running ok Oct 09 15:20:51 crc kubenswrapper[4719]: healthz check failed Oct 09 15:20:51 crc kubenswrapper[4719]: I1009 15:20:51.248704 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vdfqp" podUID="d2a62908-86f6-4b7f-9169-cb7a9ef1ece8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 15:20:51 crc kubenswrapper[4719]: I1009 15:20:51.912870 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-762nw" Oct 09 15:20:52 crc kubenswrapper[4719]: I1009 15:20:52.253685 4719 patch_prober.go:28] interesting pod/router-default-5444994796-vdfqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 15:20:52 crc kubenswrapper[4719]: [-]has-synced failed: reason withheld Oct 09 15:20:52 crc kubenswrapper[4719]: [+]process-running ok Oct 09 15:20:52 crc kubenswrapper[4719]: healthz check failed Oct 09 15:20:52 crc kubenswrapper[4719]: I1009 15:20:52.253770 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vdfqp" podUID="d2a62908-86f6-4b7f-9169-cb7a9ef1ece8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 15:20:53 crc kubenswrapper[4719]: I1009 15:20:53.250066 4719 
patch_prober.go:28] interesting pod/router-default-5444994796-vdfqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 15:20:53 crc kubenswrapper[4719]: [-]has-synced failed: reason withheld Oct 09 15:20:53 crc kubenswrapper[4719]: [+]process-running ok Oct 09 15:20:53 crc kubenswrapper[4719]: healthz check failed Oct 09 15:20:53 crc kubenswrapper[4719]: I1009 15:20:53.250334 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vdfqp" podUID="d2a62908-86f6-4b7f-9169-cb7a9ef1ece8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 15:20:54 crc kubenswrapper[4719]: I1009 15:20:54.248801 4719 patch_prober.go:28] interesting pod/router-default-5444994796-vdfqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 15:20:54 crc kubenswrapper[4719]: [+]has-synced ok Oct 09 15:20:54 crc kubenswrapper[4719]: [+]process-running ok Oct 09 15:20:54 crc kubenswrapper[4719]: healthz check failed Oct 09 15:20:54 crc kubenswrapper[4719]: I1009 15:20:54.248889 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vdfqp" podUID="d2a62908-86f6-4b7f-9169-cb7a9ef1ece8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 15:20:55 crc kubenswrapper[4719]: I1009 15:20:55.249034 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-vdfqp" Oct 09 15:20:55 crc kubenswrapper[4719]: I1009 15:20:55.252148 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-vdfqp" Oct 09 15:20:55 crc kubenswrapper[4719]: 
I1009 15:20:55.953259 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-p7bmv" Oct 09 15:20:55 crc kubenswrapper[4719]: I1009 15:20:55.960965 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-j74ct" Oct 09 15:20:55 crc kubenswrapper[4719]: I1009 15:20:55.965001 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-j74ct" Oct 09 15:20:56 crc kubenswrapper[4719]: I1009 15:20:56.644565 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d00237ae-ca20-4202-8e24-e4988fbf5269-metrics-certs\") pod \"network-metrics-daemon-58bdp\" (UID: \"d00237ae-ca20-4202-8e24-e4988fbf5269\") " pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:20:56 crc kubenswrapper[4719]: I1009 15:20:56.651912 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d00237ae-ca20-4202-8e24-e4988fbf5269-metrics-certs\") pod \"network-metrics-daemon-58bdp\" (UID: \"d00237ae-ca20-4202-8e24-e4988fbf5269\") " pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:20:56 crc kubenswrapper[4719]: I1009 15:20:56.776458 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-58bdp" Oct 09 15:20:56 crc kubenswrapper[4719]: I1009 15:20:56.883210 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 15:20:56 crc kubenswrapper[4719]: I1009 15:20:56.947772 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54705293-788b-4c76-851e-2ff0563877ca-kube-api-access\") pod \"54705293-788b-4c76-851e-2ff0563877ca\" (UID: \"54705293-788b-4c76-851e-2ff0563877ca\") " Oct 09 15:20:56 crc kubenswrapper[4719]: I1009 15:20:56.947841 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/54705293-788b-4c76-851e-2ff0563877ca-kubelet-dir\") pod \"54705293-788b-4c76-851e-2ff0563877ca\" (UID: \"54705293-788b-4c76-851e-2ff0563877ca\") " Oct 09 15:20:56 crc kubenswrapper[4719]: I1009 15:20:56.948027 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/54705293-788b-4c76-851e-2ff0563877ca-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "54705293-788b-4c76-851e-2ff0563877ca" (UID: "54705293-788b-4c76-851e-2ff0563877ca"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 15:20:56 crc kubenswrapper[4719]: I1009 15:20:56.948103 4719 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/54705293-788b-4c76-851e-2ff0563877ca-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 09 15:20:56 crc kubenswrapper[4719]: I1009 15:20:56.951747 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54705293-788b-4c76-851e-2ff0563877ca-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "54705293-788b-4c76-851e-2ff0563877ca" (UID: "54705293-788b-4c76-851e-2ff0563877ca"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:20:57 crc kubenswrapper[4719]: I1009 15:20:57.049485 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54705293-788b-4c76-851e-2ff0563877ca-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 09 15:20:57 crc kubenswrapper[4719]: I1009 15:20:57.656578 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"54705293-788b-4c76-851e-2ff0563877ca","Type":"ContainerDied","Data":"f3f5a5f5adc26ca893efcee9a52e23bf95654f85e2b2f6ca3be8b51037f5b261"} Oct 09 15:20:57 crc kubenswrapper[4719]: I1009 15:20:57.656852 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3f5a5f5adc26ca893efcee9a52e23bf95654f85e2b2f6ca3be8b51037f5b261" Oct 09 15:20:57 crc kubenswrapper[4719]: I1009 15:20:57.656627 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 15:21:03 crc kubenswrapper[4719]: I1009 15:21:03.852818 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:21:06 crc kubenswrapper[4719]: I1009 15:21:06.976883 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 15:21:06 crc kubenswrapper[4719]: I1009 15:21:06.977208 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Oct 09 15:21:10 crc kubenswrapper[4719]: E1009 15:21:10.575941 4719 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 09 15:21:10 crc kubenswrapper[4719]: E1009 15:21:10.576134 4719 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f69vv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start 
failed in pod redhat-marketplace-bfn6d_openshift-marketplace(b211e254-37fb-4252-a758-4d9e781ad03d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 09 15:21:10 crc kubenswrapper[4719]: E1009 15:21:10.577724 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-bfn6d" podUID="b211e254-37fb-4252-a758-4d9e781ad03d" Oct 09 15:21:13 crc kubenswrapper[4719]: E1009 15:21:13.722754 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-bfn6d" podUID="b211e254-37fb-4252-a758-4d9e781ad03d" Oct 09 15:21:14 crc kubenswrapper[4719]: E1009 15:21:14.895182 4719 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 09 15:21:14 crc kubenswrapper[4719]: E1009 15:21:14.895733 4719 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fz6xj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-458dz_openshift-marketplace(08d47ff5-80a6-4395-8481-2e7f1c2c1409): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 09 15:21:14 crc kubenswrapper[4719]: E1009 15:21:14.896954 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-458dz" podUID="08d47ff5-80a6-4395-8481-2e7f1c2c1409" Oct 09 15:21:15 crc 
kubenswrapper[4719]: E1009 15:21:15.104328 4719 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 09 15:21:15 crc kubenswrapper[4719]: E1009 15:21:15.104492 4719 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6zhbb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-nzwhh_openshift-marketplace(84e55955-c37c-4897-ab18-f71812f3ccff): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 09 15:21:15 crc kubenswrapper[4719]: E1009 15:21:15.105700 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-nzwhh" podUID="84e55955-c37c-4897-ab18-f71812f3ccff" Oct 09 15:21:15 crc kubenswrapper[4719]: E1009 15:21:15.946104 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-458dz" podUID="08d47ff5-80a6-4395-8481-2e7f1c2c1409" Oct 09 15:21:15 crc kubenswrapper[4719]: E1009 15:21:15.946162 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-nzwhh" podUID="84e55955-c37c-4897-ab18-f71812f3ccff" Oct 09 15:21:16 crc kubenswrapper[4719]: E1009 15:21:16.013693 4719 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 09 15:21:16 crc kubenswrapper[4719]: E1009 15:21:16.013850 4719 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-72pjn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-jr2j6_openshift-marketplace(7c3c0e14-9adc-47ec-af46-d0779a6c6e1d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 09 15:21:16 crc kubenswrapper[4719]: E1009 15:21:16.015067 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-jr2j6" podUID="7c3c0e14-9adc-47ec-af46-d0779a6c6e1d" Oct 09 15:21:16 crc kubenswrapper[4719]: E1009 15:21:16.044467 4719 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 09 15:21:16 crc kubenswrapper[4719]: E1009 15:21:16.044632 4719 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fkmtq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFro
mSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-8726g_openshift-marketplace(ed97f513-40b6-4273-b6a5-9f9f5150e4cd): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 09 15:21:16 crc kubenswrapper[4719]: E1009 15:21:16.046611 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-8726g" podUID="ed97f513-40b6-4273-b6a5-9f9f5150e4cd" Oct 09 15:21:16 crc kubenswrapper[4719]: I1009 15:21:16.729127 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x6g6x" Oct 09 15:21:18 crc kubenswrapper[4719]: E1009 15:21:18.533677 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-jr2j6" podUID="7c3c0e14-9adc-47ec-af46-d0779a6c6e1d" Oct 09 15:21:18 crc kubenswrapper[4719]: E1009 15:21:18.533707 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-8726g" podUID="ed97f513-40b6-4273-b6a5-9f9f5150e4cd" Oct 09 15:21:18 crc kubenswrapper[4719]: E1009 15:21:18.603066 4719 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying 
system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 09 15:21:18 crc kubenswrapper[4719]: E1009 15:21:18.603207 4719 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j8lwd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-jrqj4_openshift-marketplace(f662f8ad-fe9b-40c9-845e-8f2749a6482d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying 
config: context canceled" logger="UnhandledError" Oct 09 15:21:18 crc kubenswrapper[4719]: E1009 15:21:18.604489 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-jrqj4" podUID="f662f8ad-fe9b-40c9-845e-8f2749a6482d" Oct 09 15:21:18 crc kubenswrapper[4719]: E1009 15:21:18.637379 4719 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 09 15:21:18 crc kubenswrapper[4719]: E1009 15:21:18.637749 4719 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nrv4p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-wb65c_openshift-marketplace(b6e7a7e4-2eff-468f-b764-b1b73c4285f4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 09 15:21:18 crc kubenswrapper[4719]: E1009 15:21:18.638931 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-wb65c" podUID="b6e7a7e4-2eff-468f-b764-b1b73c4285f4" Oct 09 15:21:18 crc 
kubenswrapper[4719]: E1009 15:21:18.764584 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-wb65c" podUID="b6e7a7e4-2eff-468f-b764-b1b73c4285f4" Oct 09 15:21:18 crc kubenswrapper[4719]: E1009 15:21:18.765302 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-jrqj4" podUID="f662f8ad-fe9b-40c9-845e-8f2749a6482d" Oct 09 15:21:18 crc kubenswrapper[4719]: I1009 15:21:18.904675 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-58bdp"] Oct 09 15:21:19 crc kubenswrapper[4719]: I1009 15:21:19.769138 4719 generic.go:334] "Generic (PLEG): container finished" podID="9351b8b6-df12-4eee-9e51-ebc70c9da6bd" containerID="8100623dc56b84e9afd506ba8e1f49ee3d68cd37c4e46e453c7a0d3ceb86e491" exitCode=0 Oct 09 15:21:19 crc kubenswrapper[4719]: I1009 15:21:19.769291 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2pqrf" event={"ID":"9351b8b6-df12-4eee-9e51-ebc70c9da6bd","Type":"ContainerDied","Data":"8100623dc56b84e9afd506ba8e1f49ee3d68cd37c4e46e453c7a0d3ceb86e491"} Oct 09 15:21:19 crc kubenswrapper[4719]: I1009 15:21:19.774241 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-58bdp" event={"ID":"d00237ae-ca20-4202-8e24-e4988fbf5269","Type":"ContainerStarted","Data":"6a4603ac58f8df069afcff3ca0b0e30bb84211cecb7b8d33e667ff06100dcd50"} Oct 09 15:21:19 crc kubenswrapper[4719]: I1009 15:21:19.774277 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-58bdp" 
event={"ID":"d00237ae-ca20-4202-8e24-e4988fbf5269","Type":"ContainerStarted","Data":"5fe8b5d823de4d09834868e2138dafcf8a6607576fe33a2245a4dcc66b706a0f"} Oct 09 15:21:19 crc kubenswrapper[4719]: I1009 15:21:19.774299 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-58bdp" event={"ID":"d00237ae-ca20-4202-8e24-e4988fbf5269","Type":"ContainerStarted","Data":"466f76eba4303468a773bed8e4b771065709d369d5e270107feafef8dde582fb"} Oct 09 15:21:19 crc kubenswrapper[4719]: I1009 15:21:19.814596 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-58bdp" podStartSLOduration=165.81457081 podStartE2EDuration="2m45.81457081s" podCreationTimestamp="2025-10-09 15:18:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:21:19.80990062 +0000 UTC m=+185.319611925" watchObservedRunningTime="2025-10-09 15:21:19.81457081 +0000 UTC m=+185.324282095" Oct 09 15:21:20 crc kubenswrapper[4719]: I1009 15:21:20.781583 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2pqrf" event={"ID":"9351b8b6-df12-4eee-9e51-ebc70c9da6bd","Type":"ContainerStarted","Data":"02ab9509066ba24318502afffae750725f0ec4a6fe0561e81874bcc5c32ccce9"} Oct 09 15:21:23 crc kubenswrapper[4719]: I1009 15:21:23.387290 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 15:21:23 crc kubenswrapper[4719]: I1009 15:21:23.403573 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2pqrf" podStartSLOduration=5.479979212 podStartE2EDuration="38.403555022s" podCreationTimestamp="2025-10-09 15:20:45 +0000 UTC" firstStartedPulling="2025-10-09 15:20:47.456495853 +0000 UTC m=+152.966207138" lastFinishedPulling="2025-10-09 
15:21:20.380071663 +0000 UTC m=+185.889782948" observedRunningTime="2025-10-09 15:21:20.79798662 +0000 UTC m=+186.307697925" watchObservedRunningTime="2025-10-09 15:21:23.403555022 +0000 UTC m=+188.913266327" Oct 09 15:21:26 crc kubenswrapper[4719]: I1009 15:21:26.038844 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2pqrf" Oct 09 15:21:26 crc kubenswrapper[4719]: I1009 15:21:26.039233 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2pqrf" Oct 09 15:21:26 crc kubenswrapper[4719]: I1009 15:21:26.184456 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2pqrf" Oct 09 15:21:26 crc kubenswrapper[4719]: I1009 15:21:26.854794 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2pqrf" Oct 09 15:21:27 crc kubenswrapper[4719]: I1009 15:21:27.392264 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2pqrf"] Oct 09 15:21:27 crc kubenswrapper[4719]: I1009 15:21:27.824800 4719 generic.go:334] "Generic (PLEG): container finished" podID="08d47ff5-80a6-4395-8481-2e7f1c2c1409" containerID="94567adc6b6d9802690c7ad5a5954f5d8baf4e91277c4077781542ca0f86219e" exitCode=0 Oct 09 15:21:27 crc kubenswrapper[4719]: I1009 15:21:27.825165 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-458dz" event={"ID":"08d47ff5-80a6-4395-8481-2e7f1c2c1409","Type":"ContainerDied","Data":"94567adc6b6d9802690c7ad5a5954f5d8baf4e91277c4077781542ca0f86219e"} Oct 09 15:21:28 crc kubenswrapper[4719]: I1009 15:21:28.831512 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-458dz" 
event={"ID":"08d47ff5-80a6-4395-8481-2e7f1c2c1409","Type":"ContainerStarted","Data":"612d3eced1905fbcd994b1a686a5888fbbbf3f88233a858114ba4a4c5d88658a"} Oct 09 15:21:28 crc kubenswrapper[4719]: I1009 15:21:28.833511 4719 generic.go:334] "Generic (PLEG): container finished" podID="b211e254-37fb-4252-a758-4d9e781ad03d" containerID="d872d5605abc16f22398760efee80b22e4e49b25e4db7191d618d356639f9b25" exitCode=0 Oct 09 15:21:28 crc kubenswrapper[4719]: I1009 15:21:28.833578 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bfn6d" event={"ID":"b211e254-37fb-4252-a758-4d9e781ad03d","Type":"ContainerDied","Data":"d872d5605abc16f22398760efee80b22e4e49b25e4db7191d618d356639f9b25"} Oct 09 15:21:28 crc kubenswrapper[4719]: I1009 15:21:28.833701 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2pqrf" podUID="9351b8b6-df12-4eee-9e51-ebc70c9da6bd" containerName="registry-server" containerID="cri-o://02ab9509066ba24318502afffae750725f0ec4a6fe0561e81874bcc5c32ccce9" gracePeriod=2 Oct 09 15:21:28 crc kubenswrapper[4719]: I1009 15:21:28.864217 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-458dz" podStartSLOduration=2.916529435 podStartE2EDuration="46.864198957s" podCreationTimestamp="2025-10-09 15:20:42 +0000 UTC" firstStartedPulling="2025-10-09 15:20:44.306874039 +0000 UTC m=+149.816585324" lastFinishedPulling="2025-10-09 15:21:28.254543551 +0000 UTC m=+193.764254846" observedRunningTime="2025-10-09 15:21:28.848523542 +0000 UTC m=+194.358234827" watchObservedRunningTime="2025-10-09 15:21:28.864198957 +0000 UTC m=+194.373910242" Oct 09 15:21:29 crc kubenswrapper[4719]: I1009 15:21:29.243023 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2pqrf" Oct 09 15:21:29 crc kubenswrapper[4719]: I1009 15:21:29.445115 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9351b8b6-df12-4eee-9e51-ebc70c9da6bd-catalog-content\") pod \"9351b8b6-df12-4eee-9e51-ebc70c9da6bd\" (UID: \"9351b8b6-df12-4eee-9e51-ebc70c9da6bd\") " Oct 09 15:21:29 crc kubenswrapper[4719]: I1009 15:21:29.445291 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9351b8b6-df12-4eee-9e51-ebc70c9da6bd-utilities\") pod \"9351b8b6-df12-4eee-9e51-ebc70c9da6bd\" (UID: \"9351b8b6-df12-4eee-9e51-ebc70c9da6bd\") " Oct 09 15:21:29 crc kubenswrapper[4719]: I1009 15:21:29.445381 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6kct\" (UniqueName: \"kubernetes.io/projected/9351b8b6-df12-4eee-9e51-ebc70c9da6bd-kube-api-access-r6kct\") pod \"9351b8b6-df12-4eee-9e51-ebc70c9da6bd\" (UID: \"9351b8b6-df12-4eee-9e51-ebc70c9da6bd\") " Oct 09 15:21:29 crc kubenswrapper[4719]: I1009 15:21:29.446224 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9351b8b6-df12-4eee-9e51-ebc70c9da6bd-utilities" (OuterVolumeSpecName: "utilities") pod "9351b8b6-df12-4eee-9e51-ebc70c9da6bd" (UID: "9351b8b6-df12-4eee-9e51-ebc70c9da6bd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:21:29 crc kubenswrapper[4719]: I1009 15:21:29.468195 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9351b8b6-df12-4eee-9e51-ebc70c9da6bd-kube-api-access-r6kct" (OuterVolumeSpecName: "kube-api-access-r6kct") pod "9351b8b6-df12-4eee-9e51-ebc70c9da6bd" (UID: "9351b8b6-df12-4eee-9e51-ebc70c9da6bd"). InnerVolumeSpecName "kube-api-access-r6kct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:21:29 crc kubenswrapper[4719]: I1009 15:21:29.546345 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9351b8b6-df12-4eee-9e51-ebc70c9da6bd-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 15:21:29 crc kubenswrapper[4719]: I1009 15:21:29.546403 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6kct\" (UniqueName: \"kubernetes.io/projected/9351b8b6-df12-4eee-9e51-ebc70c9da6bd-kube-api-access-r6kct\") on node \"crc\" DevicePath \"\"" Oct 09 15:21:29 crc kubenswrapper[4719]: I1009 15:21:29.547939 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9351b8b6-df12-4eee-9e51-ebc70c9da6bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9351b8b6-df12-4eee-9e51-ebc70c9da6bd" (UID: "9351b8b6-df12-4eee-9e51-ebc70c9da6bd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:21:29 crc kubenswrapper[4719]: I1009 15:21:29.647617 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9351b8b6-df12-4eee-9e51-ebc70c9da6bd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 15:21:29 crc kubenswrapper[4719]: I1009 15:21:29.840147 4719 generic.go:334] "Generic (PLEG): container finished" podID="9351b8b6-df12-4eee-9e51-ebc70c9da6bd" containerID="02ab9509066ba24318502afffae750725f0ec4a6fe0561e81874bcc5c32ccce9" exitCode=0 Oct 09 15:21:29 crc kubenswrapper[4719]: I1009 15:21:29.840247 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2pqrf" event={"ID":"9351b8b6-df12-4eee-9e51-ebc70c9da6bd","Type":"ContainerDied","Data":"02ab9509066ba24318502afffae750725f0ec4a6fe0561e81874bcc5c32ccce9"} Oct 09 15:21:29 crc kubenswrapper[4719]: I1009 15:21:29.840280 4719 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-2pqrf" event={"ID":"9351b8b6-df12-4eee-9e51-ebc70c9da6bd","Type":"ContainerDied","Data":"67894cc0aae515b82a40c2ad85b686bbd54b0cdf6713349858bae47133775a5f"} Oct 09 15:21:29 crc kubenswrapper[4719]: I1009 15:21:29.840289 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2pqrf" Oct 09 15:21:29 crc kubenswrapper[4719]: I1009 15:21:29.840300 4719 scope.go:117] "RemoveContainer" containerID="02ab9509066ba24318502afffae750725f0ec4a6fe0561e81874bcc5c32ccce9" Oct 09 15:21:29 crc kubenswrapper[4719]: I1009 15:21:29.843360 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bfn6d" event={"ID":"b211e254-37fb-4252-a758-4d9e781ad03d","Type":"ContainerStarted","Data":"2981648ef9c1fc526f0978e495f3c9ab6d3083acce6bc7497a6c05e1fdb67121"} Oct 09 15:21:29 crc kubenswrapper[4719]: I1009 15:21:29.855548 4719 scope.go:117] "RemoveContainer" containerID="8100623dc56b84e9afd506ba8e1f49ee3d68cd37c4e46e453c7a0d3ceb86e491" Oct 09 15:21:29 crc kubenswrapper[4719]: I1009 15:21:29.867590 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bfn6d" podStartSLOduration=2.00344701 podStartE2EDuration="45.86757057s" podCreationTimestamp="2025-10-09 15:20:44 +0000 UTC" firstStartedPulling="2025-10-09 15:20:45.379967439 +0000 UTC m=+150.889678724" lastFinishedPulling="2025-10-09 15:21:29.244090999 +0000 UTC m=+194.753802284" observedRunningTime="2025-10-09 15:21:29.864825951 +0000 UTC m=+195.374537256" watchObservedRunningTime="2025-10-09 15:21:29.86757057 +0000 UTC m=+195.377281855" Oct 09 15:21:29 crc kubenswrapper[4719]: I1009 15:21:29.868949 4719 scope.go:117] "RemoveContainer" containerID="d8f8e2982b9ceaf6a35f6b707b8f942ccf3e5e38aeb6c03b7c89d29899a9131c" Oct 09 15:21:29 crc kubenswrapper[4719]: I1009 15:21:29.883037 4719 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/redhat-operators-2pqrf"] Oct 09 15:21:29 crc kubenswrapper[4719]: I1009 15:21:29.890467 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2pqrf"] Oct 09 15:21:29 crc kubenswrapper[4719]: I1009 15:21:29.897247 4719 scope.go:117] "RemoveContainer" containerID="02ab9509066ba24318502afffae750725f0ec4a6fe0561e81874bcc5c32ccce9" Oct 09 15:21:29 crc kubenswrapper[4719]: E1009 15:21:29.897705 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02ab9509066ba24318502afffae750725f0ec4a6fe0561e81874bcc5c32ccce9\": container with ID starting with 02ab9509066ba24318502afffae750725f0ec4a6fe0561e81874bcc5c32ccce9 not found: ID does not exist" containerID="02ab9509066ba24318502afffae750725f0ec4a6fe0561e81874bcc5c32ccce9" Oct 09 15:21:29 crc kubenswrapper[4719]: I1009 15:21:29.897740 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02ab9509066ba24318502afffae750725f0ec4a6fe0561e81874bcc5c32ccce9"} err="failed to get container status \"02ab9509066ba24318502afffae750725f0ec4a6fe0561e81874bcc5c32ccce9\": rpc error: code = NotFound desc = could not find container \"02ab9509066ba24318502afffae750725f0ec4a6fe0561e81874bcc5c32ccce9\": container with ID starting with 02ab9509066ba24318502afffae750725f0ec4a6fe0561e81874bcc5c32ccce9 not found: ID does not exist" Oct 09 15:21:29 crc kubenswrapper[4719]: I1009 15:21:29.897779 4719 scope.go:117] "RemoveContainer" containerID="8100623dc56b84e9afd506ba8e1f49ee3d68cd37c4e46e453c7a0d3ceb86e491" Oct 09 15:21:29 crc kubenswrapper[4719]: E1009 15:21:29.898059 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8100623dc56b84e9afd506ba8e1f49ee3d68cd37c4e46e453c7a0d3ceb86e491\": container with ID starting with 
8100623dc56b84e9afd506ba8e1f49ee3d68cd37c4e46e453c7a0d3ceb86e491 not found: ID does not exist" containerID="8100623dc56b84e9afd506ba8e1f49ee3d68cd37c4e46e453c7a0d3ceb86e491" Oct 09 15:21:29 crc kubenswrapper[4719]: I1009 15:21:29.898083 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8100623dc56b84e9afd506ba8e1f49ee3d68cd37c4e46e453c7a0d3ceb86e491"} err="failed to get container status \"8100623dc56b84e9afd506ba8e1f49ee3d68cd37c4e46e453c7a0d3ceb86e491\": rpc error: code = NotFound desc = could not find container \"8100623dc56b84e9afd506ba8e1f49ee3d68cd37c4e46e453c7a0d3ceb86e491\": container with ID starting with 8100623dc56b84e9afd506ba8e1f49ee3d68cd37c4e46e453c7a0d3ceb86e491 not found: ID does not exist" Oct 09 15:21:29 crc kubenswrapper[4719]: I1009 15:21:29.898097 4719 scope.go:117] "RemoveContainer" containerID="d8f8e2982b9ceaf6a35f6b707b8f942ccf3e5e38aeb6c03b7c89d29899a9131c" Oct 09 15:21:29 crc kubenswrapper[4719]: E1009 15:21:29.898408 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8f8e2982b9ceaf6a35f6b707b8f942ccf3e5e38aeb6c03b7c89d29899a9131c\": container with ID starting with d8f8e2982b9ceaf6a35f6b707b8f942ccf3e5e38aeb6c03b7c89d29899a9131c not found: ID does not exist" containerID="d8f8e2982b9ceaf6a35f6b707b8f942ccf3e5e38aeb6c03b7c89d29899a9131c" Oct 09 15:21:29 crc kubenswrapper[4719]: I1009 15:21:29.898428 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8f8e2982b9ceaf6a35f6b707b8f942ccf3e5e38aeb6c03b7c89d29899a9131c"} err="failed to get container status \"d8f8e2982b9ceaf6a35f6b707b8f942ccf3e5e38aeb6c03b7c89d29899a9131c\": rpc error: code = NotFound desc = could not find container \"d8f8e2982b9ceaf6a35f6b707b8f942ccf3e5e38aeb6c03b7c89d29899a9131c\": container with ID starting with d8f8e2982b9ceaf6a35f6b707b8f942ccf3e5e38aeb6c03b7c89d29899a9131c not found: ID does not 
exist" Oct 09 15:21:31 crc kubenswrapper[4719]: I1009 15:21:31.167886 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9351b8b6-df12-4eee-9e51-ebc70c9da6bd" path="/var/lib/kubelet/pods/9351b8b6-df12-4eee-9e51-ebc70c9da6bd/volumes" Oct 09 15:21:31 crc kubenswrapper[4719]: I1009 15:21:31.862808 4719 generic.go:334] "Generic (PLEG): container finished" podID="84e55955-c37c-4897-ab18-f71812f3ccff" containerID="144d2e281045c08d389c86154c7c4d11fcfc332bdb04e8eb4662ed80a07c38d2" exitCode=0 Oct 09 15:21:31 crc kubenswrapper[4719]: I1009 15:21:31.862855 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nzwhh" event={"ID":"84e55955-c37c-4897-ab18-f71812f3ccff","Type":"ContainerDied","Data":"144d2e281045c08d389c86154c7c4d11fcfc332bdb04e8eb4662ed80a07c38d2"} Oct 09 15:21:32 crc kubenswrapper[4719]: I1009 15:21:32.645917 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-458dz" Oct 09 15:21:32 crc kubenswrapper[4719]: I1009 15:21:32.646282 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-458dz" Oct 09 15:21:32 crc kubenswrapper[4719]: I1009 15:21:32.697064 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-458dz" Oct 09 15:21:32 crc kubenswrapper[4719]: I1009 15:21:32.872865 4719 generic.go:334] "Generic (PLEG): container finished" podID="7c3c0e14-9adc-47ec-af46-d0779a6c6e1d" containerID="919abbdd037c8084005d7c59d57a89258c35f7a2717371554035ccf4e77cbc31" exitCode=0 Oct 09 15:21:32 crc kubenswrapper[4719]: I1009 15:21:32.872958 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jr2j6" event={"ID":"7c3c0e14-9adc-47ec-af46-d0779a6c6e1d","Type":"ContainerDied","Data":"919abbdd037c8084005d7c59d57a89258c35f7a2717371554035ccf4e77cbc31"} Oct 09 15:21:32 crc 
kubenswrapper[4719]: I1009 15:21:32.876108 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nzwhh" event={"ID":"84e55955-c37c-4897-ab18-f71812f3ccff","Type":"ContainerStarted","Data":"adc204002ebc9268cee537b1aeb8b8104b65f70ed4d636a9b367993ea02baa12"} Oct 09 15:21:32 crc kubenswrapper[4719]: I1009 15:21:32.908955 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nzwhh" podStartSLOduration=2.008227202 podStartE2EDuration="48.908911742s" podCreationTimestamp="2025-10-09 15:20:44 +0000 UTC" firstStartedPulling="2025-10-09 15:20:45.405907085 +0000 UTC m=+150.915618370" lastFinishedPulling="2025-10-09 15:21:32.306591625 +0000 UTC m=+197.816302910" observedRunningTime="2025-10-09 15:21:32.90725593 +0000 UTC m=+198.416967225" watchObservedRunningTime="2025-10-09 15:21:32.908911742 +0000 UTC m=+198.418623027" Oct 09 15:21:33 crc kubenswrapper[4719]: I1009 15:21:33.882097 4719 generic.go:334] "Generic (PLEG): container finished" podID="b6e7a7e4-2eff-468f-b764-b1b73c4285f4" containerID="611ba69f67f3b6eeb3f4549747aa5927ed93f01e6790af84e756d8cd2935ea69" exitCode=0 Oct 09 15:21:33 crc kubenswrapper[4719]: I1009 15:21:33.882190 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wb65c" event={"ID":"b6e7a7e4-2eff-468f-b764-b1b73c4285f4","Type":"ContainerDied","Data":"611ba69f67f3b6eeb3f4549747aa5927ed93f01e6790af84e756d8cd2935ea69"} Oct 09 15:21:33 crc kubenswrapper[4719]: I1009 15:21:33.885385 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jr2j6" event={"ID":"7c3c0e14-9adc-47ec-af46-d0779a6c6e1d","Type":"ContainerStarted","Data":"7039f8c88601d4a88d9d371afa891fd3878f7f9dda15e8bd5c979b1e6cbfc7a3"} Oct 09 15:21:33 crc kubenswrapper[4719]: I1009 15:21:33.920566 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-jr2j6" podStartSLOduration=2.9541476749999998 podStartE2EDuration="51.920551247s" podCreationTimestamp="2025-10-09 15:20:42 +0000 UTC" firstStartedPulling="2025-10-09 15:20:44.291286057 +0000 UTC m=+149.800997342" lastFinishedPulling="2025-10-09 15:21:33.257689629 +0000 UTC m=+198.767400914" observedRunningTime="2025-10-09 15:21:33.918230946 +0000 UTC m=+199.427942241" watchObservedRunningTime="2025-10-09 15:21:33.920551247 +0000 UTC m=+199.430262532" Oct 09 15:21:34 crc kubenswrapper[4719]: I1009 15:21:34.437579 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nzwhh" Oct 09 15:21:34 crc kubenswrapper[4719]: I1009 15:21:34.437915 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nzwhh" Oct 09 15:21:34 crc kubenswrapper[4719]: I1009 15:21:34.475112 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nzwhh" Oct 09 15:21:34 crc kubenswrapper[4719]: I1009 15:21:34.848251 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bfn6d" Oct 09 15:21:34 crc kubenswrapper[4719]: I1009 15:21:34.848594 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bfn6d" Oct 09 15:21:34 crc kubenswrapper[4719]: I1009 15:21:34.888585 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bfn6d" Oct 09 15:21:34 crc kubenswrapper[4719]: I1009 15:21:34.893013 4719 generic.go:334] "Generic (PLEG): container finished" podID="ed97f513-40b6-4273-b6a5-9f9f5150e4cd" containerID="cd87fad9adbd89f7c930c17c5ad320a2c9c102c74d806bf2d53871e524528dbb" exitCode=0 Oct 09 15:21:34 crc kubenswrapper[4719]: I1009 15:21:34.893086 4719 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-8726g" event={"ID":"ed97f513-40b6-4273-b6a5-9f9f5150e4cd","Type":"ContainerDied","Data":"cd87fad9adbd89f7c930c17c5ad320a2c9c102c74d806bf2d53871e524528dbb"} Oct 09 15:21:34 crc kubenswrapper[4719]: I1009 15:21:34.895295 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wb65c" event={"ID":"b6e7a7e4-2eff-468f-b764-b1b73c4285f4","Type":"ContainerStarted","Data":"32ff36211d45cf9cf5a038c68ff2bc0c092df0024cb9b1a209c03c73f076a57e"} Oct 09 15:21:34 crc kubenswrapper[4719]: I1009 15:21:34.942087 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bfn6d" Oct 09 15:21:34 crc kubenswrapper[4719]: I1009 15:21:34.953196 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wb65c" podStartSLOduration=2.965379397 podStartE2EDuration="52.953117376s" podCreationTimestamp="2025-10-09 15:20:42 +0000 UTC" firstStartedPulling="2025-10-09 15:20:44.295947127 +0000 UTC m=+149.805658412" lastFinishedPulling="2025-10-09 15:21:34.283685106 +0000 UTC m=+199.793396391" observedRunningTime="2025-10-09 15:21:34.949926339 +0000 UTC m=+200.459637634" watchObservedRunningTime="2025-10-09 15:21:34.953117376 +0000 UTC m=+200.462828691" Oct 09 15:21:35 crc kubenswrapper[4719]: I1009 15:21:35.593734 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bfn6d"] Oct 09 15:21:35 crc kubenswrapper[4719]: I1009 15:21:35.902398 4719 generic.go:334] "Generic (PLEG): container finished" podID="f662f8ad-fe9b-40c9-845e-8f2749a6482d" containerID="ce55e4ace64cf0fbddbf1d92bb507aa29f2e15c65403636b84bc5dca3aab536d" exitCode=0 Oct 09 15:21:35 crc kubenswrapper[4719]: I1009 15:21:35.902472 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jrqj4" 
event={"ID":"f662f8ad-fe9b-40c9-845e-8f2749a6482d","Type":"ContainerDied","Data":"ce55e4ace64cf0fbddbf1d92bb507aa29f2e15c65403636b84bc5dca3aab536d"} Oct 09 15:21:35 crc kubenswrapper[4719]: I1009 15:21:35.905025 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8726g" event={"ID":"ed97f513-40b6-4273-b6a5-9f9f5150e4cd","Type":"ContainerStarted","Data":"3a5e42928051c0b346e579953af6446026583140155d9a9da6bd8476093e7c75"} Oct 09 15:21:35 crc kubenswrapper[4719]: I1009 15:21:35.940418 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8726g" podStartSLOduration=2.918087889 podStartE2EDuration="53.940400184s" podCreationTimestamp="2025-10-09 15:20:42 +0000 UTC" firstStartedPulling="2025-10-09 15:20:44.305069022 +0000 UTC m=+149.814780307" lastFinishedPulling="2025-10-09 15:21:35.327381317 +0000 UTC m=+200.837092602" observedRunningTime="2025-10-09 15:21:35.938768053 +0000 UTC m=+201.448479348" watchObservedRunningTime="2025-10-09 15:21:35.940400184 +0000 UTC m=+201.450111459" Oct 09 15:21:36 crc kubenswrapper[4719]: I1009 15:21:36.911735 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jrqj4" event={"ID":"f662f8ad-fe9b-40c9-845e-8f2749a6482d","Type":"ContainerStarted","Data":"eb8deadd96a6798c61c0933c513d1ca3d045a55a795f8752f92c1952d5c3b99e"} Oct 09 15:21:36 crc kubenswrapper[4719]: I1009 15:21:36.911869 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bfn6d" podUID="b211e254-37fb-4252-a758-4d9e781ad03d" containerName="registry-server" containerID="cri-o://2981648ef9c1fc526f0978e495f3c9ab6d3083acce6bc7497a6c05e1fdb67121" gracePeriod=2 Oct 09 15:21:36 crc kubenswrapper[4719]: I1009 15:21:36.977713 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 15:21:36 crc kubenswrapper[4719]: I1009 15:21:36.977774 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 15:21:36 crc kubenswrapper[4719]: I1009 15:21:36.977823 4719 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" Oct 09 15:21:36 crc kubenswrapper[4719]: I1009 15:21:36.978460 4719 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b8b3908283c24f180df8f6a04d52c46e7252cdfd4f0587f7cccf3e9a0f37127a"} pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 15:21:36 crc kubenswrapper[4719]: I1009 15:21:36.978523 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" containerID="cri-o://b8b3908283c24f180df8f6a04d52c46e7252cdfd4f0587f7cccf3e9a0f37127a" gracePeriod=600 Oct 09 15:21:37 crc kubenswrapper[4719]: I1009 15:21:37.658997 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jrqj4" podStartSLOduration=3.896812881 podStartE2EDuration="52.658979392s" podCreationTimestamp="2025-10-09 15:20:45 +0000 UTC" firstStartedPulling="2025-10-09 15:20:47.523344747 +0000 UTC 
m=+153.033056032" lastFinishedPulling="2025-10-09 15:21:36.285511258 +0000 UTC m=+201.795222543" observedRunningTime="2025-10-09 15:21:36.935943115 +0000 UTC m=+202.445654410" watchObservedRunningTime="2025-10-09 15:21:37.658979392 +0000 UTC m=+203.168690677" Oct 09 15:21:37 crc kubenswrapper[4719]: I1009 15:21:37.661060 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-db9tz"] Oct 09 15:21:37 crc kubenswrapper[4719]: I1009 15:21:37.936342 4719 generic.go:334] "Generic (PLEG): container finished" podID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerID="b8b3908283c24f180df8f6a04d52c46e7252cdfd4f0587f7cccf3e9a0f37127a" exitCode=0 Oct 09 15:21:37 crc kubenswrapper[4719]: I1009 15:21:37.936394 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerDied","Data":"b8b3908283c24f180df8f6a04d52c46e7252cdfd4f0587f7cccf3e9a0f37127a"} Oct 09 15:21:37 crc kubenswrapper[4719]: I1009 15:21:37.940705 4719 generic.go:334] "Generic (PLEG): container finished" podID="b211e254-37fb-4252-a758-4d9e781ad03d" containerID="2981648ef9c1fc526f0978e495f3c9ab6d3083acce6bc7497a6c05e1fdb67121" exitCode=0 Oct 09 15:21:37 crc kubenswrapper[4719]: I1009 15:21:37.940744 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bfn6d" event={"ID":"b211e254-37fb-4252-a758-4d9e781ad03d","Type":"ContainerDied","Data":"2981648ef9c1fc526f0978e495f3c9ab6d3083acce6bc7497a6c05e1fdb67121"} Oct 09 15:21:38 crc kubenswrapper[4719]: I1009 15:21:38.254715 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bfn6d" Oct 09 15:21:38 crc kubenswrapper[4719]: I1009 15:21:38.287752 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f69vv\" (UniqueName: \"kubernetes.io/projected/b211e254-37fb-4252-a758-4d9e781ad03d-kube-api-access-f69vv\") pod \"b211e254-37fb-4252-a758-4d9e781ad03d\" (UID: \"b211e254-37fb-4252-a758-4d9e781ad03d\") " Oct 09 15:21:38 crc kubenswrapper[4719]: I1009 15:21:38.287808 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b211e254-37fb-4252-a758-4d9e781ad03d-utilities\") pod \"b211e254-37fb-4252-a758-4d9e781ad03d\" (UID: \"b211e254-37fb-4252-a758-4d9e781ad03d\") " Oct 09 15:21:38 crc kubenswrapper[4719]: I1009 15:21:38.287849 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b211e254-37fb-4252-a758-4d9e781ad03d-catalog-content\") pod \"b211e254-37fb-4252-a758-4d9e781ad03d\" (UID: \"b211e254-37fb-4252-a758-4d9e781ad03d\") " Oct 09 15:21:38 crc kubenswrapper[4719]: I1009 15:21:38.288730 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b211e254-37fb-4252-a758-4d9e781ad03d-utilities" (OuterVolumeSpecName: "utilities") pod "b211e254-37fb-4252-a758-4d9e781ad03d" (UID: "b211e254-37fb-4252-a758-4d9e781ad03d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:21:38 crc kubenswrapper[4719]: I1009 15:21:38.292960 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b211e254-37fb-4252-a758-4d9e781ad03d-kube-api-access-f69vv" (OuterVolumeSpecName: "kube-api-access-f69vv") pod "b211e254-37fb-4252-a758-4d9e781ad03d" (UID: "b211e254-37fb-4252-a758-4d9e781ad03d"). InnerVolumeSpecName "kube-api-access-f69vv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:21:38 crc kubenswrapper[4719]: I1009 15:21:38.312314 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b211e254-37fb-4252-a758-4d9e781ad03d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b211e254-37fb-4252-a758-4d9e781ad03d" (UID: "b211e254-37fb-4252-a758-4d9e781ad03d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:21:38 crc kubenswrapper[4719]: I1009 15:21:38.389478 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b211e254-37fb-4252-a758-4d9e781ad03d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 15:21:38 crc kubenswrapper[4719]: I1009 15:21:38.389798 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f69vv\" (UniqueName: \"kubernetes.io/projected/b211e254-37fb-4252-a758-4d9e781ad03d-kube-api-access-f69vv\") on node \"crc\" DevicePath \"\"" Oct 09 15:21:38 crc kubenswrapper[4719]: I1009 15:21:38.389811 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b211e254-37fb-4252-a758-4d9e781ad03d-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 15:21:38 crc kubenswrapper[4719]: I1009 15:21:38.946544 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerStarted","Data":"58630cc589d6ba8e40a40e1e3c93cc21531a1b6e5470575e2e8a4d654789d22a"} Oct 09 15:21:38 crc kubenswrapper[4719]: I1009 15:21:38.950087 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bfn6d" event={"ID":"b211e254-37fb-4252-a758-4d9e781ad03d","Type":"ContainerDied","Data":"196ed975c6618f7b995124c26cd7280349fdff5e91afcd03715906d0c6283d35"} Oct 09 15:21:38 crc 
kubenswrapper[4719]: I1009 15:21:38.950140 4719 scope.go:117] "RemoveContainer" containerID="2981648ef9c1fc526f0978e495f3c9ab6d3083acce6bc7497a6c05e1fdb67121" Oct 09 15:21:38 crc kubenswrapper[4719]: I1009 15:21:38.950174 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bfn6d" Oct 09 15:21:38 crc kubenswrapper[4719]: I1009 15:21:38.974511 4719 scope.go:117] "RemoveContainer" containerID="d872d5605abc16f22398760efee80b22e4e49b25e4db7191d618d356639f9b25" Oct 09 15:21:38 crc kubenswrapper[4719]: I1009 15:21:38.980878 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bfn6d"] Oct 09 15:21:38 crc kubenswrapper[4719]: I1009 15:21:38.984257 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bfn6d"] Oct 09 15:21:39 crc kubenswrapper[4719]: I1009 15:21:39.019485 4719 scope.go:117] "RemoveContainer" containerID="cc0667b7909440c57773bb7b878d9e0e0c5ecead750c07918fdf3f77cd05b7f9" Oct 09 15:21:39 crc kubenswrapper[4719]: I1009 15:21:39.169007 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b211e254-37fb-4252-a758-4d9e781ad03d" path="/var/lib/kubelet/pods/b211e254-37fb-4252-a758-4d9e781ad03d/volumes" Oct 09 15:21:42 crc kubenswrapper[4719]: I1009 15:21:42.495621 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8726g" Oct 09 15:21:42 crc kubenswrapper[4719]: I1009 15:21:42.495829 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8726g" Oct 09 15:21:42 crc kubenswrapper[4719]: I1009 15:21:42.550098 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8726g" Oct 09 15:21:42 crc kubenswrapper[4719]: I1009 15:21:42.682181 4719 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-458dz" Oct 09 15:21:42 crc kubenswrapper[4719]: I1009 15:21:42.877697 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jr2j6" Oct 09 15:21:42 crc kubenswrapper[4719]: I1009 15:21:42.877974 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jr2j6" Oct 09 15:21:42 crc kubenswrapper[4719]: I1009 15:21:42.919318 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jr2j6" Oct 09 15:21:43 crc kubenswrapper[4719]: I1009 15:21:43.010699 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8726g" Oct 09 15:21:43 crc kubenswrapper[4719]: I1009 15:21:43.010769 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jr2j6" Oct 09 15:21:43 crc kubenswrapper[4719]: I1009 15:21:43.091524 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wb65c" Oct 09 15:21:43 crc kubenswrapper[4719]: I1009 15:21:43.092221 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wb65c" Oct 09 15:21:43 crc kubenswrapper[4719]: I1009 15:21:43.134831 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wb65c" Oct 09 15:21:44 crc kubenswrapper[4719]: I1009 15:21:44.009995 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wb65c" Oct 09 15:21:44 crc kubenswrapper[4719]: I1009 15:21:44.474228 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nzwhh" Oct 09 
15:21:44 crc kubenswrapper[4719]: I1009 15:21:44.791269 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jr2j6"] Oct 09 15:21:44 crc kubenswrapper[4719]: I1009 15:21:44.995607 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wb65c"] Oct 09 15:21:45 crc kubenswrapper[4719]: I1009 15:21:45.727617 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jrqj4" Oct 09 15:21:45 crc kubenswrapper[4719]: I1009 15:21:45.727923 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jrqj4" Oct 09 15:21:45 crc kubenswrapper[4719]: I1009 15:21:45.771411 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jrqj4" Oct 09 15:21:45 crc kubenswrapper[4719]: I1009 15:21:45.994612 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jr2j6" podUID="7c3c0e14-9adc-47ec-af46-d0779a6c6e1d" containerName="registry-server" containerID="cri-o://7039f8c88601d4a88d9d371afa891fd3878f7f9dda15e8bd5c979b1e6cbfc7a3" gracePeriod=2 Oct 09 15:21:46 crc kubenswrapper[4719]: I1009 15:21:46.042797 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jrqj4" Oct 09 15:21:47 crc kubenswrapper[4719]: I1009 15:21:47.001371 4719 generic.go:334] "Generic (PLEG): container finished" podID="7c3c0e14-9adc-47ec-af46-d0779a6c6e1d" containerID="7039f8c88601d4a88d9d371afa891fd3878f7f9dda15e8bd5c979b1e6cbfc7a3" exitCode=0 Oct 09 15:21:47 crc kubenswrapper[4719]: I1009 15:21:47.001388 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jr2j6" 
event={"ID":"7c3c0e14-9adc-47ec-af46-d0779a6c6e1d","Type":"ContainerDied","Data":"7039f8c88601d4a88d9d371afa891fd3878f7f9dda15e8bd5c979b1e6cbfc7a3"} Oct 09 15:21:47 crc kubenswrapper[4719]: I1009 15:21:47.001815 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wb65c" podUID="b6e7a7e4-2eff-468f-b764-b1b73c4285f4" containerName="registry-server" containerID="cri-o://32ff36211d45cf9cf5a038c68ff2bc0c092df0024cb9b1a209c03c73f076a57e" gracePeriod=2 Oct 09 15:21:47 crc kubenswrapper[4719]: I1009 15:21:47.256140 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jr2j6" Oct 09 15:21:47 crc kubenswrapper[4719]: I1009 15:21:47.300767 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c3c0e14-9adc-47ec-af46-d0779a6c6e1d-catalog-content\") pod \"7c3c0e14-9adc-47ec-af46-d0779a6c6e1d\" (UID: \"7c3c0e14-9adc-47ec-af46-d0779a6c6e1d\") " Oct 09 15:21:47 crc kubenswrapper[4719]: I1009 15:21:47.300836 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72pjn\" (UniqueName: \"kubernetes.io/projected/7c3c0e14-9adc-47ec-af46-d0779a6c6e1d-kube-api-access-72pjn\") pod \"7c3c0e14-9adc-47ec-af46-d0779a6c6e1d\" (UID: \"7c3c0e14-9adc-47ec-af46-d0779a6c6e1d\") " Oct 09 15:21:47 crc kubenswrapper[4719]: I1009 15:21:47.300898 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c3c0e14-9adc-47ec-af46-d0779a6c6e1d-utilities\") pod \"7c3c0e14-9adc-47ec-af46-d0779a6c6e1d\" (UID: \"7c3c0e14-9adc-47ec-af46-d0779a6c6e1d\") " Oct 09 15:21:47 crc kubenswrapper[4719]: I1009 15:21:47.302151 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c3c0e14-9adc-47ec-af46-d0779a6c6e1d-utilities" 
(OuterVolumeSpecName: "utilities") pod "7c3c0e14-9adc-47ec-af46-d0779a6c6e1d" (UID: "7c3c0e14-9adc-47ec-af46-d0779a6c6e1d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:21:47 crc kubenswrapper[4719]: I1009 15:21:47.307227 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c3c0e14-9adc-47ec-af46-d0779a6c6e1d-kube-api-access-72pjn" (OuterVolumeSpecName: "kube-api-access-72pjn") pod "7c3c0e14-9adc-47ec-af46-d0779a6c6e1d" (UID: "7c3c0e14-9adc-47ec-af46-d0779a6c6e1d"). InnerVolumeSpecName "kube-api-access-72pjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:21:47 crc kubenswrapper[4719]: I1009 15:21:47.358622 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c3c0e14-9adc-47ec-af46-d0779a6c6e1d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c3c0e14-9adc-47ec-af46-d0779a6c6e1d" (UID: "7c3c0e14-9adc-47ec-af46-d0779a6c6e1d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:21:47 crc kubenswrapper[4719]: I1009 15:21:47.402569 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c3c0e14-9adc-47ec-af46-d0779a6c6e1d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 15:21:47 crc kubenswrapper[4719]: I1009 15:21:47.402603 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72pjn\" (UniqueName: \"kubernetes.io/projected/7c3c0e14-9adc-47ec-af46-d0779a6c6e1d-kube-api-access-72pjn\") on node \"crc\" DevicePath \"\"" Oct 09 15:21:47 crc kubenswrapper[4719]: I1009 15:21:47.402615 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c3c0e14-9adc-47ec-af46-d0779a6c6e1d-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 15:21:47 crc kubenswrapper[4719]: I1009 15:21:47.827841 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wb65c" Oct 09 15:21:47 crc kubenswrapper[4719]: I1009 15:21:47.907299 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6e7a7e4-2eff-468f-b764-b1b73c4285f4-catalog-content\") pod \"b6e7a7e4-2eff-468f-b764-b1b73c4285f4\" (UID: \"b6e7a7e4-2eff-468f-b764-b1b73c4285f4\") " Oct 09 15:21:47 crc kubenswrapper[4719]: I1009 15:21:47.907451 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrv4p\" (UniqueName: \"kubernetes.io/projected/b6e7a7e4-2eff-468f-b764-b1b73c4285f4-kube-api-access-nrv4p\") pod \"b6e7a7e4-2eff-468f-b764-b1b73c4285f4\" (UID: \"b6e7a7e4-2eff-468f-b764-b1b73c4285f4\") " Oct 09 15:21:47 crc kubenswrapper[4719]: I1009 15:21:47.907510 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b6e7a7e4-2eff-468f-b764-b1b73c4285f4-utilities\") pod \"b6e7a7e4-2eff-468f-b764-b1b73c4285f4\" (UID: \"b6e7a7e4-2eff-468f-b764-b1b73c4285f4\") " Oct 09 15:21:47 crc kubenswrapper[4719]: I1009 15:21:47.908515 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6e7a7e4-2eff-468f-b764-b1b73c4285f4-utilities" (OuterVolumeSpecName: "utilities") pod "b6e7a7e4-2eff-468f-b764-b1b73c4285f4" (UID: "b6e7a7e4-2eff-468f-b764-b1b73c4285f4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:21:47 crc kubenswrapper[4719]: I1009 15:21:47.911985 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6e7a7e4-2eff-468f-b764-b1b73c4285f4-kube-api-access-nrv4p" (OuterVolumeSpecName: "kube-api-access-nrv4p") pod "b6e7a7e4-2eff-468f-b764-b1b73c4285f4" (UID: "b6e7a7e4-2eff-468f-b764-b1b73c4285f4"). InnerVolumeSpecName "kube-api-access-nrv4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:21:47 crc kubenswrapper[4719]: I1009 15:21:47.960726 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6e7a7e4-2eff-468f-b764-b1b73c4285f4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b6e7a7e4-2eff-468f-b764-b1b73c4285f4" (UID: "b6e7a7e4-2eff-468f-b764-b1b73c4285f4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:21:48 crc kubenswrapper[4719]: I1009 15:21:48.008251 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jr2j6" event={"ID":"7c3c0e14-9adc-47ec-af46-d0779a6c6e1d","Type":"ContainerDied","Data":"12ed7ebd8cf2758f73be552d9e34b09f140cb28f7e283d91595ff06451197635"} Oct 09 15:21:48 crc kubenswrapper[4719]: I1009 15:21:48.008303 4719 scope.go:117] "RemoveContainer" containerID="7039f8c88601d4a88d9d371afa891fd3878f7f9dda15e8bd5c979b1e6cbfc7a3" Oct 09 15:21:48 crc kubenswrapper[4719]: I1009 15:21:48.008302 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jr2j6" Oct 09 15:21:48 crc kubenswrapper[4719]: I1009 15:21:48.009234 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6e7a7e4-2eff-468f-b764-b1b73c4285f4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 15:21:48 crc kubenswrapper[4719]: I1009 15:21:48.009279 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrv4p\" (UniqueName: \"kubernetes.io/projected/b6e7a7e4-2eff-468f-b764-b1b73c4285f4-kube-api-access-nrv4p\") on node \"crc\" DevicePath \"\"" Oct 09 15:21:48 crc kubenswrapper[4719]: I1009 15:21:48.009291 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6e7a7e4-2eff-468f-b764-b1b73c4285f4-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 15:21:48 crc kubenswrapper[4719]: I1009 15:21:48.011020 4719 generic.go:334] "Generic (PLEG): container finished" podID="b6e7a7e4-2eff-468f-b764-b1b73c4285f4" containerID="32ff36211d45cf9cf5a038c68ff2bc0c092df0024cb9b1a209c03c73f076a57e" exitCode=0 Oct 09 15:21:48 crc kubenswrapper[4719]: I1009 15:21:48.011052 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wb65c" 
event={"ID":"b6e7a7e4-2eff-468f-b764-b1b73c4285f4","Type":"ContainerDied","Data":"32ff36211d45cf9cf5a038c68ff2bc0c092df0024cb9b1a209c03c73f076a57e"} Oct 09 15:21:48 crc kubenswrapper[4719]: I1009 15:21:48.011080 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wb65c" event={"ID":"b6e7a7e4-2eff-468f-b764-b1b73c4285f4","Type":"ContainerDied","Data":"b0b4e21f6ba6656a3000556c4340e99265dc16bfd49348c38df76e423dcf3441"} Oct 09 15:21:48 crc kubenswrapper[4719]: I1009 15:21:48.011135 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wb65c" Oct 09 15:21:48 crc kubenswrapper[4719]: I1009 15:21:48.023276 4719 scope.go:117] "RemoveContainer" containerID="919abbdd037c8084005d7c59d57a89258c35f7a2717371554035ccf4e77cbc31" Oct 09 15:21:48 crc kubenswrapper[4719]: I1009 15:21:48.035160 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jr2j6"] Oct 09 15:21:48 crc kubenswrapper[4719]: I1009 15:21:48.038155 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jr2j6"] Oct 09 15:21:48 crc kubenswrapper[4719]: I1009 15:21:48.040978 4719 scope.go:117] "RemoveContainer" containerID="951425d60be53159e62068ddf26998b12dadc7c9dba4747729e3d9c09a357faf" Oct 09 15:21:48 crc kubenswrapper[4719]: I1009 15:21:48.088093 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wb65c"] Oct 09 15:21:48 crc kubenswrapper[4719]: I1009 15:21:48.090794 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wb65c"] Oct 09 15:21:48 crc kubenswrapper[4719]: I1009 15:21:48.100268 4719 scope.go:117] "RemoveContainer" containerID="32ff36211d45cf9cf5a038c68ff2bc0c092df0024cb9b1a209c03c73f076a57e" Oct 09 15:21:48 crc kubenswrapper[4719]: I1009 15:21:48.111637 4719 scope.go:117] "RemoveContainer" 
containerID="611ba69f67f3b6eeb3f4549747aa5927ed93f01e6790af84e756d8cd2935ea69" Oct 09 15:21:48 crc kubenswrapper[4719]: I1009 15:21:48.133694 4719 scope.go:117] "RemoveContainer" containerID="202fcfe5e5b21b7f8e8b23374809fb626392a26b3ee94912a17717b3cb138518" Oct 09 15:21:48 crc kubenswrapper[4719]: I1009 15:21:48.151954 4719 scope.go:117] "RemoveContainer" containerID="32ff36211d45cf9cf5a038c68ff2bc0c092df0024cb9b1a209c03c73f076a57e" Oct 09 15:21:48 crc kubenswrapper[4719]: E1009 15:21:48.152397 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32ff36211d45cf9cf5a038c68ff2bc0c092df0024cb9b1a209c03c73f076a57e\": container with ID starting with 32ff36211d45cf9cf5a038c68ff2bc0c092df0024cb9b1a209c03c73f076a57e not found: ID does not exist" containerID="32ff36211d45cf9cf5a038c68ff2bc0c092df0024cb9b1a209c03c73f076a57e" Oct 09 15:21:48 crc kubenswrapper[4719]: I1009 15:21:48.152437 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32ff36211d45cf9cf5a038c68ff2bc0c092df0024cb9b1a209c03c73f076a57e"} err="failed to get container status \"32ff36211d45cf9cf5a038c68ff2bc0c092df0024cb9b1a209c03c73f076a57e\": rpc error: code = NotFound desc = could not find container \"32ff36211d45cf9cf5a038c68ff2bc0c092df0024cb9b1a209c03c73f076a57e\": container with ID starting with 32ff36211d45cf9cf5a038c68ff2bc0c092df0024cb9b1a209c03c73f076a57e not found: ID does not exist" Oct 09 15:21:48 crc kubenswrapper[4719]: I1009 15:21:48.152464 4719 scope.go:117] "RemoveContainer" containerID="611ba69f67f3b6eeb3f4549747aa5927ed93f01e6790af84e756d8cd2935ea69" Oct 09 15:21:48 crc kubenswrapper[4719]: E1009 15:21:48.152886 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"611ba69f67f3b6eeb3f4549747aa5927ed93f01e6790af84e756d8cd2935ea69\": container with ID starting with 
611ba69f67f3b6eeb3f4549747aa5927ed93f01e6790af84e756d8cd2935ea69 not found: ID does not exist" containerID="611ba69f67f3b6eeb3f4549747aa5927ed93f01e6790af84e756d8cd2935ea69" Oct 09 15:21:48 crc kubenswrapper[4719]: I1009 15:21:48.152908 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"611ba69f67f3b6eeb3f4549747aa5927ed93f01e6790af84e756d8cd2935ea69"} err="failed to get container status \"611ba69f67f3b6eeb3f4549747aa5927ed93f01e6790af84e756d8cd2935ea69\": rpc error: code = NotFound desc = could not find container \"611ba69f67f3b6eeb3f4549747aa5927ed93f01e6790af84e756d8cd2935ea69\": container with ID starting with 611ba69f67f3b6eeb3f4549747aa5927ed93f01e6790af84e756d8cd2935ea69 not found: ID does not exist" Oct 09 15:21:48 crc kubenswrapper[4719]: I1009 15:21:48.152921 4719 scope.go:117] "RemoveContainer" containerID="202fcfe5e5b21b7f8e8b23374809fb626392a26b3ee94912a17717b3cb138518" Oct 09 15:21:48 crc kubenswrapper[4719]: E1009 15:21:48.153140 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"202fcfe5e5b21b7f8e8b23374809fb626392a26b3ee94912a17717b3cb138518\": container with ID starting with 202fcfe5e5b21b7f8e8b23374809fb626392a26b3ee94912a17717b3cb138518 not found: ID does not exist" containerID="202fcfe5e5b21b7f8e8b23374809fb626392a26b3ee94912a17717b3cb138518" Oct 09 15:21:48 crc kubenswrapper[4719]: I1009 15:21:48.153161 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"202fcfe5e5b21b7f8e8b23374809fb626392a26b3ee94912a17717b3cb138518"} err="failed to get container status \"202fcfe5e5b21b7f8e8b23374809fb626392a26b3ee94912a17717b3cb138518\": rpc error: code = NotFound desc = could not find container \"202fcfe5e5b21b7f8e8b23374809fb626392a26b3ee94912a17717b3cb138518\": container with ID starting with 202fcfe5e5b21b7f8e8b23374809fb626392a26b3ee94912a17717b3cb138518 not found: ID does not 
exist" Oct 09 15:21:49 crc kubenswrapper[4719]: I1009 15:21:49.168735 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c3c0e14-9adc-47ec-af46-d0779a6c6e1d" path="/var/lib/kubelet/pods/7c3c0e14-9adc-47ec-af46-d0779a6c6e1d/volumes" Oct 09 15:21:49 crc kubenswrapper[4719]: I1009 15:21:49.169500 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6e7a7e4-2eff-468f-b764-b1b73c4285f4" path="/var/lib/kubelet/pods/b6e7a7e4-2eff-468f-b764-b1b73c4285f4/volumes" Oct 09 15:22:02 crc kubenswrapper[4719]: I1009 15:22:02.687107 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" podUID="2a4743d0-e646-45ab-a225-816c0d99246a" containerName="oauth-openshift" containerID="cri-o://368651fe1e7a4e823d7ef7fc1036b74d1ed08a186e6c6f7bbee4c752bb869142" gracePeriod=15 Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.084578 4719 generic.go:334] "Generic (PLEG): container finished" podID="2a4743d0-e646-45ab-a225-816c0d99246a" containerID="368651fe1e7a4e823d7ef7fc1036b74d1ed08a186e6c6f7bbee4c752bb869142" exitCode=0 Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.084924 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" event={"ID":"2a4743d0-e646-45ab-a225-816c0d99246a","Type":"ContainerDied","Data":"368651fe1e7a4e823d7ef7fc1036b74d1ed08a186e6c6f7bbee4c752bb869142"} Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.085078 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" event={"ID":"2a4743d0-e646-45ab-a225-816c0d99246a","Type":"ContainerDied","Data":"0d62ffa5d2b292d56e53efa5265a5a666660a9a2c862e3a12faf8ee2f5e7331c"} Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.085099 4719 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="0d62ffa5d2b292d56e53efa5265a5a666660a9a2c862e3a12faf8ee2f5e7331c" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.114744 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.143375 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-bb968f6ff-pl6xp"] Oct 09 15:22:03 crc kubenswrapper[4719]: E1009 15:22:03.143569 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c3c0e14-9adc-47ec-af46-d0779a6c6e1d" containerName="extract-utilities" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.143582 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c3c0e14-9adc-47ec-af46-d0779a6c6e1d" containerName="extract-utilities" Oct 09 15:22:03 crc kubenswrapper[4719]: E1009 15:22:03.143589 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6e7a7e4-2eff-468f-b764-b1b73c4285f4" containerName="registry-server" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.143595 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6e7a7e4-2eff-468f-b764-b1b73c4285f4" containerName="registry-server" Oct 09 15:22:03 crc kubenswrapper[4719]: E1009 15:22:03.143604 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6e7a7e4-2eff-468f-b764-b1b73c4285f4" containerName="extract-content" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.143610 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6e7a7e4-2eff-468f-b764-b1b73c4285f4" containerName="extract-content" Oct 09 15:22:03 crc kubenswrapper[4719]: E1009 15:22:03.143623 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c3c0e14-9adc-47ec-af46-d0779a6c6e1d" containerName="extract-content" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.143629 4719 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7c3c0e14-9adc-47ec-af46-d0779a6c6e1d" containerName="extract-content" Oct 09 15:22:03 crc kubenswrapper[4719]: E1009 15:22:03.143638 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6e7a7e4-2eff-468f-b764-b1b73c4285f4" containerName="extract-utilities" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.143643 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6e7a7e4-2eff-468f-b764-b1b73c4285f4" containerName="extract-utilities" Oct 09 15:22:03 crc kubenswrapper[4719]: E1009 15:22:03.143651 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54705293-788b-4c76-851e-2ff0563877ca" containerName="pruner" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.143666 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="54705293-788b-4c76-851e-2ff0563877ca" containerName="pruner" Oct 09 15:22:03 crc kubenswrapper[4719]: E1009 15:22:03.143677 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b211e254-37fb-4252-a758-4d9e781ad03d" containerName="extract-content" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.143684 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="b211e254-37fb-4252-a758-4d9e781ad03d" containerName="extract-content" Oct 09 15:22:03 crc kubenswrapper[4719]: E1009 15:22:03.143698 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34716fe0-1567-461c-9844-db7bce6942f6" containerName="pruner" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.143705 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="34716fe0-1567-461c-9844-db7bce6942f6" containerName="pruner" Oct 09 15:22:03 crc kubenswrapper[4719]: E1009 15:22:03.143715 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c3c0e14-9adc-47ec-af46-d0779a6c6e1d" containerName="registry-server" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.143721 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c3c0e14-9adc-47ec-af46-d0779a6c6e1d" 
containerName="registry-server" Oct 09 15:22:03 crc kubenswrapper[4719]: E1009 15:22:03.143731 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b211e254-37fb-4252-a758-4d9e781ad03d" containerName="registry-server" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.143737 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="b211e254-37fb-4252-a758-4d9e781ad03d" containerName="registry-server" Oct 09 15:22:03 crc kubenswrapper[4719]: E1009 15:22:03.143746 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a4743d0-e646-45ab-a225-816c0d99246a" containerName="oauth-openshift" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.143754 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a4743d0-e646-45ab-a225-816c0d99246a" containerName="oauth-openshift" Oct 09 15:22:03 crc kubenswrapper[4719]: E1009 15:22:03.143764 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9351b8b6-df12-4eee-9e51-ebc70c9da6bd" containerName="extract-content" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.143773 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="9351b8b6-df12-4eee-9e51-ebc70c9da6bd" containerName="extract-content" Oct 09 15:22:03 crc kubenswrapper[4719]: E1009 15:22:03.143785 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24507f61-1a02-438c-b1ca-82515867e605" containerName="collect-profiles" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.143794 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="24507f61-1a02-438c-b1ca-82515867e605" containerName="collect-profiles" Oct 09 15:22:03 crc kubenswrapper[4719]: E1009 15:22:03.143805 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9351b8b6-df12-4eee-9e51-ebc70c9da6bd" containerName="registry-server" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.143813 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="9351b8b6-df12-4eee-9e51-ebc70c9da6bd" 
containerName="registry-server" Oct 09 15:22:03 crc kubenswrapper[4719]: E1009 15:22:03.143825 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b211e254-37fb-4252-a758-4d9e781ad03d" containerName="extract-utilities" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.143833 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="b211e254-37fb-4252-a758-4d9e781ad03d" containerName="extract-utilities" Oct 09 15:22:03 crc kubenswrapper[4719]: E1009 15:22:03.143842 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9351b8b6-df12-4eee-9e51-ebc70c9da6bd" containerName="extract-utilities" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.143850 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="9351b8b6-df12-4eee-9e51-ebc70c9da6bd" containerName="extract-utilities" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.143954 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="54705293-788b-4c76-851e-2ff0563877ca" containerName="pruner" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.143966 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="34716fe0-1567-461c-9844-db7bce6942f6" containerName="pruner" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.143974 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c3c0e14-9adc-47ec-af46-d0779a6c6e1d" containerName="registry-server" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.143982 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="b211e254-37fb-4252-a758-4d9e781ad03d" containerName="registry-server" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.143991 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6e7a7e4-2eff-468f-b764-b1b73c4285f4" containerName="registry-server" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.143997 4719 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9351b8b6-df12-4eee-9e51-ebc70c9da6bd" containerName="registry-server" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.144004 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a4743d0-e646-45ab-a225-816c0d99246a" containerName="oauth-openshift" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.144010 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="24507f61-1a02-438c-b1ca-82515867e605" containerName="collect-profiles" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.144412 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.174325 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-bb968f6ff-pl6xp"] Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.191771 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-trusted-ca-bundle\") pod \"2a4743d0-e646-45ab-a225-816c0d99246a\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.191829 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-serving-cert\") pod \"2a4743d0-e646-45ab-a225-816c0d99246a\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.191865 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-session\") pod \"2a4743d0-e646-45ab-a225-816c0d99246a\" (UID: 
\"2a4743d0-e646-45ab-a225-816c0d99246a\") " Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.191898 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-user-template-provider-selection\") pod \"2a4743d0-e646-45ab-a225-816c0d99246a\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.191929 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2a4743d0-e646-45ab-a225-816c0d99246a-audit-dir\") pod \"2a4743d0-e646-45ab-a225-816c0d99246a\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.191973 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsdfm\" (UniqueName: \"kubernetes.io/projected/2a4743d0-e646-45ab-a225-816c0d99246a-kube-api-access-vsdfm\") pod \"2a4743d0-e646-45ab-a225-816c0d99246a\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.191999 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-ocp-branding-template\") pod \"2a4743d0-e646-45ab-a225-816c0d99246a\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.192044 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-cliconfig\") pod \"2a4743d0-e646-45ab-a225-816c0d99246a\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 
15:22:03.192053 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2a4743d0-e646-45ab-a225-816c0d99246a-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "2a4743d0-e646-45ab-a225-816c0d99246a" (UID: "2a4743d0-e646-45ab-a225-816c0d99246a"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.192071 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-user-template-error\") pod \"2a4743d0-e646-45ab-a225-816c0d99246a\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.192173 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-user-template-login\") pod \"2a4743d0-e646-45ab-a225-816c0d99246a\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.192202 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-service-ca\") pod \"2a4743d0-e646-45ab-a225-816c0d99246a\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.192222 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2a4743d0-e646-45ab-a225-816c0d99246a-audit-policies\") pod \"2a4743d0-e646-45ab-a225-816c0d99246a\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.192246 4719 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-user-idp-0-file-data\") pod \"2a4743d0-e646-45ab-a225-816c0d99246a\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.192270 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-router-certs\") pod \"2a4743d0-e646-45ab-a225-816c0d99246a\" (UID: \"2a4743d0-e646-45ab-a225-816c0d99246a\") " Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.192484 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-v4-0-config-system-service-ca\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.192533 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-audit-dir\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.192553 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-v4-0-config-system-session\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " 
pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.192587 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-audit-policies\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.192642 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.192677 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-v4-0-config-user-template-login\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.192754 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzj6q\" (UniqueName: \"kubernetes.io/projected/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-kube-api-access-bzj6q\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.192789 4719 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-v4-0-config-system-router-certs\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.192805 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.192824 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-v4-0-config-user-template-error\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.192843 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-v4-0-config-system-cliconfig\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.192858 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-v4-0-config-system-serving-cert\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.192874 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.192896 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.192952 4719 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2a4743d0-e646-45ab-a225-816c0d99246a-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.192946 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "2a4743d0-e646-45ab-a225-816c0d99246a" (UID: "2a4743d0-e646-45ab-a225-816c0d99246a"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.193397 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "2a4743d0-e646-45ab-a225-816c0d99246a" (UID: "2a4743d0-e646-45ab-a225-816c0d99246a"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.193720 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a4743d0-e646-45ab-a225-816c0d99246a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "2a4743d0-e646-45ab-a225-816c0d99246a" (UID: "2a4743d0-e646-45ab-a225-816c0d99246a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.194612 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "2a4743d0-e646-45ab-a225-816c0d99246a" (UID: "2a4743d0-e646-45ab-a225-816c0d99246a"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.198452 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "2a4743d0-e646-45ab-a225-816c0d99246a" (UID: "2a4743d0-e646-45ab-a225-816c0d99246a"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.198494 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a4743d0-e646-45ab-a225-816c0d99246a-kube-api-access-vsdfm" (OuterVolumeSpecName: "kube-api-access-vsdfm") pod "2a4743d0-e646-45ab-a225-816c0d99246a" (UID: "2a4743d0-e646-45ab-a225-816c0d99246a"). InnerVolumeSpecName "kube-api-access-vsdfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.199275 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "2a4743d0-e646-45ab-a225-816c0d99246a" (UID: "2a4743d0-e646-45ab-a225-816c0d99246a"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.199734 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "2a4743d0-e646-45ab-a225-816c0d99246a" (UID: "2a4743d0-e646-45ab-a225-816c0d99246a"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.200114 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "2a4743d0-e646-45ab-a225-816c0d99246a" (UID: "2a4743d0-e646-45ab-a225-816c0d99246a"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.201597 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "2a4743d0-e646-45ab-a225-816c0d99246a" (UID: "2a4743d0-e646-45ab-a225-816c0d99246a"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.207468 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "2a4743d0-e646-45ab-a225-816c0d99246a" (UID: "2a4743d0-e646-45ab-a225-816c0d99246a"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.207774 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "2a4743d0-e646-45ab-a225-816c0d99246a" (UID: "2a4743d0-e646-45ab-a225-816c0d99246a"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.207983 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "2a4743d0-e646-45ab-a225-816c0d99246a" (UID: "2a4743d0-e646-45ab-a225-816c0d99246a"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.294473 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzj6q\" (UniqueName: \"kubernetes.io/projected/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-kube-api-access-bzj6q\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.294527 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-v4-0-config-system-router-certs\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.294546 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.294565 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-v4-0-config-user-template-error\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.294584 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-v4-0-config-system-cliconfig\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.294603 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-v4-0-config-system-serving-cert\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.294618 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.294637 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.294665 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-v4-0-config-system-service-ca\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " 
pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.294685 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-audit-dir\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.294701 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-v4-0-config-system-session\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.294720 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-audit-policies\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.294744 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.294765 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-v4-0-config-user-template-login\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.294801 4719 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.294813 4719 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.294824 4719 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2a4743d0-e646-45ab-a225-816c0d99246a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.294833 4719 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.294843 4719 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.294852 4719 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-trusted-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.294861 4719 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.294872 4719 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.294882 4719 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.294891 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsdfm\" (UniqueName: \"kubernetes.io/projected/2a4743d0-e646-45ab-a225-816c0d99246a-kube-api-access-vsdfm\") on node \"crc\" DevicePath \"\"" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.294900 4719 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.294908 4719 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.294917 4719 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" 
(UniqueName: \"kubernetes.io/secret/2a4743d0-e646-45ab-a225-816c0d99246a-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.296085 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-v4-0-config-system-service-ca\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.296544 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-audit-policies\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.296761 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-v4-0-config-system-cliconfig\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.297084 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-audit-dir\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.298566 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-v4-0-config-system-router-certs\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.299081 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-v4-0-config-user-template-login\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.299151 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.299317 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.300148 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-v4-0-config-user-template-error\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " 
pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.300271 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.300665 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-v4-0-config-system-session\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.301166 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.305776 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-v4-0-config-system-serving-cert\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.311809 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzj6q\" (UniqueName: 
\"kubernetes.io/projected/1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62-kube-api-access-bzj6q\") pod \"oauth-openshift-bb968f6ff-pl6xp\" (UID: \"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62\") " pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.474422 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:03 crc kubenswrapper[4719]: I1009 15:22:03.912968 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-bb968f6ff-pl6xp"] Oct 09 15:22:04 crc kubenswrapper[4719]: I1009 15:22:04.091116 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" event={"ID":"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62","Type":"ContainerStarted","Data":"c3b5e9db5cceb378e304b650249ad28a0d40f947239e5720be6105ae4f4996b3"} Oct 09 15:22:04 crc kubenswrapper[4719]: I1009 15:22:04.091143 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-db9tz" Oct 09 15:22:04 crc kubenswrapper[4719]: I1009 15:22:04.118730 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-db9tz"] Oct 09 15:22:04 crc kubenswrapper[4719]: I1009 15:22:04.122084 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-db9tz"] Oct 09 15:22:05 crc kubenswrapper[4719]: I1009 15:22:05.097742 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" event={"ID":"1e49b0ec-c0a7-4f9b-9a64-8ff61522dc62","Type":"ContainerStarted","Data":"421ade455170a73d5eee801703c03705b01d8e344fa132dca54f09da6d4cbd50"} Oct 09 15:22:05 crc kubenswrapper[4719]: I1009 15:22:05.098711 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:05 crc kubenswrapper[4719]: I1009 15:22:05.104925 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" Oct 09 15:22:05 crc kubenswrapper[4719]: I1009 15:22:05.146610 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-bb968f6ff-pl6xp" podStartSLOduration=28.146568785 podStartE2EDuration="28.146568785s" podCreationTimestamp="2025-10-09 15:21:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:22:05.140536391 +0000 UTC m=+230.650247686" watchObservedRunningTime="2025-10-09 15:22:05.146568785 +0000 UTC m=+230.656280110" Oct 09 15:22:05 crc kubenswrapper[4719]: I1009 15:22:05.171184 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a4743d0-e646-45ab-a225-816c0d99246a" 
path="/var/lib/kubelet/pods/2a4743d0-e646-45ab-a225-816c0d99246a/volumes" Oct 09 15:22:15 crc kubenswrapper[4719]: I1009 15:22:15.612084 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-458dz"] Oct 09 15:22:15 crc kubenswrapper[4719]: I1009 15:22:15.612913 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-458dz" podUID="08d47ff5-80a6-4395-8481-2e7f1c2c1409" containerName="registry-server" containerID="cri-o://612d3eced1905fbcd994b1a686a5888fbbbf3f88233a858114ba4a4c5d88658a" gracePeriod=30 Oct 09 15:22:15 crc kubenswrapper[4719]: I1009 15:22:15.617416 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8726g"] Oct 09 15:22:15 crc kubenswrapper[4719]: I1009 15:22:15.617642 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8726g" podUID="ed97f513-40b6-4273-b6a5-9f9f5150e4cd" containerName="registry-server" containerID="cri-o://3a5e42928051c0b346e579953af6446026583140155d9a9da6bd8476093e7c75" gracePeriod=30 Oct 09 15:22:15 crc kubenswrapper[4719]: I1009 15:22:15.629934 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dwt2p"] Oct 09 15:22:15 crc kubenswrapper[4719]: I1009 15:22:15.630137 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-dwt2p" podUID="2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7" containerName="marketplace-operator" containerID="cri-o://06db45c647b85950356f4849b194448ab307dd055ace46838c4317b0ed9d9479" gracePeriod=30 Oct 09 15:22:15 crc kubenswrapper[4719]: I1009 15:22:15.637989 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nzwhh"] Oct 09 15:22:15 crc kubenswrapper[4719]: I1009 15:22:15.638263 4719 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nzwhh" podUID="84e55955-c37c-4897-ab18-f71812f3ccff" containerName="registry-server" containerID="cri-o://adc204002ebc9268cee537b1aeb8b8104b65f70ed4d636a9b367993ea02baa12" gracePeriod=30 Oct 09 15:22:15 crc kubenswrapper[4719]: I1009 15:22:15.653510 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jrqj4"] Oct 09 15:22:15 crc kubenswrapper[4719]: I1009 15:22:15.653766 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jrqj4" podUID="f662f8ad-fe9b-40c9-845e-8f2749a6482d" containerName="registry-server" containerID="cri-o://eb8deadd96a6798c61c0933c513d1ca3d045a55a795f8752f92c1952d5c3b99e" gracePeriod=30 Oct 09 15:22:15 crc kubenswrapper[4719]: I1009 15:22:15.658153 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gj4pz"] Oct 09 15:22:15 crc kubenswrapper[4719]: I1009 15:22:15.658946 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gj4pz" Oct 09 15:22:15 crc kubenswrapper[4719]: I1009 15:22:15.660421 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gj4pz"] Oct 09 15:22:15 crc kubenswrapper[4719]: E1009 15:22:15.730074 4719 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eb8deadd96a6798c61c0933c513d1ca3d045a55a795f8752f92c1952d5c3b99e is running failed: container process not found" containerID="eb8deadd96a6798c61c0933c513d1ca3d045a55a795f8752f92c1952d5c3b99e" cmd=["grpc_health_probe","-addr=:50051"] Oct 09 15:22:15 crc kubenswrapper[4719]: E1009 15:22:15.730529 4719 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eb8deadd96a6798c61c0933c513d1ca3d045a55a795f8752f92c1952d5c3b99e is running failed: container process not found" containerID="eb8deadd96a6798c61c0933c513d1ca3d045a55a795f8752f92c1952d5c3b99e" cmd=["grpc_health_probe","-addr=:50051"] Oct 09 15:22:15 crc kubenswrapper[4719]: E1009 15:22:15.731167 4719 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eb8deadd96a6798c61c0933c513d1ca3d045a55a795f8752f92c1952d5c3b99e is running failed: container process not found" containerID="eb8deadd96a6798c61c0933c513d1ca3d045a55a795f8752f92c1952d5c3b99e" cmd=["grpc_health_probe","-addr=:50051"] Oct 09 15:22:15 crc kubenswrapper[4719]: E1009 15:22:15.731373 4719 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eb8deadd96a6798c61c0933c513d1ca3d045a55a795f8752f92c1952d5c3b99e is running failed: container process not found" probeType="Readiness" 
pod="openshift-marketplace/redhat-operators-jrqj4" podUID="f662f8ad-fe9b-40c9-845e-8f2749a6482d" containerName="registry-server" Oct 09 15:22:15 crc kubenswrapper[4719]: I1009 15:22:15.756823 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9080569c-497b-4281-a120-7c538380a16c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gj4pz\" (UID: \"9080569c-497b-4281-a120-7c538380a16c\") " pod="openshift-marketplace/marketplace-operator-79b997595-gj4pz" Oct 09 15:22:15 crc kubenswrapper[4719]: I1009 15:22:15.757150 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qhnj\" (UniqueName: \"kubernetes.io/projected/9080569c-497b-4281-a120-7c538380a16c-kube-api-access-7qhnj\") pod \"marketplace-operator-79b997595-gj4pz\" (UID: \"9080569c-497b-4281-a120-7c538380a16c\") " pod="openshift-marketplace/marketplace-operator-79b997595-gj4pz" Oct 09 15:22:15 crc kubenswrapper[4719]: I1009 15:22:15.758549 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9080569c-497b-4281-a120-7c538380a16c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gj4pz\" (UID: \"9080569c-497b-4281-a120-7c538380a16c\") " pod="openshift-marketplace/marketplace-operator-79b997595-gj4pz" Oct 09 15:22:15 crc kubenswrapper[4719]: I1009 15:22:15.859627 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9080569c-497b-4281-a120-7c538380a16c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gj4pz\" (UID: \"9080569c-497b-4281-a120-7c538380a16c\") " pod="openshift-marketplace/marketplace-operator-79b997595-gj4pz" Oct 09 15:22:15 crc kubenswrapper[4719]: I1009 15:22:15.859680 4719 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9080569c-497b-4281-a120-7c538380a16c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gj4pz\" (UID: \"9080569c-497b-4281-a120-7c538380a16c\") " pod="openshift-marketplace/marketplace-operator-79b997595-gj4pz" Oct 09 15:22:15 crc kubenswrapper[4719]: I1009 15:22:15.859816 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qhnj\" (UniqueName: \"kubernetes.io/projected/9080569c-497b-4281-a120-7c538380a16c-kube-api-access-7qhnj\") pod \"marketplace-operator-79b997595-gj4pz\" (UID: \"9080569c-497b-4281-a120-7c538380a16c\") " pod="openshift-marketplace/marketplace-operator-79b997595-gj4pz" Oct 09 15:22:15 crc kubenswrapper[4719]: I1009 15:22:15.861338 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9080569c-497b-4281-a120-7c538380a16c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gj4pz\" (UID: \"9080569c-497b-4281-a120-7c538380a16c\") " pod="openshift-marketplace/marketplace-operator-79b997595-gj4pz" Oct 09 15:22:15 crc kubenswrapper[4719]: I1009 15:22:15.866653 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9080569c-497b-4281-a120-7c538380a16c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gj4pz\" (UID: \"9080569c-497b-4281-a120-7c538380a16c\") " pod="openshift-marketplace/marketplace-operator-79b997595-gj4pz" Oct 09 15:22:15 crc kubenswrapper[4719]: I1009 15:22:15.880476 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qhnj\" (UniqueName: \"kubernetes.io/projected/9080569c-497b-4281-a120-7c538380a16c-kube-api-access-7qhnj\") pod \"marketplace-operator-79b997595-gj4pz\" (UID: 
\"9080569c-497b-4281-a120-7c538380a16c\") " pod="openshift-marketplace/marketplace-operator-79b997595-gj4pz" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.046722 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gj4pz" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.050408 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-458dz" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.109600 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dwt2p" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.111980 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8726g" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.120691 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jrqj4" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.129394 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nzwhh" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.166541 4719 generic.go:334] "Generic (PLEG): container finished" podID="ed97f513-40b6-4273-b6a5-9f9f5150e4cd" containerID="3a5e42928051c0b346e579953af6446026583140155d9a9da6bd8476093e7c75" exitCode=0 Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.166947 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8726g" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.168149 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08d47ff5-80a6-4395-8481-2e7f1c2c1409-catalog-content\") pod \"08d47ff5-80a6-4395-8481-2e7f1c2c1409\" (UID: \"08d47ff5-80a6-4395-8481-2e7f1c2c1409\") " Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.168221 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08d47ff5-80a6-4395-8481-2e7f1c2c1409-utilities\") pod \"08d47ff5-80a6-4395-8481-2e7f1c2c1409\" (UID: \"08d47ff5-80a6-4395-8481-2e7f1c2c1409\") " Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.168263 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fz6xj\" (UniqueName: \"kubernetes.io/projected/08d47ff5-80a6-4395-8481-2e7f1c2c1409-kube-api-access-fz6xj\") pod \"08d47ff5-80a6-4395-8481-2e7f1c2c1409\" (UID: \"08d47ff5-80a6-4395-8481-2e7f1c2c1409\") " Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.170255 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8726g" event={"ID":"ed97f513-40b6-4273-b6a5-9f9f5150e4cd","Type":"ContainerDied","Data":"3a5e42928051c0b346e579953af6446026583140155d9a9da6bd8476093e7c75"} Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.170301 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8726g" event={"ID":"ed97f513-40b6-4273-b6a5-9f9f5150e4cd","Type":"ContainerDied","Data":"d9225dff3af8bc39d3903a75602bf4f012ea5889485a9b723131af923dff3e47"} Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.170330 4719 scope.go:117] "RemoveContainer" containerID="3a5e42928051c0b346e579953af6446026583140155d9a9da6bd8476093e7c75" Oct 09 15:22:16 crc 
kubenswrapper[4719]: I1009 15:22:16.175529 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08d47ff5-80a6-4395-8481-2e7f1c2c1409-utilities" (OuterVolumeSpecName: "utilities") pod "08d47ff5-80a6-4395-8481-2e7f1c2c1409" (UID: "08d47ff5-80a6-4395-8481-2e7f1c2c1409"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.177666 4719 generic.go:334] "Generic (PLEG): container finished" podID="08d47ff5-80a6-4395-8481-2e7f1c2c1409" containerID="612d3eced1905fbcd994b1a686a5888fbbbf3f88233a858114ba4a4c5d88658a" exitCode=0 Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.177764 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-458dz" event={"ID":"08d47ff5-80a6-4395-8481-2e7f1c2c1409","Type":"ContainerDied","Data":"612d3eced1905fbcd994b1a686a5888fbbbf3f88233a858114ba4a4c5d88658a"} Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.177790 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-458dz" event={"ID":"08d47ff5-80a6-4395-8481-2e7f1c2c1409","Type":"ContainerDied","Data":"ca09dd2c6d40653c165ba28617d1792e3e5a247980ce7e48efd1620b05ba2f0d"} Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.177876 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-458dz" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.179150 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08d47ff5-80a6-4395-8481-2e7f1c2c1409-kube-api-access-fz6xj" (OuterVolumeSpecName: "kube-api-access-fz6xj") pod "08d47ff5-80a6-4395-8481-2e7f1c2c1409" (UID: "08d47ff5-80a6-4395-8481-2e7f1c2c1409"). InnerVolumeSpecName "kube-api-access-fz6xj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.185480 4719 generic.go:334] "Generic (PLEG): container finished" podID="f662f8ad-fe9b-40c9-845e-8f2749a6482d" containerID="eb8deadd96a6798c61c0933c513d1ca3d045a55a795f8752f92c1952d5c3b99e" exitCode=0 Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.185557 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jrqj4" event={"ID":"f662f8ad-fe9b-40c9-845e-8f2749a6482d","Type":"ContainerDied","Data":"eb8deadd96a6798c61c0933c513d1ca3d045a55a795f8752f92c1952d5c3b99e"} Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.185589 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jrqj4" event={"ID":"f662f8ad-fe9b-40c9-845e-8f2749a6482d","Type":"ContainerDied","Data":"eab3c28380de162670e59acc1e072b0af4d19ffdb2d47fa91c0f14eded8f95bb"} Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.185661 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jrqj4" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.198153 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nzwhh" event={"ID":"84e55955-c37c-4897-ab18-f71812f3ccff","Type":"ContainerDied","Data":"adc204002ebc9268cee537b1aeb8b8104b65f70ed4d636a9b367993ea02baa12"} Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.198098 4719 generic.go:334] "Generic (PLEG): container finished" podID="84e55955-c37c-4897-ab18-f71812f3ccff" containerID="adc204002ebc9268cee537b1aeb8b8104b65f70ed4d636a9b367993ea02baa12" exitCode=0 Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.198187 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nzwhh" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.198490 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nzwhh" event={"ID":"84e55955-c37c-4897-ab18-f71812f3ccff","Type":"ContainerDied","Data":"0b93971f8539e1eba0fe3f949371b233f893a119298cbad93f62e993d56eeac1"} Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.202156 4719 scope.go:117] "RemoveContainer" containerID="cd87fad9adbd89f7c930c17c5ad320a2c9c102c74d806bf2d53871e524528dbb" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.215094 4719 generic.go:334] "Generic (PLEG): container finished" podID="2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7" containerID="06db45c647b85950356f4849b194448ab307dd055ace46838c4317b0ed9d9479" exitCode=0 Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.215106 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dwt2p" event={"ID":"2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7","Type":"ContainerDied","Data":"06db45c647b85950356f4849b194448ab307dd055ace46838c4317b0ed9d9479"} Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.216255 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dwt2p" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.216827 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dwt2p" event={"ID":"2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7","Type":"ContainerDied","Data":"9040c5f0f2fbdaf36169f27fb223bbd93aae6a616f4c52e7363530c3e8860be0"} Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.235822 4719 scope.go:117] "RemoveContainer" containerID="5c19562fc43a92244a4db189ca1ec2c82fc57f3b907c12c9e60537ce625f3e12" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.248042 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08d47ff5-80a6-4395-8481-2e7f1c2c1409-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "08d47ff5-80a6-4395-8481-2e7f1c2c1409" (UID: "08d47ff5-80a6-4395-8481-2e7f1c2c1409"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.262370 4719 scope.go:117] "RemoveContainer" containerID="3a5e42928051c0b346e579953af6446026583140155d9a9da6bd8476093e7c75" Oct 09 15:22:16 crc kubenswrapper[4719]: E1009 15:22:16.262932 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a5e42928051c0b346e579953af6446026583140155d9a9da6bd8476093e7c75\": container with ID starting with 3a5e42928051c0b346e579953af6446026583140155d9a9da6bd8476093e7c75 not found: ID does not exist" containerID="3a5e42928051c0b346e579953af6446026583140155d9a9da6bd8476093e7c75" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.262973 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a5e42928051c0b346e579953af6446026583140155d9a9da6bd8476093e7c75"} err="failed to get container status 
\"3a5e42928051c0b346e579953af6446026583140155d9a9da6bd8476093e7c75\": rpc error: code = NotFound desc = could not find container \"3a5e42928051c0b346e579953af6446026583140155d9a9da6bd8476093e7c75\": container with ID starting with 3a5e42928051c0b346e579953af6446026583140155d9a9da6bd8476093e7c75 not found: ID does not exist" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.262997 4719 scope.go:117] "RemoveContainer" containerID="cd87fad9adbd89f7c930c17c5ad320a2c9c102c74d806bf2d53871e524528dbb" Oct 09 15:22:16 crc kubenswrapper[4719]: E1009 15:22:16.263317 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd87fad9adbd89f7c930c17c5ad320a2c9c102c74d806bf2d53871e524528dbb\": container with ID starting with cd87fad9adbd89f7c930c17c5ad320a2c9c102c74d806bf2d53871e524528dbb not found: ID does not exist" containerID="cd87fad9adbd89f7c930c17c5ad320a2c9c102c74d806bf2d53871e524528dbb" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.263370 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd87fad9adbd89f7c930c17c5ad320a2c9c102c74d806bf2d53871e524528dbb"} err="failed to get container status \"cd87fad9adbd89f7c930c17c5ad320a2c9c102c74d806bf2d53871e524528dbb\": rpc error: code = NotFound desc = could not find container \"cd87fad9adbd89f7c930c17c5ad320a2c9c102c74d806bf2d53871e524528dbb\": container with ID starting with cd87fad9adbd89f7c930c17c5ad320a2c9c102c74d806bf2d53871e524528dbb not found: ID does not exist" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.263401 4719 scope.go:117] "RemoveContainer" containerID="5c19562fc43a92244a4db189ca1ec2c82fc57f3b907c12c9e60537ce625f3e12" Oct 09 15:22:16 crc kubenswrapper[4719]: E1009 15:22:16.263761 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5c19562fc43a92244a4db189ca1ec2c82fc57f3b907c12c9e60537ce625f3e12\": container with ID starting with 5c19562fc43a92244a4db189ca1ec2c82fc57f3b907c12c9e60537ce625f3e12 not found: ID does not exist" containerID="5c19562fc43a92244a4db189ca1ec2c82fc57f3b907c12c9e60537ce625f3e12" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.263780 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c19562fc43a92244a4db189ca1ec2c82fc57f3b907c12c9e60537ce625f3e12"} err="failed to get container status \"5c19562fc43a92244a4db189ca1ec2c82fc57f3b907c12c9e60537ce625f3e12\": rpc error: code = NotFound desc = could not find container \"5c19562fc43a92244a4db189ca1ec2c82fc57f3b907c12c9e60537ce625f3e12\": container with ID starting with 5c19562fc43a92244a4db189ca1ec2c82fc57f3b907c12c9e60537ce625f3e12 not found: ID does not exist" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.263791 4719 scope.go:117] "RemoveContainer" containerID="612d3eced1905fbcd994b1a686a5888fbbbf3f88233a858114ba4a4c5d88658a" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.270487 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8lwd\" (UniqueName: \"kubernetes.io/projected/f662f8ad-fe9b-40c9-845e-8f2749a6482d-kube-api-access-j8lwd\") pod \"f662f8ad-fe9b-40c9-845e-8f2749a6482d\" (UID: \"f662f8ad-fe9b-40c9-845e-8f2749a6482d\") " Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.270530 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed97f513-40b6-4273-b6a5-9f9f5150e4cd-catalog-content\") pod \"ed97f513-40b6-4273-b6a5-9f9f5150e4cd\" (UID: \"ed97f513-40b6-4273-b6a5-9f9f5150e4cd\") " Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.270552 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/f662f8ad-fe9b-40c9-845e-8f2749a6482d-utilities\") pod \"f662f8ad-fe9b-40c9-845e-8f2749a6482d\" (UID: \"f662f8ad-fe9b-40c9-845e-8f2749a6482d\") " Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.270569 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed97f513-40b6-4273-b6a5-9f9f5150e4cd-utilities\") pod \"ed97f513-40b6-4273-b6a5-9f9f5150e4cd\" (UID: \"ed97f513-40b6-4273-b6a5-9f9f5150e4cd\") " Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.270626 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkmtq\" (UniqueName: \"kubernetes.io/projected/ed97f513-40b6-4273-b6a5-9f9f5150e4cd-kube-api-access-fkmtq\") pod \"ed97f513-40b6-4273-b6a5-9f9f5150e4cd\" (UID: \"ed97f513-40b6-4273-b6a5-9f9f5150e4cd\") " Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.270646 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zhbb\" (UniqueName: \"kubernetes.io/projected/84e55955-c37c-4897-ab18-f71812f3ccff-kube-api-access-6zhbb\") pod \"84e55955-c37c-4897-ab18-f71812f3ccff\" (UID: \"84e55955-c37c-4897-ab18-f71812f3ccff\") " Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.270676 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7-marketplace-trusted-ca\") pod \"2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7\" (UID: \"2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7\") " Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.270696 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7flcr\" (UniqueName: \"kubernetes.io/projected/2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7-kube-api-access-7flcr\") pod \"2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7\" (UID: \"2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7\") " Oct 09 
15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.270715 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84e55955-c37c-4897-ab18-f71812f3ccff-utilities\") pod \"84e55955-c37c-4897-ab18-f71812f3ccff\" (UID: \"84e55955-c37c-4897-ab18-f71812f3ccff\") " Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.270729 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84e55955-c37c-4897-ab18-f71812f3ccff-catalog-content\") pod \"84e55955-c37c-4897-ab18-f71812f3ccff\" (UID: \"84e55955-c37c-4897-ab18-f71812f3ccff\") " Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.270767 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7-marketplace-operator-metrics\") pod \"2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7\" (UID: \"2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7\") " Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.270792 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f662f8ad-fe9b-40c9-845e-8f2749a6482d-catalog-content\") pod \"f662f8ad-fe9b-40c9-845e-8f2749a6482d\" (UID: \"f662f8ad-fe9b-40c9-845e-8f2749a6482d\") " Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.271131 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08d47ff5-80a6-4395-8481-2e7f1c2c1409-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.271144 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fz6xj\" (UniqueName: \"kubernetes.io/projected/08d47ff5-80a6-4395-8481-2e7f1c2c1409-kube-api-access-fz6xj\") on node \"crc\" DevicePath \"\"" Oct 09 15:22:16 crc 
kubenswrapper[4719]: I1009 15:22:16.271154 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08d47ff5-80a6-4395-8481-2e7f1c2c1409-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.271669 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7" (UID: "2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.272524 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f662f8ad-fe9b-40c9-845e-8f2749a6482d-utilities" (OuterVolumeSpecName: "utilities") pod "f662f8ad-fe9b-40c9-845e-8f2749a6482d" (UID: "f662f8ad-fe9b-40c9-845e-8f2749a6482d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.274618 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed97f513-40b6-4273-b6a5-9f9f5150e4cd-utilities" (OuterVolumeSpecName: "utilities") pod "ed97f513-40b6-4273-b6a5-9f9f5150e4cd" (UID: "ed97f513-40b6-4273-b6a5-9f9f5150e4cd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.276388 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84e55955-c37c-4897-ab18-f71812f3ccff-utilities" (OuterVolumeSpecName: "utilities") pod "84e55955-c37c-4897-ab18-f71812f3ccff" (UID: "84e55955-c37c-4897-ab18-f71812f3ccff"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.277659 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed97f513-40b6-4273-b6a5-9f9f5150e4cd-kube-api-access-fkmtq" (OuterVolumeSpecName: "kube-api-access-fkmtq") pod "ed97f513-40b6-4273-b6a5-9f9f5150e4cd" (UID: "ed97f513-40b6-4273-b6a5-9f9f5150e4cd"). InnerVolumeSpecName "kube-api-access-fkmtq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.278402 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7" (UID: "2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.292635 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84e55955-c37c-4897-ab18-f71812f3ccff-kube-api-access-6zhbb" (OuterVolumeSpecName: "kube-api-access-6zhbb") pod "84e55955-c37c-4897-ab18-f71812f3ccff" (UID: "84e55955-c37c-4897-ab18-f71812f3ccff"). InnerVolumeSpecName "kube-api-access-6zhbb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.292790 4719 scope.go:117] "RemoveContainer" containerID="94567adc6b6d9802690c7ad5a5954f5d8baf4e91277c4077781542ca0f86219e" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.294028 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f662f8ad-fe9b-40c9-845e-8f2749a6482d-kube-api-access-j8lwd" (OuterVolumeSpecName: "kube-api-access-j8lwd") pod "f662f8ad-fe9b-40c9-845e-8f2749a6482d" (UID: "f662f8ad-fe9b-40c9-845e-8f2749a6482d"). InnerVolumeSpecName "kube-api-access-j8lwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.295907 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7-kube-api-access-7flcr" (OuterVolumeSpecName: "kube-api-access-7flcr") pod "2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7" (UID: "2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7"). InnerVolumeSpecName "kube-api-access-7flcr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.296716 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84e55955-c37c-4897-ab18-f71812f3ccff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "84e55955-c37c-4897-ab18-f71812f3ccff" (UID: "84e55955-c37c-4897-ab18-f71812f3ccff"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.308883 4719 scope.go:117] "RemoveContainer" containerID="77437c01f6a56cdf406d96b76e103b566616312482bd0c803ef940280a4659a8" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.328065 4719 scope.go:117] "RemoveContainer" containerID="612d3eced1905fbcd994b1a686a5888fbbbf3f88233a858114ba4a4c5d88658a" Oct 09 15:22:16 crc kubenswrapper[4719]: E1009 15:22:16.328625 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"612d3eced1905fbcd994b1a686a5888fbbbf3f88233a858114ba4a4c5d88658a\": container with ID starting with 612d3eced1905fbcd994b1a686a5888fbbbf3f88233a858114ba4a4c5d88658a not found: ID does not exist" containerID="612d3eced1905fbcd994b1a686a5888fbbbf3f88233a858114ba4a4c5d88658a" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.328681 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"612d3eced1905fbcd994b1a686a5888fbbbf3f88233a858114ba4a4c5d88658a"} err="failed to get container status \"612d3eced1905fbcd994b1a686a5888fbbbf3f88233a858114ba4a4c5d88658a\": rpc error: code = NotFound desc = could not find container \"612d3eced1905fbcd994b1a686a5888fbbbf3f88233a858114ba4a4c5d88658a\": container with ID starting with 612d3eced1905fbcd994b1a686a5888fbbbf3f88233a858114ba4a4c5d88658a not found: ID does not exist" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.328722 4719 scope.go:117] "RemoveContainer" containerID="94567adc6b6d9802690c7ad5a5954f5d8baf4e91277c4077781542ca0f86219e" Oct 09 15:22:16 crc kubenswrapper[4719]: E1009 15:22:16.329033 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94567adc6b6d9802690c7ad5a5954f5d8baf4e91277c4077781542ca0f86219e\": container with ID starting with 
94567adc6b6d9802690c7ad5a5954f5d8baf4e91277c4077781542ca0f86219e not found: ID does not exist" containerID="94567adc6b6d9802690c7ad5a5954f5d8baf4e91277c4077781542ca0f86219e" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.329058 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94567adc6b6d9802690c7ad5a5954f5d8baf4e91277c4077781542ca0f86219e"} err="failed to get container status \"94567adc6b6d9802690c7ad5a5954f5d8baf4e91277c4077781542ca0f86219e\": rpc error: code = NotFound desc = could not find container \"94567adc6b6d9802690c7ad5a5954f5d8baf4e91277c4077781542ca0f86219e\": container with ID starting with 94567adc6b6d9802690c7ad5a5954f5d8baf4e91277c4077781542ca0f86219e not found: ID does not exist" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.329074 4719 scope.go:117] "RemoveContainer" containerID="77437c01f6a56cdf406d96b76e103b566616312482bd0c803ef940280a4659a8" Oct 09 15:22:16 crc kubenswrapper[4719]: E1009 15:22:16.329739 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77437c01f6a56cdf406d96b76e103b566616312482bd0c803ef940280a4659a8\": container with ID starting with 77437c01f6a56cdf406d96b76e103b566616312482bd0c803ef940280a4659a8 not found: ID does not exist" containerID="77437c01f6a56cdf406d96b76e103b566616312482bd0c803ef940280a4659a8" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.329767 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77437c01f6a56cdf406d96b76e103b566616312482bd0c803ef940280a4659a8"} err="failed to get container status \"77437c01f6a56cdf406d96b76e103b566616312482bd0c803ef940280a4659a8\": rpc error: code = NotFound desc = could not find container \"77437c01f6a56cdf406d96b76e103b566616312482bd0c803ef940280a4659a8\": container with ID starting with 77437c01f6a56cdf406d96b76e103b566616312482bd0c803ef940280a4659a8 not found: ID does not 
exist" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.329785 4719 scope.go:117] "RemoveContainer" containerID="eb8deadd96a6798c61c0933c513d1ca3d045a55a795f8752f92c1952d5c3b99e" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.337637 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed97f513-40b6-4273-b6a5-9f9f5150e4cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ed97f513-40b6-4273-b6a5-9f9f5150e4cd" (UID: "ed97f513-40b6-4273-b6a5-9f9f5150e4cd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.343129 4719 scope.go:117] "RemoveContainer" containerID="ce55e4ace64cf0fbddbf1d92bb507aa29f2e15c65403636b84bc5dca3aab536d" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.355787 4719 scope.go:117] "RemoveContainer" containerID="5e6dcdee156e16599ba7119a35ec28ad5884c765620c4dc15220447b0524863f" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.369578 4719 scope.go:117] "RemoveContainer" containerID="eb8deadd96a6798c61c0933c513d1ca3d045a55a795f8752f92c1952d5c3b99e" Oct 09 15:22:16 crc kubenswrapper[4719]: E1009 15:22:16.370494 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb8deadd96a6798c61c0933c513d1ca3d045a55a795f8752f92c1952d5c3b99e\": container with ID starting with eb8deadd96a6798c61c0933c513d1ca3d045a55a795f8752f92c1952d5c3b99e not found: ID does not exist" containerID="eb8deadd96a6798c61c0933c513d1ca3d045a55a795f8752f92c1952d5c3b99e" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.370535 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb8deadd96a6798c61c0933c513d1ca3d045a55a795f8752f92c1952d5c3b99e"} err="failed to get container status \"eb8deadd96a6798c61c0933c513d1ca3d045a55a795f8752f92c1952d5c3b99e\": rpc error: code = NotFound 
desc = could not find container \"eb8deadd96a6798c61c0933c513d1ca3d045a55a795f8752f92c1952d5c3b99e\": container with ID starting with eb8deadd96a6798c61c0933c513d1ca3d045a55a795f8752f92c1952d5c3b99e not found: ID does not exist" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.370564 4719 scope.go:117] "RemoveContainer" containerID="ce55e4ace64cf0fbddbf1d92bb507aa29f2e15c65403636b84bc5dca3aab536d" Oct 09 15:22:16 crc kubenswrapper[4719]: E1009 15:22:16.370994 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce55e4ace64cf0fbddbf1d92bb507aa29f2e15c65403636b84bc5dca3aab536d\": container with ID starting with ce55e4ace64cf0fbddbf1d92bb507aa29f2e15c65403636b84bc5dca3aab536d not found: ID does not exist" containerID="ce55e4ace64cf0fbddbf1d92bb507aa29f2e15c65403636b84bc5dca3aab536d" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.371024 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce55e4ace64cf0fbddbf1d92bb507aa29f2e15c65403636b84bc5dca3aab536d"} err="failed to get container status \"ce55e4ace64cf0fbddbf1d92bb507aa29f2e15c65403636b84bc5dca3aab536d\": rpc error: code = NotFound desc = could not find container \"ce55e4ace64cf0fbddbf1d92bb507aa29f2e15c65403636b84bc5dca3aab536d\": container with ID starting with ce55e4ace64cf0fbddbf1d92bb507aa29f2e15c65403636b84bc5dca3aab536d not found: ID does not exist" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.371047 4719 scope.go:117] "RemoveContainer" containerID="5e6dcdee156e16599ba7119a35ec28ad5884c765620c4dc15220447b0524863f" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.372130 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8lwd\" (UniqueName: \"kubernetes.io/projected/f662f8ad-fe9b-40c9-845e-8f2749a6482d-kube-api-access-j8lwd\") on node \"crc\" DevicePath \"\"" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.372494 4719 
reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed97f513-40b6-4273-b6a5-9f9f5150e4cd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.372555 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f662f8ad-fe9b-40c9-845e-8f2749a6482d-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.372570 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed97f513-40b6-4273-b6a5-9f9f5150e4cd-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.372586 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkmtq\" (UniqueName: \"kubernetes.io/projected/ed97f513-40b6-4273-b6a5-9f9f5150e4cd-kube-api-access-fkmtq\") on node \"crc\" DevicePath \"\"" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.372603 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zhbb\" (UniqueName: \"kubernetes.io/projected/84e55955-c37c-4897-ab18-f71812f3ccff-kube-api-access-6zhbb\") on node \"crc\" DevicePath \"\"" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.372634 4719 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.372650 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7flcr\" (UniqueName: \"kubernetes.io/projected/2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7-kube-api-access-7flcr\") on node \"crc\" DevicePath \"\"" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.372661 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/84e55955-c37c-4897-ab18-f71812f3ccff-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.372694 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84e55955-c37c-4897-ab18-f71812f3ccff-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.372714 4719 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 09 15:22:16 crc kubenswrapper[4719]: E1009 15:22:16.375508 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e6dcdee156e16599ba7119a35ec28ad5884c765620c4dc15220447b0524863f\": container with ID starting with 5e6dcdee156e16599ba7119a35ec28ad5884c765620c4dc15220447b0524863f not found: ID does not exist" containerID="5e6dcdee156e16599ba7119a35ec28ad5884c765620c4dc15220447b0524863f" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.375548 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e6dcdee156e16599ba7119a35ec28ad5884c765620c4dc15220447b0524863f"} err="failed to get container status \"5e6dcdee156e16599ba7119a35ec28ad5884c765620c4dc15220447b0524863f\": rpc error: code = NotFound desc = could not find container \"5e6dcdee156e16599ba7119a35ec28ad5884c765620c4dc15220447b0524863f\": container with ID starting with 5e6dcdee156e16599ba7119a35ec28ad5884c765620c4dc15220447b0524863f not found: ID does not exist" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.375574 4719 scope.go:117] "RemoveContainer" containerID="adc204002ebc9268cee537b1aeb8b8104b65f70ed4d636a9b367993ea02baa12" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.380083 4719 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f662f8ad-fe9b-40c9-845e-8f2749a6482d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f662f8ad-fe9b-40c9-845e-8f2749a6482d" (UID: "f662f8ad-fe9b-40c9-845e-8f2749a6482d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.387985 4719 scope.go:117] "RemoveContainer" containerID="144d2e281045c08d389c86154c7c4d11fcfc332bdb04e8eb4662ed80a07c38d2" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.400484 4719 scope.go:117] "RemoveContainer" containerID="abe54057dd1d529463de4a92df26291e1dd56cb065499bdc584985a25b277443" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.413227 4719 scope.go:117] "RemoveContainer" containerID="adc204002ebc9268cee537b1aeb8b8104b65f70ed4d636a9b367993ea02baa12" Oct 09 15:22:16 crc kubenswrapper[4719]: E1009 15:22:16.413615 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adc204002ebc9268cee537b1aeb8b8104b65f70ed4d636a9b367993ea02baa12\": container with ID starting with adc204002ebc9268cee537b1aeb8b8104b65f70ed4d636a9b367993ea02baa12 not found: ID does not exist" containerID="adc204002ebc9268cee537b1aeb8b8104b65f70ed4d636a9b367993ea02baa12" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.413658 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adc204002ebc9268cee537b1aeb8b8104b65f70ed4d636a9b367993ea02baa12"} err="failed to get container status \"adc204002ebc9268cee537b1aeb8b8104b65f70ed4d636a9b367993ea02baa12\": rpc error: code = NotFound desc = could not find container \"adc204002ebc9268cee537b1aeb8b8104b65f70ed4d636a9b367993ea02baa12\": container with ID starting with adc204002ebc9268cee537b1aeb8b8104b65f70ed4d636a9b367993ea02baa12 not found: ID does not exist" Oct 09 15:22:16 crc 
kubenswrapper[4719]: I1009 15:22:16.413697 4719 scope.go:117] "RemoveContainer" containerID="144d2e281045c08d389c86154c7c4d11fcfc332bdb04e8eb4662ed80a07c38d2" Oct 09 15:22:16 crc kubenswrapper[4719]: E1009 15:22:16.413968 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"144d2e281045c08d389c86154c7c4d11fcfc332bdb04e8eb4662ed80a07c38d2\": container with ID starting with 144d2e281045c08d389c86154c7c4d11fcfc332bdb04e8eb4662ed80a07c38d2 not found: ID does not exist" containerID="144d2e281045c08d389c86154c7c4d11fcfc332bdb04e8eb4662ed80a07c38d2" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.413998 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"144d2e281045c08d389c86154c7c4d11fcfc332bdb04e8eb4662ed80a07c38d2"} err="failed to get container status \"144d2e281045c08d389c86154c7c4d11fcfc332bdb04e8eb4662ed80a07c38d2\": rpc error: code = NotFound desc = could not find container \"144d2e281045c08d389c86154c7c4d11fcfc332bdb04e8eb4662ed80a07c38d2\": container with ID starting with 144d2e281045c08d389c86154c7c4d11fcfc332bdb04e8eb4662ed80a07c38d2 not found: ID does not exist" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.414012 4719 scope.go:117] "RemoveContainer" containerID="abe54057dd1d529463de4a92df26291e1dd56cb065499bdc584985a25b277443" Oct 09 15:22:16 crc kubenswrapper[4719]: E1009 15:22:16.414244 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abe54057dd1d529463de4a92df26291e1dd56cb065499bdc584985a25b277443\": container with ID starting with abe54057dd1d529463de4a92df26291e1dd56cb065499bdc584985a25b277443 not found: ID does not exist" containerID="abe54057dd1d529463de4a92df26291e1dd56cb065499bdc584985a25b277443" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.414283 4719 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"abe54057dd1d529463de4a92df26291e1dd56cb065499bdc584985a25b277443"} err="failed to get container status \"abe54057dd1d529463de4a92df26291e1dd56cb065499bdc584985a25b277443\": rpc error: code = NotFound desc = could not find container \"abe54057dd1d529463de4a92df26291e1dd56cb065499bdc584985a25b277443\": container with ID starting with abe54057dd1d529463de4a92df26291e1dd56cb065499bdc584985a25b277443 not found: ID does not exist" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.414310 4719 scope.go:117] "RemoveContainer" containerID="06db45c647b85950356f4849b194448ab307dd055ace46838c4317b0ed9d9479" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.427620 4719 scope.go:117] "RemoveContainer" containerID="06db45c647b85950356f4849b194448ab307dd055ace46838c4317b0ed9d9479" Oct 09 15:22:16 crc kubenswrapper[4719]: E1009 15:22:16.428100 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06db45c647b85950356f4849b194448ab307dd055ace46838c4317b0ed9d9479\": container with ID starting with 06db45c647b85950356f4849b194448ab307dd055ace46838c4317b0ed9d9479 not found: ID does not exist" containerID="06db45c647b85950356f4849b194448ab307dd055ace46838c4317b0ed9d9479" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.428150 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06db45c647b85950356f4849b194448ab307dd055ace46838c4317b0ed9d9479"} err="failed to get container status \"06db45c647b85950356f4849b194448ab307dd055ace46838c4317b0ed9d9479\": rpc error: code = NotFound desc = could not find container \"06db45c647b85950356f4849b194448ab307dd055ace46838c4317b0ed9d9479\": container with ID starting with 06db45c647b85950356f4849b194448ab307dd055ace46838c4317b0ed9d9479 not found: ID does not exist" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.474003 4719 reconciler_common.go:293] "Volume detached for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f662f8ad-fe9b-40c9-845e-8f2749a6482d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.486056 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gj4pz"] Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.498863 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8726g"] Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.501945 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8726g"] Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.539931 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nzwhh"] Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.546240 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nzwhh"] Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.555506 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jrqj4"] Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.562242 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jrqj4"] Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.565234 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-458dz"] Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.567960 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-458dz"] Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.572625 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dwt2p"] Oct 09 15:22:16 crc kubenswrapper[4719]: I1009 15:22:16.572683 4719 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dwt2p"] Oct 09 15:22:17 crc kubenswrapper[4719]: I1009 15:22:17.168110 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08d47ff5-80a6-4395-8481-2e7f1c2c1409" path="/var/lib/kubelet/pods/08d47ff5-80a6-4395-8481-2e7f1c2c1409/volumes" Oct 09 15:22:17 crc kubenswrapper[4719]: I1009 15:22:17.169498 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7" path="/var/lib/kubelet/pods/2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7/volumes" Oct 09 15:22:17 crc kubenswrapper[4719]: I1009 15:22:17.170089 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84e55955-c37c-4897-ab18-f71812f3ccff" path="/var/lib/kubelet/pods/84e55955-c37c-4897-ab18-f71812f3ccff/volumes" Oct 09 15:22:17 crc kubenswrapper[4719]: I1009 15:22:17.171340 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed97f513-40b6-4273-b6a5-9f9f5150e4cd" path="/var/lib/kubelet/pods/ed97f513-40b6-4273-b6a5-9f9f5150e4cd/volumes" Oct 09 15:22:17 crc kubenswrapper[4719]: I1009 15:22:17.172026 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f662f8ad-fe9b-40c9-845e-8f2749a6482d" path="/var/lib/kubelet/pods/f662f8ad-fe9b-40c9-845e-8f2749a6482d/volumes" Oct 09 15:22:17 crc kubenswrapper[4719]: I1009 15:22:17.225273 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gj4pz" event={"ID":"9080569c-497b-4281-a120-7c538380a16c","Type":"ContainerStarted","Data":"f3c569938f9ba8368478ccea0f41909ca620b7a9f521f4b7522e0b8d662773ff"} Oct 09 15:22:17 crc kubenswrapper[4719]: I1009 15:22:17.225618 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-gj4pz" Oct 09 15:22:17 crc kubenswrapper[4719]: I1009 15:22:17.225674 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-gj4pz" event={"ID":"9080569c-497b-4281-a120-7c538380a16c","Type":"ContainerStarted","Data":"100bd364b98523abfc554eeb350419177ed100e527578733e1694ab6c8b94736"} Oct 09 15:22:17 crc kubenswrapper[4719]: I1009 15:22:17.228229 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-gj4pz" Oct 09 15:22:17 crc kubenswrapper[4719]: I1009 15:22:17.243024 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-gj4pz" podStartSLOduration=2.243006267 podStartE2EDuration="2.243006267s" podCreationTimestamp="2025-10-09 15:22:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:22:17.241700127 +0000 UTC m=+242.751411412" watchObservedRunningTime="2025-10-09 15:22:17.243006267 +0000 UTC m=+242.752717552" Oct 09 15:22:17 crc kubenswrapper[4719]: I1009 15:22:17.826614 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8d5wd"] Oct 09 15:22:17 crc kubenswrapper[4719]: E1009 15:22:17.826854 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f662f8ad-fe9b-40c9-845e-8f2749a6482d" containerName="extract-content" Oct 09 15:22:17 crc kubenswrapper[4719]: I1009 15:22:17.826868 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="f662f8ad-fe9b-40c9-845e-8f2749a6482d" containerName="extract-content" Oct 09 15:22:17 crc kubenswrapper[4719]: E1009 15:22:17.826885 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84e55955-c37c-4897-ab18-f71812f3ccff" containerName="registry-server" Oct 09 15:22:17 crc kubenswrapper[4719]: I1009 15:22:17.826895 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="84e55955-c37c-4897-ab18-f71812f3ccff" containerName="registry-server" Oct 09 15:22:17 crc 
kubenswrapper[4719]: E1009 15:22:17.826906 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed97f513-40b6-4273-b6a5-9f9f5150e4cd" containerName="extract-content" Oct 09 15:22:17 crc kubenswrapper[4719]: I1009 15:22:17.826918 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed97f513-40b6-4273-b6a5-9f9f5150e4cd" containerName="extract-content" Oct 09 15:22:17 crc kubenswrapper[4719]: E1009 15:22:17.826958 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08d47ff5-80a6-4395-8481-2e7f1c2c1409" containerName="registry-server" Oct 09 15:22:17 crc kubenswrapper[4719]: I1009 15:22:17.826966 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="08d47ff5-80a6-4395-8481-2e7f1c2c1409" containerName="registry-server" Oct 09 15:22:17 crc kubenswrapper[4719]: E1009 15:22:17.826973 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f662f8ad-fe9b-40c9-845e-8f2749a6482d" containerName="extract-utilities" Oct 09 15:22:17 crc kubenswrapper[4719]: I1009 15:22:17.826980 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="f662f8ad-fe9b-40c9-845e-8f2749a6482d" containerName="extract-utilities" Oct 09 15:22:17 crc kubenswrapper[4719]: E1009 15:22:17.826995 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08d47ff5-80a6-4395-8481-2e7f1c2c1409" containerName="extract-content" Oct 09 15:22:17 crc kubenswrapper[4719]: I1009 15:22:17.827005 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="08d47ff5-80a6-4395-8481-2e7f1c2c1409" containerName="extract-content" Oct 09 15:22:17 crc kubenswrapper[4719]: E1009 15:22:17.827018 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84e55955-c37c-4897-ab18-f71812f3ccff" containerName="extract-content" Oct 09 15:22:17 crc kubenswrapper[4719]: I1009 15:22:17.827034 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="84e55955-c37c-4897-ab18-f71812f3ccff" containerName="extract-content" Oct 09 15:22:17 crc 
kubenswrapper[4719]: E1009 15:22:17.827050 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f662f8ad-fe9b-40c9-845e-8f2749a6482d" containerName="registry-server" Oct 09 15:22:17 crc kubenswrapper[4719]: I1009 15:22:17.827067 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="f662f8ad-fe9b-40c9-845e-8f2749a6482d" containerName="registry-server" Oct 09 15:22:17 crc kubenswrapper[4719]: E1009 15:22:17.827085 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7" containerName="marketplace-operator" Oct 09 15:22:17 crc kubenswrapper[4719]: I1009 15:22:17.827097 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7" containerName="marketplace-operator" Oct 09 15:22:17 crc kubenswrapper[4719]: E1009 15:22:17.827113 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84e55955-c37c-4897-ab18-f71812f3ccff" containerName="extract-utilities" Oct 09 15:22:17 crc kubenswrapper[4719]: I1009 15:22:17.827127 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="84e55955-c37c-4897-ab18-f71812f3ccff" containerName="extract-utilities" Oct 09 15:22:17 crc kubenswrapper[4719]: E1009 15:22:17.827145 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed97f513-40b6-4273-b6a5-9f9f5150e4cd" containerName="registry-server" Oct 09 15:22:17 crc kubenswrapper[4719]: I1009 15:22:17.827216 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed97f513-40b6-4273-b6a5-9f9f5150e4cd" containerName="registry-server" Oct 09 15:22:17 crc kubenswrapper[4719]: E1009 15:22:17.827234 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08d47ff5-80a6-4395-8481-2e7f1c2c1409" containerName="extract-utilities" Oct 09 15:22:17 crc kubenswrapper[4719]: I1009 15:22:17.827248 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="08d47ff5-80a6-4395-8481-2e7f1c2c1409" containerName="extract-utilities" Oct 09 15:22:17 
crc kubenswrapper[4719]: E1009 15:22:17.827265 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed97f513-40b6-4273-b6a5-9f9f5150e4cd" containerName="extract-utilities" Oct 09 15:22:17 crc kubenswrapper[4719]: I1009 15:22:17.827332 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed97f513-40b6-4273-b6a5-9f9f5150e4cd" containerName="extract-utilities" Oct 09 15:22:17 crc kubenswrapper[4719]: I1009 15:22:17.831537 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed97f513-40b6-4273-b6a5-9f9f5150e4cd" containerName="registry-server" Oct 09 15:22:17 crc kubenswrapper[4719]: I1009 15:22:17.831696 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="08d47ff5-80a6-4395-8481-2e7f1c2c1409" containerName="registry-server" Oct 09 15:22:17 crc kubenswrapper[4719]: I1009 15:22:17.831750 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b1a8c7a-d66f-45fe-b870-5c0f38b38fc7" containerName="marketplace-operator" Oct 09 15:22:17 crc kubenswrapper[4719]: I1009 15:22:17.831764 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="f662f8ad-fe9b-40c9-845e-8f2749a6482d" containerName="registry-server" Oct 09 15:22:17 crc kubenswrapper[4719]: I1009 15:22:17.831836 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="84e55955-c37c-4897-ab18-f71812f3ccff" containerName="registry-server" Oct 09 15:22:17 crc kubenswrapper[4719]: I1009 15:22:17.835336 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8d5wd" Oct 09 15:22:17 crc kubenswrapper[4719]: I1009 15:22:17.845453 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 09 15:22:17 crc kubenswrapper[4719]: I1009 15:22:17.847736 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8d5wd"] Oct 09 15:22:17 crc kubenswrapper[4719]: I1009 15:22:17.992706 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4c82774-ac3f-4330-b575-1cfb72f5dbf7-utilities\") pod \"redhat-marketplace-8d5wd\" (UID: \"f4c82774-ac3f-4330-b575-1cfb72f5dbf7\") " pod="openshift-marketplace/redhat-marketplace-8d5wd" Oct 09 15:22:17 crc kubenswrapper[4719]: I1009 15:22:17.992798 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4c82774-ac3f-4330-b575-1cfb72f5dbf7-catalog-content\") pod \"redhat-marketplace-8d5wd\" (UID: \"f4c82774-ac3f-4330-b575-1cfb72f5dbf7\") " pod="openshift-marketplace/redhat-marketplace-8d5wd" Oct 09 15:22:17 crc kubenswrapper[4719]: I1009 15:22:17.992827 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4brl\" (UniqueName: \"kubernetes.io/projected/f4c82774-ac3f-4330-b575-1cfb72f5dbf7-kube-api-access-c4brl\") pod \"redhat-marketplace-8d5wd\" (UID: \"f4c82774-ac3f-4330-b575-1cfb72f5dbf7\") " pod="openshift-marketplace/redhat-marketplace-8d5wd" Oct 09 15:22:18 crc kubenswrapper[4719]: I1009 15:22:18.026309 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zjkt4"] Oct 09 15:22:18 crc kubenswrapper[4719]: I1009 15:22:18.028607 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zjkt4" Oct 09 15:22:18 crc kubenswrapper[4719]: I1009 15:22:18.030480 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 09 15:22:18 crc kubenswrapper[4719]: I1009 15:22:18.037633 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zjkt4"] Oct 09 15:22:18 crc kubenswrapper[4719]: I1009 15:22:18.093533 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4c82774-ac3f-4330-b575-1cfb72f5dbf7-catalog-content\") pod \"redhat-marketplace-8d5wd\" (UID: \"f4c82774-ac3f-4330-b575-1cfb72f5dbf7\") " pod="openshift-marketplace/redhat-marketplace-8d5wd" Oct 09 15:22:18 crc kubenswrapper[4719]: I1009 15:22:18.093614 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4brl\" (UniqueName: \"kubernetes.io/projected/f4c82774-ac3f-4330-b575-1cfb72f5dbf7-kube-api-access-c4brl\") pod \"redhat-marketplace-8d5wd\" (UID: \"f4c82774-ac3f-4330-b575-1cfb72f5dbf7\") " pod="openshift-marketplace/redhat-marketplace-8d5wd" Oct 09 15:22:18 crc kubenswrapper[4719]: I1009 15:22:18.093702 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4c82774-ac3f-4330-b575-1cfb72f5dbf7-utilities\") pod \"redhat-marketplace-8d5wd\" (UID: \"f4c82774-ac3f-4330-b575-1cfb72f5dbf7\") " pod="openshift-marketplace/redhat-marketplace-8d5wd" Oct 09 15:22:18 crc kubenswrapper[4719]: I1009 15:22:18.094161 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4c82774-ac3f-4330-b575-1cfb72f5dbf7-catalog-content\") pod \"redhat-marketplace-8d5wd\" (UID: \"f4c82774-ac3f-4330-b575-1cfb72f5dbf7\") " 
pod="openshift-marketplace/redhat-marketplace-8d5wd" Oct 09 15:22:18 crc kubenswrapper[4719]: I1009 15:22:18.094242 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4c82774-ac3f-4330-b575-1cfb72f5dbf7-utilities\") pod \"redhat-marketplace-8d5wd\" (UID: \"f4c82774-ac3f-4330-b575-1cfb72f5dbf7\") " pod="openshift-marketplace/redhat-marketplace-8d5wd" Oct 09 15:22:18 crc kubenswrapper[4719]: I1009 15:22:18.118182 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4brl\" (UniqueName: \"kubernetes.io/projected/f4c82774-ac3f-4330-b575-1cfb72f5dbf7-kube-api-access-c4brl\") pod \"redhat-marketplace-8d5wd\" (UID: \"f4c82774-ac3f-4330-b575-1cfb72f5dbf7\") " pod="openshift-marketplace/redhat-marketplace-8d5wd" Oct 09 15:22:18 crc kubenswrapper[4719]: I1009 15:22:18.160795 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8d5wd" Oct 09 15:22:18 crc kubenswrapper[4719]: I1009 15:22:18.194739 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rpnb\" (UniqueName: \"kubernetes.io/projected/430dd9b6-25a9-482d-8fa6-d2dec5d84507-kube-api-access-2rpnb\") pod \"certified-operators-zjkt4\" (UID: \"430dd9b6-25a9-482d-8fa6-d2dec5d84507\") " pod="openshift-marketplace/certified-operators-zjkt4" Oct 09 15:22:18 crc kubenswrapper[4719]: I1009 15:22:18.194813 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/430dd9b6-25a9-482d-8fa6-d2dec5d84507-catalog-content\") pod \"certified-operators-zjkt4\" (UID: \"430dd9b6-25a9-482d-8fa6-d2dec5d84507\") " pod="openshift-marketplace/certified-operators-zjkt4" Oct 09 15:22:18 crc kubenswrapper[4719]: I1009 15:22:18.194851 4719 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/430dd9b6-25a9-482d-8fa6-d2dec5d84507-utilities\") pod \"certified-operators-zjkt4\" (UID: \"430dd9b6-25a9-482d-8fa6-d2dec5d84507\") " pod="openshift-marketplace/certified-operators-zjkt4" Oct 09 15:22:18 crc kubenswrapper[4719]: I1009 15:22:18.298187 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/430dd9b6-25a9-482d-8fa6-d2dec5d84507-utilities\") pod \"certified-operators-zjkt4\" (UID: \"430dd9b6-25a9-482d-8fa6-d2dec5d84507\") " pod="openshift-marketplace/certified-operators-zjkt4" Oct 09 15:22:18 crc kubenswrapper[4719]: I1009 15:22:18.298292 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rpnb\" (UniqueName: \"kubernetes.io/projected/430dd9b6-25a9-482d-8fa6-d2dec5d84507-kube-api-access-2rpnb\") pod \"certified-operators-zjkt4\" (UID: \"430dd9b6-25a9-482d-8fa6-d2dec5d84507\") " pod="openshift-marketplace/certified-operators-zjkt4" Oct 09 15:22:18 crc kubenswrapper[4719]: I1009 15:22:18.298329 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/430dd9b6-25a9-482d-8fa6-d2dec5d84507-catalog-content\") pod \"certified-operators-zjkt4\" (UID: \"430dd9b6-25a9-482d-8fa6-d2dec5d84507\") " pod="openshift-marketplace/certified-operators-zjkt4" Oct 09 15:22:18 crc kubenswrapper[4719]: I1009 15:22:18.298780 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/430dd9b6-25a9-482d-8fa6-d2dec5d84507-catalog-content\") pod \"certified-operators-zjkt4\" (UID: \"430dd9b6-25a9-482d-8fa6-d2dec5d84507\") " pod="openshift-marketplace/certified-operators-zjkt4" Oct 09 15:22:18 crc kubenswrapper[4719]: I1009 15:22:18.298933 4719 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/430dd9b6-25a9-482d-8fa6-d2dec5d84507-utilities\") pod \"certified-operators-zjkt4\" (UID: \"430dd9b6-25a9-482d-8fa6-d2dec5d84507\") " pod="openshift-marketplace/certified-operators-zjkt4" Oct 09 15:22:18 crc kubenswrapper[4719]: I1009 15:22:18.329887 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rpnb\" (UniqueName: \"kubernetes.io/projected/430dd9b6-25a9-482d-8fa6-d2dec5d84507-kube-api-access-2rpnb\") pod \"certified-operators-zjkt4\" (UID: \"430dd9b6-25a9-482d-8fa6-d2dec5d84507\") " pod="openshift-marketplace/certified-operators-zjkt4" Oct 09 15:22:18 crc kubenswrapper[4719]: I1009 15:22:18.343696 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zjkt4" Oct 09 15:22:18 crc kubenswrapper[4719]: I1009 15:22:18.551727 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8d5wd"] Oct 09 15:22:18 crc kubenswrapper[4719]: W1009 15:22:18.559218 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4c82774_ac3f_4330_b575_1cfb72f5dbf7.slice/crio-be685797243e89b2de249e3f387d40e7c162b6472c8fd22afdd42d67da46f181 WatchSource:0}: Error finding container be685797243e89b2de249e3f387d40e7c162b6472c8fd22afdd42d67da46f181: Status 404 returned error can't find the container with id be685797243e89b2de249e3f387d40e7c162b6472c8fd22afdd42d67da46f181 Oct 09 15:22:18 crc kubenswrapper[4719]: I1009 15:22:18.732298 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zjkt4"] Oct 09 15:22:18 crc kubenswrapper[4719]: W1009 15:22:18.769341 4719 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod430dd9b6_25a9_482d_8fa6_d2dec5d84507.slice/crio-b4706ca360345e870404da7fafb4b646f30bdfcbd8f1e932de8977439c3065ad WatchSource:0}: Error finding container b4706ca360345e870404da7fafb4b646f30bdfcbd8f1e932de8977439c3065ad: Status 404 returned error can't find the container with id b4706ca360345e870404da7fafb4b646f30bdfcbd8f1e932de8977439c3065ad Oct 09 15:22:19 crc kubenswrapper[4719]: I1009 15:22:19.245620 4719 generic.go:334] "Generic (PLEG): container finished" podID="f4c82774-ac3f-4330-b575-1cfb72f5dbf7" containerID="af0cccc3db517ee3b835ec110742e2392b35bd1024cfc6c197d304c73462f29b" exitCode=0 Oct 09 15:22:19 crc kubenswrapper[4719]: I1009 15:22:19.245703 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8d5wd" event={"ID":"f4c82774-ac3f-4330-b575-1cfb72f5dbf7","Type":"ContainerDied","Data":"af0cccc3db517ee3b835ec110742e2392b35bd1024cfc6c197d304c73462f29b"} Oct 09 15:22:19 crc kubenswrapper[4719]: I1009 15:22:19.245786 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8d5wd" event={"ID":"f4c82774-ac3f-4330-b575-1cfb72f5dbf7","Type":"ContainerStarted","Data":"be685797243e89b2de249e3f387d40e7c162b6472c8fd22afdd42d67da46f181"} Oct 09 15:22:19 crc kubenswrapper[4719]: I1009 15:22:19.248009 4719 generic.go:334] "Generic (PLEG): container finished" podID="430dd9b6-25a9-482d-8fa6-d2dec5d84507" containerID="873a72bdf8e1d59739e4f586481f37c64810d29fa2e403650f82082be5cf0b4d" exitCode=0 Oct 09 15:22:19 crc kubenswrapper[4719]: I1009 15:22:19.248072 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zjkt4" event={"ID":"430dd9b6-25a9-482d-8fa6-d2dec5d84507","Type":"ContainerDied","Data":"873a72bdf8e1d59739e4f586481f37c64810d29fa2e403650f82082be5cf0b4d"} Oct 09 15:22:19 crc kubenswrapper[4719]: I1009 15:22:19.248110 4719 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/certified-operators-zjkt4" event={"ID":"430dd9b6-25a9-482d-8fa6-d2dec5d84507","Type":"ContainerStarted","Data":"b4706ca360345e870404da7fafb4b646f30bdfcbd8f1e932de8977439c3065ad"} Oct 09 15:22:20 crc kubenswrapper[4719]: I1009 15:22:20.225816 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vk29b"] Oct 09 15:22:20 crc kubenswrapper[4719]: I1009 15:22:20.227666 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vk29b" Oct 09 15:22:20 crc kubenswrapper[4719]: I1009 15:22:20.230076 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 09 15:22:20 crc kubenswrapper[4719]: I1009 15:22:20.238909 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vk29b"] Oct 09 15:22:20 crc kubenswrapper[4719]: I1009 15:22:20.283499 4719 generic.go:334] "Generic (PLEG): container finished" podID="f4c82774-ac3f-4330-b575-1cfb72f5dbf7" containerID="fe4a5a5dec5e804d05355a6c7153f1c271bbb2fbdef34024da5a3900446e7edf" exitCode=0 Oct 09 15:22:20 crc kubenswrapper[4719]: I1009 15:22:20.283563 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8d5wd" event={"ID":"f4c82774-ac3f-4330-b575-1cfb72f5dbf7","Type":"ContainerDied","Data":"fe4a5a5dec5e804d05355a6c7153f1c271bbb2fbdef34024da5a3900446e7edf"} Oct 09 15:22:20 crc kubenswrapper[4719]: I1009 15:22:20.423929 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f56vs"] Oct 09 15:22:20 crc kubenswrapper[4719]: I1009 15:22:20.425161 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f56vs" Oct 09 15:22:20 crc kubenswrapper[4719]: I1009 15:22:20.426977 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 09 15:22:20 crc kubenswrapper[4719]: I1009 15:22:20.431022 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x7cr\" (UniqueName: \"kubernetes.io/projected/0ad43ecb-75f5-4453-89e5-2c7891c537a7-kube-api-access-4x7cr\") pod \"redhat-operators-vk29b\" (UID: \"0ad43ecb-75f5-4453-89e5-2c7891c537a7\") " pod="openshift-marketplace/redhat-operators-vk29b" Oct 09 15:22:20 crc kubenswrapper[4719]: I1009 15:22:20.431061 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ad43ecb-75f5-4453-89e5-2c7891c537a7-catalog-content\") pod \"redhat-operators-vk29b\" (UID: \"0ad43ecb-75f5-4453-89e5-2c7891c537a7\") " pod="openshift-marketplace/redhat-operators-vk29b" Oct 09 15:22:20 crc kubenswrapper[4719]: I1009 15:22:20.431107 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ad43ecb-75f5-4453-89e5-2c7891c537a7-utilities\") pod \"redhat-operators-vk29b\" (UID: \"0ad43ecb-75f5-4453-89e5-2c7891c537a7\") " pod="openshift-marketplace/redhat-operators-vk29b" Oct 09 15:22:20 crc kubenswrapper[4719]: I1009 15:22:20.436079 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f56vs"] Oct 09 15:22:20 crc kubenswrapper[4719]: I1009 15:22:20.532343 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/400debb1-678f-4731-84d3-8d0b3c455305-utilities\") pod \"community-operators-f56vs\" (UID: 
\"400debb1-678f-4731-84d3-8d0b3c455305\") " pod="openshift-marketplace/community-operators-f56vs" Oct 09 15:22:20 crc kubenswrapper[4719]: I1009 15:22:20.532404 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ad43ecb-75f5-4453-89e5-2c7891c537a7-utilities\") pod \"redhat-operators-vk29b\" (UID: \"0ad43ecb-75f5-4453-89e5-2c7891c537a7\") " pod="openshift-marketplace/redhat-operators-vk29b" Oct 09 15:22:20 crc kubenswrapper[4719]: I1009 15:22:20.532462 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/400debb1-678f-4731-84d3-8d0b3c455305-catalog-content\") pod \"community-operators-f56vs\" (UID: \"400debb1-678f-4731-84d3-8d0b3c455305\") " pod="openshift-marketplace/community-operators-f56vs" Oct 09 15:22:20 crc kubenswrapper[4719]: I1009 15:22:20.532482 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x7cr\" (UniqueName: \"kubernetes.io/projected/0ad43ecb-75f5-4453-89e5-2c7891c537a7-kube-api-access-4x7cr\") pod \"redhat-operators-vk29b\" (UID: \"0ad43ecb-75f5-4453-89e5-2c7891c537a7\") " pod="openshift-marketplace/redhat-operators-vk29b" Oct 09 15:22:20 crc kubenswrapper[4719]: I1009 15:22:20.532503 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ad43ecb-75f5-4453-89e5-2c7891c537a7-catalog-content\") pod \"redhat-operators-vk29b\" (UID: \"0ad43ecb-75f5-4453-89e5-2c7891c537a7\") " pod="openshift-marketplace/redhat-operators-vk29b" Oct 09 15:22:20 crc kubenswrapper[4719]: I1009 15:22:20.532529 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmr66\" (UniqueName: \"kubernetes.io/projected/400debb1-678f-4731-84d3-8d0b3c455305-kube-api-access-cmr66\") pod 
\"community-operators-f56vs\" (UID: \"400debb1-678f-4731-84d3-8d0b3c455305\") " pod="openshift-marketplace/community-operators-f56vs" Oct 09 15:22:20 crc kubenswrapper[4719]: I1009 15:22:20.533564 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ad43ecb-75f5-4453-89e5-2c7891c537a7-utilities\") pod \"redhat-operators-vk29b\" (UID: \"0ad43ecb-75f5-4453-89e5-2c7891c537a7\") " pod="openshift-marketplace/redhat-operators-vk29b" Oct 09 15:22:20 crc kubenswrapper[4719]: I1009 15:22:20.533606 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ad43ecb-75f5-4453-89e5-2c7891c537a7-catalog-content\") pod \"redhat-operators-vk29b\" (UID: \"0ad43ecb-75f5-4453-89e5-2c7891c537a7\") " pod="openshift-marketplace/redhat-operators-vk29b" Oct 09 15:22:20 crc kubenswrapper[4719]: I1009 15:22:20.550233 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x7cr\" (UniqueName: \"kubernetes.io/projected/0ad43ecb-75f5-4453-89e5-2c7891c537a7-kube-api-access-4x7cr\") pod \"redhat-operators-vk29b\" (UID: \"0ad43ecb-75f5-4453-89e5-2c7891c537a7\") " pod="openshift-marketplace/redhat-operators-vk29b" Oct 09 15:22:20 crc kubenswrapper[4719]: I1009 15:22:20.633488 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/400debb1-678f-4731-84d3-8d0b3c455305-utilities\") pod \"community-operators-f56vs\" (UID: \"400debb1-678f-4731-84d3-8d0b3c455305\") " pod="openshift-marketplace/community-operators-f56vs" Oct 09 15:22:20 crc kubenswrapper[4719]: I1009 15:22:20.633587 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/400debb1-678f-4731-84d3-8d0b3c455305-catalog-content\") pod \"community-operators-f56vs\" (UID: 
\"400debb1-678f-4731-84d3-8d0b3c455305\") " pod="openshift-marketplace/community-operators-f56vs" Oct 09 15:22:20 crc kubenswrapper[4719]: I1009 15:22:20.633626 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmr66\" (UniqueName: \"kubernetes.io/projected/400debb1-678f-4731-84d3-8d0b3c455305-kube-api-access-cmr66\") pod \"community-operators-f56vs\" (UID: \"400debb1-678f-4731-84d3-8d0b3c455305\") " pod="openshift-marketplace/community-operators-f56vs" Oct 09 15:22:20 crc kubenswrapper[4719]: I1009 15:22:20.634099 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/400debb1-678f-4731-84d3-8d0b3c455305-catalog-content\") pod \"community-operators-f56vs\" (UID: \"400debb1-678f-4731-84d3-8d0b3c455305\") " pod="openshift-marketplace/community-operators-f56vs" Oct 09 15:22:20 crc kubenswrapper[4719]: I1009 15:22:20.634389 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/400debb1-678f-4731-84d3-8d0b3c455305-utilities\") pod \"community-operators-f56vs\" (UID: \"400debb1-678f-4731-84d3-8d0b3c455305\") " pod="openshift-marketplace/community-operators-f56vs" Oct 09 15:22:20 crc kubenswrapper[4719]: I1009 15:22:20.651378 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmr66\" (UniqueName: \"kubernetes.io/projected/400debb1-678f-4731-84d3-8d0b3c455305-kube-api-access-cmr66\") pod \"community-operators-f56vs\" (UID: \"400debb1-678f-4731-84d3-8d0b3c455305\") " pod="openshift-marketplace/community-operators-f56vs" Oct 09 15:22:20 crc kubenswrapper[4719]: I1009 15:22:20.743892 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f56vs" Oct 09 15:22:20 crc kubenswrapper[4719]: I1009 15:22:20.849805 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vk29b" Oct 09 15:22:20 crc kubenswrapper[4719]: I1009 15:22:20.910630 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f56vs"] Oct 09 15:22:21 crc kubenswrapper[4719]: I1009 15:22:21.259429 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vk29b"] Oct 09 15:22:21 crc kubenswrapper[4719]: W1009 15:22:21.267157 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ad43ecb_75f5_4453_89e5_2c7891c537a7.slice/crio-99f8b5fa38070037bb4787780875b8f7aaeb435e03cf33b3a72006239299a561 WatchSource:0}: Error finding container 99f8b5fa38070037bb4787780875b8f7aaeb435e03cf33b3a72006239299a561: Status 404 returned error can't find the container with id 99f8b5fa38070037bb4787780875b8f7aaeb435e03cf33b3a72006239299a561 Oct 09 15:22:21 crc kubenswrapper[4719]: I1009 15:22:21.298826 4719 generic.go:334] "Generic (PLEG): container finished" podID="430dd9b6-25a9-482d-8fa6-d2dec5d84507" containerID="07516255b1c26ef4837d9b95b89dcd17a4ab8d0d0784779ea44cce8a0cb9c2f2" exitCode=0 Oct 09 15:22:21 crc kubenswrapper[4719]: I1009 15:22:21.299030 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zjkt4" event={"ID":"430dd9b6-25a9-482d-8fa6-d2dec5d84507","Type":"ContainerDied","Data":"07516255b1c26ef4837d9b95b89dcd17a4ab8d0d0784779ea44cce8a0cb9c2f2"} Oct 09 15:22:21 crc kubenswrapper[4719]: I1009 15:22:21.301932 4719 generic.go:334] "Generic (PLEG): container finished" podID="400debb1-678f-4731-84d3-8d0b3c455305" containerID="5cda6afce16ce949c9e2d33375f32a69a6a61a4b06bb9ce8965221041ee23bd4" exitCode=0 Oct 09 15:22:21 crc kubenswrapper[4719]: I1009 15:22:21.301974 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f56vs" 
event={"ID":"400debb1-678f-4731-84d3-8d0b3c455305","Type":"ContainerDied","Data":"5cda6afce16ce949c9e2d33375f32a69a6a61a4b06bb9ce8965221041ee23bd4"} Oct 09 15:22:21 crc kubenswrapper[4719]: I1009 15:22:21.301994 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f56vs" event={"ID":"400debb1-678f-4731-84d3-8d0b3c455305","Type":"ContainerStarted","Data":"33be7a860e12b1a16cd97270ec95240a85f3b0920cf21ee52e9b1717f4347159"} Oct 09 15:22:21 crc kubenswrapper[4719]: I1009 15:22:21.310067 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vk29b" event={"ID":"0ad43ecb-75f5-4453-89e5-2c7891c537a7","Type":"ContainerStarted","Data":"99f8b5fa38070037bb4787780875b8f7aaeb435e03cf33b3a72006239299a561"} Oct 09 15:22:21 crc kubenswrapper[4719]: I1009 15:22:21.316969 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8d5wd" event={"ID":"f4c82774-ac3f-4330-b575-1cfb72f5dbf7","Type":"ContainerStarted","Data":"1a6fc4e59d4bd0d555d95ebb87323a0e835dd649fedd9bfcebf21971d9ee209f"} Oct 09 15:22:21 crc kubenswrapper[4719]: I1009 15:22:21.356314 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8d5wd" podStartSLOduration=2.928072285 podStartE2EDuration="4.3562654s" podCreationTimestamp="2025-10-09 15:22:17 +0000 UTC" firstStartedPulling="2025-10-09 15:22:19.247483671 +0000 UTC m=+244.757194956" lastFinishedPulling="2025-10-09 15:22:20.675676796 +0000 UTC m=+246.185388071" observedRunningTime="2025-10-09 15:22:21.352117123 +0000 UTC m=+246.861828408" watchObservedRunningTime="2025-10-09 15:22:21.3562654 +0000 UTC m=+246.865976685" Oct 09 15:22:22 crc kubenswrapper[4719]: I1009 15:22:22.322706 4719 generic.go:334] "Generic (PLEG): container finished" podID="0ad43ecb-75f5-4453-89e5-2c7891c537a7" containerID="99fa451f27f607614ae79c56846964dd0936b0429dd64d5ad8b1860ebbb544cb" exitCode=0 
Oct 09 15:22:22 crc kubenswrapper[4719]: I1009 15:22:22.322777 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vk29b" event={"ID":"0ad43ecb-75f5-4453-89e5-2c7891c537a7","Type":"ContainerDied","Data":"99fa451f27f607614ae79c56846964dd0936b0429dd64d5ad8b1860ebbb544cb"} Oct 09 15:22:22 crc kubenswrapper[4719]: I1009 15:22:22.330328 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zjkt4" event={"ID":"430dd9b6-25a9-482d-8fa6-d2dec5d84507","Type":"ContainerStarted","Data":"fec203b1a4a020f58d6ef7648d562cffda5f53c20f33fa2eb41bbbc9da2ea990"} Oct 09 15:22:22 crc kubenswrapper[4719]: I1009 15:22:22.332281 4719 generic.go:334] "Generic (PLEG): container finished" podID="400debb1-678f-4731-84d3-8d0b3c455305" containerID="7f2e096a67d6ff2869da0ebe3b4db4a486af68b11e1f61ca6a502cbfbfb98bd0" exitCode=0 Oct 09 15:22:22 crc kubenswrapper[4719]: I1009 15:22:22.333244 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f56vs" event={"ID":"400debb1-678f-4731-84d3-8d0b3c455305","Type":"ContainerDied","Data":"7f2e096a67d6ff2869da0ebe3b4db4a486af68b11e1f61ca6a502cbfbfb98bd0"} Oct 09 15:22:22 crc kubenswrapper[4719]: I1009 15:22:22.369718 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zjkt4" podStartSLOduration=1.891984715 podStartE2EDuration="4.369701521s" podCreationTimestamp="2025-10-09 15:22:18 +0000 UTC" firstStartedPulling="2025-10-09 15:22:19.249382859 +0000 UTC m=+244.759094144" lastFinishedPulling="2025-10-09 15:22:21.727099665 +0000 UTC m=+247.236810950" observedRunningTime="2025-10-09 15:22:22.365792651 +0000 UTC m=+247.875503936" watchObservedRunningTime="2025-10-09 15:22:22.369701521 +0000 UTC m=+247.879412816" Oct 09 15:22:24 crc kubenswrapper[4719]: I1009 15:22:24.344260 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-f56vs" event={"ID":"400debb1-678f-4731-84d3-8d0b3c455305","Type":"ContainerStarted","Data":"6899fbb659238ea4a8790938798ae560e2b559a23b9a9186be420ead89e6a380"} Oct 09 15:22:24 crc kubenswrapper[4719]: I1009 15:22:24.346494 4719 generic.go:334] "Generic (PLEG): container finished" podID="0ad43ecb-75f5-4453-89e5-2c7891c537a7" containerID="7963627e9df815ea1c5ce3da578bd6797dea86d92961f7f0f93b41f030f4a12e" exitCode=0 Oct 09 15:22:24 crc kubenswrapper[4719]: I1009 15:22:24.346530 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vk29b" event={"ID":"0ad43ecb-75f5-4453-89e5-2c7891c537a7","Type":"ContainerDied","Data":"7963627e9df815ea1c5ce3da578bd6797dea86d92961f7f0f93b41f030f4a12e"} Oct 09 15:22:24 crc kubenswrapper[4719]: I1009 15:22:24.363748 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f56vs" podStartSLOduration=2.943250215 podStartE2EDuration="4.363719093s" podCreationTimestamp="2025-10-09 15:22:20 +0000 UTC" firstStartedPulling="2025-10-09 15:22:21.303279232 +0000 UTC m=+246.812990517" lastFinishedPulling="2025-10-09 15:22:22.72374811 +0000 UTC m=+248.233459395" observedRunningTime="2025-10-09 15:22:24.360663489 +0000 UTC m=+249.870374784" watchObservedRunningTime="2025-10-09 15:22:24.363719093 +0000 UTC m=+249.873430368" Oct 09 15:22:25 crc kubenswrapper[4719]: I1009 15:22:25.353281 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vk29b" event={"ID":"0ad43ecb-75f5-4453-89e5-2c7891c537a7","Type":"ContainerStarted","Data":"513b9fe0a66b73be918e7e2eefd26dbef03c225ddf01357c85601fd6a6abd7e4"} Oct 09 15:22:25 crc kubenswrapper[4719]: I1009 15:22:25.383725 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vk29b" podStartSLOduration=2.975139506 podStartE2EDuration="5.383704246s" 
podCreationTimestamp="2025-10-09 15:22:20 +0000 UTC" firstStartedPulling="2025-10-09 15:22:22.324192713 +0000 UTC m=+247.833903998" lastFinishedPulling="2025-10-09 15:22:24.732757453 +0000 UTC m=+250.242468738" observedRunningTime="2025-10-09 15:22:25.376954388 +0000 UTC m=+250.886665683" watchObservedRunningTime="2025-10-09 15:22:25.383704246 +0000 UTC m=+250.893415531" Oct 09 15:22:28 crc kubenswrapper[4719]: I1009 15:22:28.161093 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8d5wd" Oct 09 15:22:28 crc kubenswrapper[4719]: I1009 15:22:28.161449 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8d5wd" Oct 09 15:22:28 crc kubenswrapper[4719]: I1009 15:22:28.203471 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8d5wd" Oct 09 15:22:28 crc kubenswrapper[4719]: I1009 15:22:28.344131 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zjkt4" Oct 09 15:22:28 crc kubenswrapper[4719]: I1009 15:22:28.344211 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zjkt4" Oct 09 15:22:28 crc kubenswrapper[4719]: I1009 15:22:28.387487 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zjkt4" Oct 09 15:22:28 crc kubenswrapper[4719]: I1009 15:22:28.409428 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8d5wd" Oct 09 15:22:28 crc kubenswrapper[4719]: I1009 15:22:28.423971 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zjkt4" Oct 09 15:22:30 crc kubenswrapper[4719]: I1009 15:22:30.744048 4719 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-f56vs" Oct 09 15:22:30 crc kubenswrapper[4719]: I1009 15:22:30.744412 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f56vs" Oct 09 15:22:30 crc kubenswrapper[4719]: I1009 15:22:30.787526 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f56vs" Oct 09 15:22:30 crc kubenswrapper[4719]: I1009 15:22:30.850775 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vk29b" Oct 09 15:22:30 crc kubenswrapper[4719]: I1009 15:22:30.850818 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vk29b" Oct 09 15:22:30 crc kubenswrapper[4719]: I1009 15:22:30.885972 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vk29b" Oct 09 15:22:31 crc kubenswrapper[4719]: I1009 15:22:31.417861 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f56vs" Oct 09 15:22:31 crc kubenswrapper[4719]: I1009 15:22:31.418139 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vk29b" Oct 09 15:24:06 crc kubenswrapper[4719]: I1009 15:24:06.976662 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 15:24:06 crc kubenswrapper[4719]: I1009 15:24:06.977246 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 15:24:36 crc kubenswrapper[4719]: I1009 15:24:36.977114 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 15:24:36 crc kubenswrapper[4719]: I1009 15:24:36.977610 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 15:25:06 crc kubenswrapper[4719]: I1009 15:25:06.976495 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 15:25:06 crc kubenswrapper[4719]: I1009 15:25:06.976938 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 15:25:06 crc kubenswrapper[4719]: I1009 15:25:06.976982 4719 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" Oct 09 15:25:06 crc kubenswrapper[4719]: I1009 15:25:06.978460 4719 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"58630cc589d6ba8e40a40e1e3c93cc21531a1b6e5470575e2e8a4d654789d22a"} pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 15:25:06 crc kubenswrapper[4719]: I1009 15:25:06.978540 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" containerID="cri-o://58630cc589d6ba8e40a40e1e3c93cc21531a1b6e5470575e2e8a4d654789d22a" gracePeriod=600 Oct 09 15:25:07 crc kubenswrapper[4719]: I1009 15:25:07.149541 4719 generic.go:334] "Generic (PLEG): container finished" podID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerID="58630cc589d6ba8e40a40e1e3c93cc21531a1b6e5470575e2e8a4d654789d22a" exitCode=0 Oct 09 15:25:07 crc kubenswrapper[4719]: I1009 15:25:07.149628 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerDied","Data":"58630cc589d6ba8e40a40e1e3c93cc21531a1b6e5470575e2e8a4d654789d22a"} Oct 09 15:25:07 crc kubenswrapper[4719]: I1009 15:25:07.149926 4719 scope.go:117] "RemoveContainer" containerID="b8b3908283c24f180df8f6a04d52c46e7252cdfd4f0587f7cccf3e9a0f37127a" Oct 09 15:25:08 crc kubenswrapper[4719]: I1009 15:25:08.162166 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerStarted","Data":"8f170c1640fe33d3a488ada64feda2ff2ffd3c8d5fb1e790430375c1ffcc2527"} Oct 09 15:26:30 crc kubenswrapper[4719]: I1009 15:26:30.579994 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-kgw9v"] Oct 09 
15:26:30 crc kubenswrapper[4719]: I1009 15:26:30.581378 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-kgw9v" Oct 09 15:26:30 crc kubenswrapper[4719]: I1009 15:26:30.592025 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-kgw9v"] Oct 09 15:26:30 crc kubenswrapper[4719]: I1009 15:26:30.690808 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/df3c1bf4-8277-40ec-ac89-19235ca9fb69-ca-trust-extracted\") pod \"image-registry-66df7c8f76-kgw9v\" (UID: \"df3c1bf4-8277-40ec-ac89-19235ca9fb69\") " pod="openshift-image-registry/image-registry-66df7c8f76-kgw9v" Oct 09 15:26:30 crc kubenswrapper[4719]: I1009 15:26:30.690868 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-kgw9v\" (UID: \"df3c1bf4-8277-40ec-ac89-19235ca9fb69\") " pod="openshift-image-registry/image-registry-66df7c8f76-kgw9v" Oct 09 15:26:30 crc kubenswrapper[4719]: I1009 15:26:30.690890 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/df3c1bf4-8277-40ec-ac89-19235ca9fb69-registry-tls\") pod \"image-registry-66df7c8f76-kgw9v\" (UID: \"df3c1bf4-8277-40ec-ac89-19235ca9fb69\") " pod="openshift-image-registry/image-registry-66df7c8f76-kgw9v" Oct 09 15:26:30 crc kubenswrapper[4719]: I1009 15:26:30.690919 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/df3c1bf4-8277-40ec-ac89-19235ca9fb69-installation-pull-secrets\") pod 
\"image-registry-66df7c8f76-kgw9v\" (UID: \"df3c1bf4-8277-40ec-ac89-19235ca9fb69\") " pod="openshift-image-registry/image-registry-66df7c8f76-kgw9v" Oct 09 15:26:30 crc kubenswrapper[4719]: I1009 15:26:30.690955 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spvxb\" (UniqueName: \"kubernetes.io/projected/df3c1bf4-8277-40ec-ac89-19235ca9fb69-kube-api-access-spvxb\") pod \"image-registry-66df7c8f76-kgw9v\" (UID: \"df3c1bf4-8277-40ec-ac89-19235ca9fb69\") " pod="openshift-image-registry/image-registry-66df7c8f76-kgw9v" Oct 09 15:26:30 crc kubenswrapper[4719]: I1009 15:26:30.690975 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/df3c1bf4-8277-40ec-ac89-19235ca9fb69-registry-certificates\") pod \"image-registry-66df7c8f76-kgw9v\" (UID: \"df3c1bf4-8277-40ec-ac89-19235ca9fb69\") " pod="openshift-image-registry/image-registry-66df7c8f76-kgw9v" Oct 09 15:26:30 crc kubenswrapper[4719]: I1009 15:26:30.690992 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df3c1bf4-8277-40ec-ac89-19235ca9fb69-trusted-ca\") pod \"image-registry-66df7c8f76-kgw9v\" (UID: \"df3c1bf4-8277-40ec-ac89-19235ca9fb69\") " pod="openshift-image-registry/image-registry-66df7c8f76-kgw9v" Oct 09 15:26:30 crc kubenswrapper[4719]: I1009 15:26:30.691183 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/df3c1bf4-8277-40ec-ac89-19235ca9fb69-bound-sa-token\") pod \"image-registry-66df7c8f76-kgw9v\" (UID: \"df3c1bf4-8277-40ec-ac89-19235ca9fb69\") " pod="openshift-image-registry/image-registry-66df7c8f76-kgw9v" Oct 09 15:26:30 crc kubenswrapper[4719]: I1009 15:26:30.710593 4719 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-kgw9v\" (UID: \"df3c1bf4-8277-40ec-ac89-19235ca9fb69\") " pod="openshift-image-registry/image-registry-66df7c8f76-kgw9v" Oct 09 15:26:30 crc kubenswrapper[4719]: I1009 15:26:30.792136 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/df3c1bf4-8277-40ec-ac89-19235ca9fb69-ca-trust-extracted\") pod \"image-registry-66df7c8f76-kgw9v\" (UID: \"df3c1bf4-8277-40ec-ac89-19235ca9fb69\") " pod="openshift-image-registry/image-registry-66df7c8f76-kgw9v" Oct 09 15:26:30 crc kubenswrapper[4719]: I1009 15:26:30.792188 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/df3c1bf4-8277-40ec-ac89-19235ca9fb69-registry-tls\") pod \"image-registry-66df7c8f76-kgw9v\" (UID: \"df3c1bf4-8277-40ec-ac89-19235ca9fb69\") " pod="openshift-image-registry/image-registry-66df7c8f76-kgw9v" Oct 09 15:26:30 crc kubenswrapper[4719]: I1009 15:26:30.792214 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/df3c1bf4-8277-40ec-ac89-19235ca9fb69-installation-pull-secrets\") pod \"image-registry-66df7c8f76-kgw9v\" (UID: \"df3c1bf4-8277-40ec-ac89-19235ca9fb69\") " pod="openshift-image-registry/image-registry-66df7c8f76-kgw9v" Oct 09 15:26:30 crc kubenswrapper[4719]: I1009 15:26:30.792258 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spvxb\" (UniqueName: \"kubernetes.io/projected/df3c1bf4-8277-40ec-ac89-19235ca9fb69-kube-api-access-spvxb\") pod \"image-registry-66df7c8f76-kgw9v\" (UID: \"df3c1bf4-8277-40ec-ac89-19235ca9fb69\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-kgw9v" Oct 09 15:26:30 crc kubenswrapper[4719]: I1009 15:26:30.792278 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/df3c1bf4-8277-40ec-ac89-19235ca9fb69-registry-certificates\") pod \"image-registry-66df7c8f76-kgw9v\" (UID: \"df3c1bf4-8277-40ec-ac89-19235ca9fb69\") " pod="openshift-image-registry/image-registry-66df7c8f76-kgw9v" Oct 09 15:26:30 crc kubenswrapper[4719]: I1009 15:26:30.792293 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df3c1bf4-8277-40ec-ac89-19235ca9fb69-trusted-ca\") pod \"image-registry-66df7c8f76-kgw9v\" (UID: \"df3c1bf4-8277-40ec-ac89-19235ca9fb69\") " pod="openshift-image-registry/image-registry-66df7c8f76-kgw9v" Oct 09 15:26:30 crc kubenswrapper[4719]: I1009 15:26:30.792316 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/df3c1bf4-8277-40ec-ac89-19235ca9fb69-bound-sa-token\") pod \"image-registry-66df7c8f76-kgw9v\" (UID: \"df3c1bf4-8277-40ec-ac89-19235ca9fb69\") " pod="openshift-image-registry/image-registry-66df7c8f76-kgw9v" Oct 09 15:26:30 crc kubenswrapper[4719]: I1009 15:26:30.792721 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/df3c1bf4-8277-40ec-ac89-19235ca9fb69-ca-trust-extracted\") pod \"image-registry-66df7c8f76-kgw9v\" (UID: \"df3c1bf4-8277-40ec-ac89-19235ca9fb69\") " pod="openshift-image-registry/image-registry-66df7c8f76-kgw9v" Oct 09 15:26:30 crc kubenswrapper[4719]: I1009 15:26:30.793571 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/df3c1bf4-8277-40ec-ac89-19235ca9fb69-registry-certificates\") pod 
\"image-registry-66df7c8f76-kgw9v\" (UID: \"df3c1bf4-8277-40ec-ac89-19235ca9fb69\") " pod="openshift-image-registry/image-registry-66df7c8f76-kgw9v" Oct 09 15:26:30 crc kubenswrapper[4719]: I1009 15:26:30.793747 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df3c1bf4-8277-40ec-ac89-19235ca9fb69-trusted-ca\") pod \"image-registry-66df7c8f76-kgw9v\" (UID: \"df3c1bf4-8277-40ec-ac89-19235ca9fb69\") " pod="openshift-image-registry/image-registry-66df7c8f76-kgw9v" Oct 09 15:26:30 crc kubenswrapper[4719]: I1009 15:26:30.799099 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/df3c1bf4-8277-40ec-ac89-19235ca9fb69-registry-tls\") pod \"image-registry-66df7c8f76-kgw9v\" (UID: \"df3c1bf4-8277-40ec-ac89-19235ca9fb69\") " pod="openshift-image-registry/image-registry-66df7c8f76-kgw9v" Oct 09 15:26:30 crc kubenswrapper[4719]: I1009 15:26:30.805409 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/df3c1bf4-8277-40ec-ac89-19235ca9fb69-installation-pull-secrets\") pod \"image-registry-66df7c8f76-kgw9v\" (UID: \"df3c1bf4-8277-40ec-ac89-19235ca9fb69\") " pod="openshift-image-registry/image-registry-66df7c8f76-kgw9v" Oct 09 15:26:30 crc kubenswrapper[4719]: I1009 15:26:30.807785 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/df3c1bf4-8277-40ec-ac89-19235ca9fb69-bound-sa-token\") pod \"image-registry-66df7c8f76-kgw9v\" (UID: \"df3c1bf4-8277-40ec-ac89-19235ca9fb69\") " pod="openshift-image-registry/image-registry-66df7c8f76-kgw9v" Oct 09 15:26:30 crc kubenswrapper[4719]: I1009 15:26:30.808081 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spvxb\" (UniqueName: 
\"kubernetes.io/projected/df3c1bf4-8277-40ec-ac89-19235ca9fb69-kube-api-access-spvxb\") pod \"image-registry-66df7c8f76-kgw9v\" (UID: \"df3c1bf4-8277-40ec-ac89-19235ca9fb69\") " pod="openshift-image-registry/image-registry-66df7c8f76-kgw9v" Oct 09 15:26:30 crc kubenswrapper[4719]: I1009 15:26:30.898470 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-kgw9v" Oct 09 15:26:31 crc kubenswrapper[4719]: I1009 15:26:31.069604 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-kgw9v"] Oct 09 15:26:31 crc kubenswrapper[4719]: I1009 15:26:31.580181 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-kgw9v" event={"ID":"df3c1bf4-8277-40ec-ac89-19235ca9fb69","Type":"ContainerStarted","Data":"dbc430c6795b035ace5b2063b704bb1e9c2e403ef9bc19b6c406e2e083429523"} Oct 09 15:26:31 crc kubenswrapper[4719]: I1009 15:26:31.580236 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-kgw9v" event={"ID":"df3c1bf4-8277-40ec-ac89-19235ca9fb69","Type":"ContainerStarted","Data":"e11163d754b91b78e9423ff4404fcfde4c300145b41c58b6ad6b6f98d154648b"} Oct 09 15:26:31 crc kubenswrapper[4719]: I1009 15:26:31.580363 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-kgw9v" Oct 09 15:26:31 crc kubenswrapper[4719]: I1009 15:26:31.601948 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-kgw9v" podStartSLOduration=1.601926876 podStartE2EDuration="1.601926876s" podCreationTimestamp="2025-10-09 15:26:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:26:31.600600244 +0000 UTC m=+497.110311549" 
watchObservedRunningTime="2025-10-09 15:26:31.601926876 +0000 UTC m=+497.111638181" Oct 09 15:26:50 crc kubenswrapper[4719]: I1009 15:26:50.902537 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-kgw9v" Oct 09 15:26:50 crc kubenswrapper[4719]: I1009 15:26:50.950046 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cqrnr"] Oct 09 15:27:15 crc kubenswrapper[4719]: I1009 15:27:15.262332 4719 scope.go:117] "RemoveContainer" containerID="368651fe1e7a4e823d7ef7fc1036b74d1ed08a186e6c6f7bbee4c752bb869142" Oct 09 15:27:15 crc kubenswrapper[4719]: I1009 15:27:15.985694 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" podUID="92f1494f-b7f7-4e94-90ce-132cc3a14a62" containerName="registry" containerID="cri-o://73a707c6d3843d9bd942924f9d2ef2134ef47e4dfd97c0dbc3520eb26069c5fa" gracePeriod=30 Oct 09 15:27:16 crc kubenswrapper[4719]: I1009 15:27:16.301306 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:27:16 crc kubenswrapper[4719]: I1009 15:27:16.486275 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " Oct 09 15:27:16 crc kubenswrapper[4719]: I1009 15:27:16.486315 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/92f1494f-b7f7-4e94-90ce-132cc3a14a62-registry-tls\") pod \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " Oct 09 15:27:16 crc kubenswrapper[4719]: I1009 15:27:16.486377 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxr5n\" (UniqueName: \"kubernetes.io/projected/92f1494f-b7f7-4e94-90ce-132cc3a14a62-kube-api-access-nxr5n\") pod \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " Oct 09 15:27:16 crc kubenswrapper[4719]: I1009 15:27:16.486440 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/92f1494f-b7f7-4e94-90ce-132cc3a14a62-bound-sa-token\") pod \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " Oct 09 15:27:16 crc kubenswrapper[4719]: I1009 15:27:16.486464 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/92f1494f-b7f7-4e94-90ce-132cc3a14a62-trusted-ca\") pod \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " Oct 09 15:27:16 crc kubenswrapper[4719]: I1009 15:27:16.486483 4719 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/92f1494f-b7f7-4e94-90ce-132cc3a14a62-ca-trust-extracted\") pod \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " Oct 09 15:27:16 crc kubenswrapper[4719]: I1009 15:27:16.486502 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/92f1494f-b7f7-4e94-90ce-132cc3a14a62-installation-pull-secrets\") pod \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " Oct 09 15:27:16 crc kubenswrapper[4719]: I1009 15:27:16.486541 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/92f1494f-b7f7-4e94-90ce-132cc3a14a62-registry-certificates\") pod \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\" (UID: \"92f1494f-b7f7-4e94-90ce-132cc3a14a62\") " Oct 09 15:27:16 crc kubenswrapper[4719]: I1009 15:27:16.487760 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92f1494f-b7f7-4e94-90ce-132cc3a14a62-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "92f1494f-b7f7-4e94-90ce-132cc3a14a62" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:27:16 crc kubenswrapper[4719]: I1009 15:27:16.487889 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92f1494f-b7f7-4e94-90ce-132cc3a14a62-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "92f1494f-b7f7-4e94-90ce-132cc3a14a62" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:27:16 crc kubenswrapper[4719]: I1009 15:27:16.494656 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92f1494f-b7f7-4e94-90ce-132cc3a14a62-kube-api-access-nxr5n" (OuterVolumeSpecName: "kube-api-access-nxr5n") pod "92f1494f-b7f7-4e94-90ce-132cc3a14a62" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62"). InnerVolumeSpecName "kube-api-access-nxr5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:27:16 crc kubenswrapper[4719]: I1009 15:27:16.499489 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92f1494f-b7f7-4e94-90ce-132cc3a14a62-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "92f1494f-b7f7-4e94-90ce-132cc3a14a62" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:27:16 crc kubenswrapper[4719]: I1009 15:27:16.499682 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92f1494f-b7f7-4e94-90ce-132cc3a14a62-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "92f1494f-b7f7-4e94-90ce-132cc3a14a62" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:27:16 crc kubenswrapper[4719]: I1009 15:27:16.503183 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "92f1494f-b7f7-4e94-90ce-132cc3a14a62" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 09 15:27:16 crc kubenswrapper[4719]: I1009 15:27:16.503196 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92f1494f-b7f7-4e94-90ce-132cc3a14a62-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "92f1494f-b7f7-4e94-90ce-132cc3a14a62" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:27:16 crc kubenswrapper[4719]: I1009 15:27:16.510861 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92f1494f-b7f7-4e94-90ce-132cc3a14a62-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "92f1494f-b7f7-4e94-90ce-132cc3a14a62" (UID: "92f1494f-b7f7-4e94-90ce-132cc3a14a62"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:27:16 crc kubenswrapper[4719]: I1009 15:27:16.587520 4719 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/92f1494f-b7f7-4e94-90ce-132cc3a14a62-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 09 15:27:16 crc kubenswrapper[4719]: I1009 15:27:16.587554 4719 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/92f1494f-b7f7-4e94-90ce-132cc3a14a62-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 09 15:27:16 crc kubenswrapper[4719]: I1009 15:27:16.587564 4719 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/92f1494f-b7f7-4e94-90ce-132cc3a14a62-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 09 15:27:16 crc kubenswrapper[4719]: I1009 15:27:16.587575 4719 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/92f1494f-b7f7-4e94-90ce-132cc3a14a62-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 09 15:27:16 crc kubenswrapper[4719]: I1009 15:27:16.587585 4719 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/92f1494f-b7f7-4e94-90ce-132cc3a14a62-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 09 15:27:16 crc kubenswrapper[4719]: I1009 15:27:16.587594 4719 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/92f1494f-b7f7-4e94-90ce-132cc3a14a62-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 09 15:27:16 crc kubenswrapper[4719]: I1009 15:27:16.587602 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxr5n\" (UniqueName: \"kubernetes.io/projected/92f1494f-b7f7-4e94-90ce-132cc3a14a62-kube-api-access-nxr5n\") on node \"crc\" DevicePath \"\"" Oct 09 15:27:16 crc kubenswrapper[4719]: I1009 15:27:16.816275 4719 generic.go:334] "Generic (PLEG): container finished" podID="92f1494f-b7f7-4e94-90ce-132cc3a14a62" containerID="73a707c6d3843d9bd942924f9d2ef2134ef47e4dfd97c0dbc3520eb26069c5fa" exitCode=0 Oct 09 15:27:16 crc kubenswrapper[4719]: I1009 15:27:16.816317 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" event={"ID":"92f1494f-b7f7-4e94-90ce-132cc3a14a62","Type":"ContainerDied","Data":"73a707c6d3843d9bd942924f9d2ef2134ef47e4dfd97c0dbc3520eb26069c5fa"} Oct 09 15:27:16 crc kubenswrapper[4719]: I1009 15:27:16.816380 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" event={"ID":"92f1494f-b7f7-4e94-90ce-132cc3a14a62","Type":"ContainerDied","Data":"ffdb6ed3d78a73e8b850ec7354ead2b7cc4ffbff6a4b837f4ce909549944a38f"} Oct 09 15:27:16 crc kubenswrapper[4719]: I1009 15:27:16.816396 4719 scope.go:117] "RemoveContainer" 
containerID="73a707c6d3843d9bd942924f9d2ef2134ef47e4dfd97c0dbc3520eb26069c5fa" Oct 09 15:27:16 crc kubenswrapper[4719]: I1009 15:27:16.816744 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-cqrnr" Oct 09 15:27:16 crc kubenswrapper[4719]: I1009 15:27:16.832412 4719 scope.go:117] "RemoveContainer" containerID="73a707c6d3843d9bd942924f9d2ef2134ef47e4dfd97c0dbc3520eb26069c5fa" Oct 09 15:27:16 crc kubenswrapper[4719]: E1009 15:27:16.832893 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73a707c6d3843d9bd942924f9d2ef2134ef47e4dfd97c0dbc3520eb26069c5fa\": container with ID starting with 73a707c6d3843d9bd942924f9d2ef2134ef47e4dfd97c0dbc3520eb26069c5fa not found: ID does not exist" containerID="73a707c6d3843d9bd942924f9d2ef2134ef47e4dfd97c0dbc3520eb26069c5fa" Oct 09 15:27:16 crc kubenswrapper[4719]: I1009 15:27:16.833127 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73a707c6d3843d9bd942924f9d2ef2134ef47e4dfd97c0dbc3520eb26069c5fa"} err="failed to get container status \"73a707c6d3843d9bd942924f9d2ef2134ef47e4dfd97c0dbc3520eb26069c5fa\": rpc error: code = NotFound desc = could not find container \"73a707c6d3843d9bd942924f9d2ef2134ef47e4dfd97c0dbc3520eb26069c5fa\": container with ID starting with 73a707c6d3843d9bd942924f9d2ef2134ef47e4dfd97c0dbc3520eb26069c5fa not found: ID does not exist" Oct 09 15:27:16 crc kubenswrapper[4719]: I1009 15:27:16.847506 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cqrnr"] Oct 09 15:27:16 crc kubenswrapper[4719]: I1009 15:27:16.850300 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cqrnr"] Oct 09 15:27:17 crc kubenswrapper[4719]: I1009 15:27:17.167898 4719 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="92f1494f-b7f7-4e94-90ce-132cc3a14a62" path="/var/lib/kubelet/pods/92f1494f-b7f7-4e94-90ce-132cc3a14a62/volumes" Oct 09 15:27:36 crc kubenswrapper[4719]: I1009 15:27:36.976904 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 15:27:36 crc kubenswrapper[4719]: I1009 15:27:36.978581 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 15:27:45 crc kubenswrapper[4719]: I1009 15:27:45.169298 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-mr6c9"] Oct 09 15:27:45 crc kubenswrapper[4719]: E1009 15:27:45.170004 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92f1494f-b7f7-4e94-90ce-132cc3a14a62" containerName="registry" Oct 09 15:27:45 crc kubenswrapper[4719]: I1009 15:27:45.170015 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="92f1494f-b7f7-4e94-90ce-132cc3a14a62" containerName="registry" Oct 09 15:27:45 crc kubenswrapper[4719]: I1009 15:27:45.170103 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="92f1494f-b7f7-4e94-90ce-132cc3a14a62" containerName="registry" Oct 09 15:27:45 crc kubenswrapper[4719]: I1009 15:27:45.170446 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-mr6c9" Oct 09 15:27:45 crc kubenswrapper[4719]: I1009 15:27:45.172868 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 09 15:27:45 crc kubenswrapper[4719]: I1009 15:27:45.173096 4719 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-972m7" Oct 09 15:27:45 crc kubenswrapper[4719]: I1009 15:27:45.173189 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 09 15:27:45 crc kubenswrapper[4719]: I1009 15:27:45.184998 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-2kc72"] Oct 09 15:27:45 crc kubenswrapper[4719]: I1009 15:27:45.185808 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-2kc72" Oct 09 15:27:45 crc kubenswrapper[4719]: I1009 15:27:45.187823 4719 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-4b5nz" Oct 09 15:27:45 crc kubenswrapper[4719]: I1009 15:27:45.189674 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-mr6c9"] Oct 09 15:27:45 crc kubenswrapper[4719]: I1009 15:27:45.197428 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-2kc72"] Oct 09 15:27:45 crc kubenswrapper[4719]: I1009 15:27:45.200773 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-s42jm"] Oct 09 15:27:45 crc kubenswrapper[4719]: I1009 15:27:45.201546 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-s42jm" Oct 09 15:27:45 crc kubenswrapper[4719]: I1009 15:27:45.203652 4719 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-4pzsg" Oct 09 15:27:45 crc kubenswrapper[4719]: I1009 15:27:45.228492 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-s42jm"] Oct 09 15:27:45 crc kubenswrapper[4719]: I1009 15:27:45.254398 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blzpp\" (UniqueName: \"kubernetes.io/projected/a2b1f94e-9754-4aeb-9d99-a5c2258290ca-kube-api-access-blzpp\") pod \"cert-manager-cainjector-7f985d654d-mr6c9\" (UID: \"a2b1f94e-9754-4aeb-9d99-a5c2258290ca\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-mr6c9" Oct 09 15:27:45 crc kubenswrapper[4719]: I1009 15:27:45.254479 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9rf4\" (UniqueName: \"kubernetes.io/projected/79d0f7b5-f165-44ee-8220-f31bcc6df1fd-kube-api-access-w9rf4\") pod \"cert-manager-webhook-5655c58dd6-s42jm\" (UID: \"79d0f7b5-f165-44ee-8220-f31bcc6df1fd\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-s42jm" Oct 09 15:27:45 crc kubenswrapper[4719]: I1009 15:27:45.254522 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqxl2\" (UniqueName: \"kubernetes.io/projected/419868d8-7886-45fb-be57-2c476ba8d305-kube-api-access-zqxl2\") pod \"cert-manager-5b446d88c5-2kc72\" (UID: \"419868d8-7886-45fb-be57-2c476ba8d305\") " pod="cert-manager/cert-manager-5b446d88c5-2kc72" Oct 09 15:27:45 crc kubenswrapper[4719]: I1009 15:27:45.356236 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqxl2\" (UniqueName: 
\"kubernetes.io/projected/419868d8-7886-45fb-be57-2c476ba8d305-kube-api-access-zqxl2\") pod \"cert-manager-5b446d88c5-2kc72\" (UID: \"419868d8-7886-45fb-be57-2c476ba8d305\") " pod="cert-manager/cert-manager-5b446d88c5-2kc72" Oct 09 15:27:45 crc kubenswrapper[4719]: I1009 15:27:45.356381 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blzpp\" (UniqueName: \"kubernetes.io/projected/a2b1f94e-9754-4aeb-9d99-a5c2258290ca-kube-api-access-blzpp\") pod \"cert-manager-cainjector-7f985d654d-mr6c9\" (UID: \"a2b1f94e-9754-4aeb-9d99-a5c2258290ca\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-mr6c9" Oct 09 15:27:45 crc kubenswrapper[4719]: I1009 15:27:45.356432 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9rf4\" (UniqueName: \"kubernetes.io/projected/79d0f7b5-f165-44ee-8220-f31bcc6df1fd-kube-api-access-w9rf4\") pod \"cert-manager-webhook-5655c58dd6-s42jm\" (UID: \"79d0f7b5-f165-44ee-8220-f31bcc6df1fd\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-s42jm" Oct 09 15:27:45 crc kubenswrapper[4719]: I1009 15:27:45.385502 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqxl2\" (UniqueName: \"kubernetes.io/projected/419868d8-7886-45fb-be57-2c476ba8d305-kube-api-access-zqxl2\") pod \"cert-manager-5b446d88c5-2kc72\" (UID: \"419868d8-7886-45fb-be57-2c476ba8d305\") " pod="cert-manager/cert-manager-5b446d88c5-2kc72" Oct 09 15:27:45 crc kubenswrapper[4719]: I1009 15:27:45.385679 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9rf4\" (UniqueName: \"kubernetes.io/projected/79d0f7b5-f165-44ee-8220-f31bcc6df1fd-kube-api-access-w9rf4\") pod \"cert-manager-webhook-5655c58dd6-s42jm\" (UID: \"79d0f7b5-f165-44ee-8220-f31bcc6df1fd\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-s42jm" Oct 09 15:27:45 crc kubenswrapper[4719]: I1009 15:27:45.388071 4719 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-blzpp\" (UniqueName: \"kubernetes.io/projected/a2b1f94e-9754-4aeb-9d99-a5c2258290ca-kube-api-access-blzpp\") pod \"cert-manager-cainjector-7f985d654d-mr6c9\" (UID: \"a2b1f94e-9754-4aeb-9d99-a5c2258290ca\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-mr6c9" Oct 09 15:27:45 crc kubenswrapper[4719]: I1009 15:27:45.499422 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-mr6c9" Oct 09 15:27:45 crc kubenswrapper[4719]: I1009 15:27:45.519874 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-2kc72" Oct 09 15:27:45 crc kubenswrapper[4719]: I1009 15:27:45.526951 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-s42jm" Oct 09 15:27:45 crc kubenswrapper[4719]: I1009 15:27:45.926530 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-mr6c9"] Oct 09 15:27:45 crc kubenswrapper[4719]: I1009 15:27:45.937428 4719 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 15:27:45 crc kubenswrapper[4719]: I1009 15:27:45.985168 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-2kc72"] Oct 09 15:27:45 crc kubenswrapper[4719]: I1009 15:27:45.995966 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-s42jm"] Oct 09 15:27:46 crc kubenswrapper[4719]: I1009 15:27:46.001713 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-mr6c9" event={"ID":"a2b1f94e-9754-4aeb-9d99-a5c2258290ca","Type":"ContainerStarted","Data":"cc4646b0373cca96761491d78c0e21d8f794cf01d23a00c8d7e85a490d14c1d4"} Oct 09 15:27:46 crc kubenswrapper[4719]: I1009 15:27:46.006090 4719 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-2kc72" event={"ID":"419868d8-7886-45fb-be57-2c476ba8d305","Type":"ContainerStarted","Data":"50ee17249912a155fcdcee3dcf7dc2d243ef9735dc7df9cb580a6cea809742c8"} Oct 09 15:27:47 crc kubenswrapper[4719]: I1009 15:27:47.013090 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-s42jm" event={"ID":"79d0f7b5-f165-44ee-8220-f31bcc6df1fd","Type":"ContainerStarted","Data":"1f56793de7f01ee83b81575f414dbe1e10bf44619f1d94683927f6e38ed081de"} Oct 09 15:27:50 crc kubenswrapper[4719]: I1009 15:27:50.030263 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-mr6c9" event={"ID":"a2b1f94e-9754-4aeb-9d99-a5c2258290ca","Type":"ContainerStarted","Data":"019eaf9e60194b23729d666acbb90ea2291b7101be67b1d69633ff92d4672b5a"} Oct 09 15:27:50 crc kubenswrapper[4719]: I1009 15:27:50.032071 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-2kc72" event={"ID":"419868d8-7886-45fb-be57-2c476ba8d305","Type":"ContainerStarted","Data":"9e917ab577bb85725c491e761ada5794168615637e624a99d40fa59ee44a4385"} Oct 09 15:27:50 crc kubenswrapper[4719]: I1009 15:27:50.033471 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-s42jm" event={"ID":"79d0f7b5-f165-44ee-8220-f31bcc6df1fd","Type":"ContainerStarted","Data":"eab613f4d13e9be9b220e6d0839c016f2fd97be7df9f9d9b80d35803f79d89e5"} Oct 09 15:27:50 crc kubenswrapper[4719]: I1009 15:27:50.033595 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-s42jm" Oct 09 15:27:50 crc kubenswrapper[4719]: I1009 15:27:50.043904 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-mr6c9" podStartSLOduration=1.94837147 
podStartE2EDuration="5.043890062s" podCreationTimestamp="2025-10-09 15:27:45 +0000 UTC" firstStartedPulling="2025-10-09 15:27:45.9371537 +0000 UTC m=+571.446864985" lastFinishedPulling="2025-10-09 15:27:49.032672292 +0000 UTC m=+574.542383577" observedRunningTime="2025-10-09 15:27:50.040916247 +0000 UTC m=+575.550627532" watchObservedRunningTime="2025-10-09 15:27:50.043890062 +0000 UTC m=+575.553601337" Oct 09 15:27:50 crc kubenswrapper[4719]: I1009 15:27:50.054119 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-2kc72" podStartSLOduration=2.008878201 podStartE2EDuration="5.054096407s" podCreationTimestamp="2025-10-09 15:27:45 +0000 UTC" firstStartedPulling="2025-10-09 15:27:45.987422315 +0000 UTC m=+571.497133600" lastFinishedPulling="2025-10-09 15:27:49.032640521 +0000 UTC m=+574.542351806" observedRunningTime="2025-10-09 15:27:50.050339488 +0000 UTC m=+575.560050773" watchObservedRunningTime="2025-10-09 15:27:50.054096407 +0000 UTC m=+575.563807712" Oct 09 15:27:50 crc kubenswrapper[4719]: I1009 15:27:50.096850 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-s42jm" podStartSLOduration=2.005263717 podStartE2EDuration="5.096832082s" podCreationTimestamp="2025-10-09 15:27:45 +0000 UTC" firstStartedPulling="2025-10-09 15:27:46.00229053 +0000 UTC m=+571.512001815" lastFinishedPulling="2025-10-09 15:27:49.093858895 +0000 UTC m=+574.603570180" observedRunningTime="2025-10-09 15:27:50.093721782 +0000 UTC m=+575.603433077" watchObservedRunningTime="2025-10-09 15:27:50.096832082 +0000 UTC m=+575.606543367" Oct 09 15:27:55 crc kubenswrapper[4719]: I1009 15:27:55.529391 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-s42jm" Oct 09 15:27:55 crc kubenswrapper[4719]: I1009 15:27:55.811059 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-ovn-kubernetes/ovnkube-node-zv8jk"] Oct 09 15:27:55 crc kubenswrapper[4719]: I1009 15:27:55.811499 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="ovn-controller" containerID="cri-o://e5228008f4bbd33c0b6ea86640368c02b6cdf301b43494a232b37fa73ea72e47" gracePeriod=30 Oct 09 15:27:55 crc kubenswrapper[4719]: I1009 15:27:55.811888 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="sbdb" containerID="cri-o://65b32ef1116f7849b70aa3607bb4fc7b4bff9f58843c24742fc94aed9bb9a68e" gracePeriod=30 Oct 09 15:27:55 crc kubenswrapper[4719]: I1009 15:27:55.811941 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="nbdb" containerID="cri-o://59a6c607affaa28a2c8af16a995f53baf008a1efd42061bb5e3c01b5acac636a" gracePeriod=30 Oct 09 15:27:55 crc kubenswrapper[4719]: I1009 15:27:55.811994 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="northd" containerID="cri-o://d1a911f9dd87ad57268bacc90fd4b3821f54d4ad91fcdde7066d3706aa8feb4b" gracePeriod=30 Oct 09 15:27:55 crc kubenswrapper[4719]: I1009 15:27:55.812033 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="ovn-acl-logging" containerID="cri-o://80fe00a302db3a637794464b7cccf806ad3fa8efbdaea15f965ea41276188d1e" gracePeriod=30 Oct 09 15:27:55 crc kubenswrapper[4719]: I1009 15:27:55.812079 4719 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://f2246a5642d4fa1b9e182af8a19980e6a76aea32cc9669e7d30185d6672435b0" gracePeriod=30 Oct 09 15:27:55 crc kubenswrapper[4719]: I1009 15:27:55.812067 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="kube-rbac-proxy-node" containerID="cri-o://4c0cb44eacc810e970c6b32e259ae1841fb312f20576d34ac183089a91000337" gracePeriod=30 Oct 09 15:27:55 crc kubenswrapper[4719]: I1009 15:27:55.840596 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="ovnkube-controller" containerID="cri-o://f682329c6f1662ef1c3d1654d5d65f347ebb1061a2e011ba9e36bbd51b862d22" gracePeriod=30 Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.063145 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zv8jk_fea6a48c-769c-41bf-95ce-649cc31eb4e5/ovnkube-controller/3.log" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.065291 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zv8jk_fea6a48c-769c-41bf-95ce-649cc31eb4e5/ovn-acl-logging/0.log" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.065848 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zv8jk_fea6a48c-769c-41bf-95ce-649cc31eb4e5/ovn-controller/0.log" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.066218 4719 generic.go:334] "Generic (PLEG): container finished" podID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerID="f682329c6f1662ef1c3d1654d5d65f347ebb1061a2e011ba9e36bbd51b862d22" exitCode=0 Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.066240 
4719 generic.go:334] "Generic (PLEG): container finished" podID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerID="65b32ef1116f7849b70aa3607bb4fc7b4bff9f58843c24742fc94aed9bb9a68e" exitCode=0 Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.066248 4719 generic.go:334] "Generic (PLEG): container finished" podID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerID="59a6c607affaa28a2c8af16a995f53baf008a1efd42061bb5e3c01b5acac636a" exitCode=0 Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.066255 4719 generic.go:334] "Generic (PLEG): container finished" podID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerID="d1a911f9dd87ad57268bacc90fd4b3821f54d4ad91fcdde7066d3706aa8feb4b" exitCode=0 Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.066261 4719 generic.go:334] "Generic (PLEG): container finished" podID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerID="f2246a5642d4fa1b9e182af8a19980e6a76aea32cc9669e7d30185d6672435b0" exitCode=0 Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.066267 4719 generic.go:334] "Generic (PLEG): container finished" podID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerID="4c0cb44eacc810e970c6b32e259ae1841fb312f20576d34ac183089a91000337" exitCode=0 Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.066273 4719 generic.go:334] "Generic (PLEG): container finished" podID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerID="80fe00a302db3a637794464b7cccf806ad3fa8efbdaea15f965ea41276188d1e" exitCode=143 Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.066279 4719 generic.go:334] "Generic (PLEG): container finished" podID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerID="e5228008f4bbd33c0b6ea86640368c02b6cdf301b43494a232b37fa73ea72e47" exitCode=143 Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.066320 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" 
event={"ID":"fea6a48c-769c-41bf-95ce-649cc31eb4e5","Type":"ContainerDied","Data":"f682329c6f1662ef1c3d1654d5d65f347ebb1061a2e011ba9e36bbd51b862d22"} Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.066358 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" event={"ID":"fea6a48c-769c-41bf-95ce-649cc31eb4e5","Type":"ContainerDied","Data":"65b32ef1116f7849b70aa3607bb4fc7b4bff9f58843c24742fc94aed9bb9a68e"} Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.066368 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" event={"ID":"fea6a48c-769c-41bf-95ce-649cc31eb4e5","Type":"ContainerDied","Data":"59a6c607affaa28a2c8af16a995f53baf008a1efd42061bb5e3c01b5acac636a"} Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.066378 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" event={"ID":"fea6a48c-769c-41bf-95ce-649cc31eb4e5","Type":"ContainerDied","Data":"d1a911f9dd87ad57268bacc90fd4b3821f54d4ad91fcdde7066d3706aa8feb4b"} Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.066386 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" event={"ID":"fea6a48c-769c-41bf-95ce-649cc31eb4e5","Type":"ContainerDied","Data":"f2246a5642d4fa1b9e182af8a19980e6a76aea32cc9669e7d30185d6672435b0"} Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.066395 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" event={"ID":"fea6a48c-769c-41bf-95ce-649cc31eb4e5","Type":"ContainerDied","Data":"4c0cb44eacc810e970c6b32e259ae1841fb312f20576d34ac183089a91000337"} Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.066403 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" 
event={"ID":"fea6a48c-769c-41bf-95ce-649cc31eb4e5","Type":"ContainerDied","Data":"80fe00a302db3a637794464b7cccf806ad3fa8efbdaea15f965ea41276188d1e"} Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.066412 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" event={"ID":"fea6a48c-769c-41bf-95ce-649cc31eb4e5","Type":"ContainerDied","Data":"e5228008f4bbd33c0b6ea86640368c02b6cdf301b43494a232b37fa73ea72e47"} Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.066420 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" event={"ID":"fea6a48c-769c-41bf-95ce-649cc31eb4e5","Type":"ContainerDied","Data":"27dad12d3d4a004efdc84622336987da28adafc12291f6e1ae7aadd3b5a54473"} Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.066428 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27dad12d3d4a004efdc84622336987da28adafc12291f6e1ae7aadd3b5a54473" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.066442 4719 scope.go:117] "RemoveContainer" containerID="4859b0f970ed0dea88b96ebd820f8f3806673c1ffff2ad8398b0934dec9535a8" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.068443 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kmbvp_6a7f4c67-0335-4c58-896a-b3059d9a9a3f/kube-multus/2.log" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.068873 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kmbvp_6a7f4c67-0335-4c58-896a-b3059d9a9a3f/kube-multus/1.log" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.068906 4719 generic.go:334] "Generic (PLEG): container finished" podID="6a7f4c67-0335-4c58-896a-b3059d9a9a3f" containerID="64908969d19b71a3974eeabf4e47002eb2af4a3eeee316c375b203ecfe43212f" exitCode=2 Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.068922 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-kmbvp" event={"ID":"6a7f4c67-0335-4c58-896a-b3059d9a9a3f","Type":"ContainerDied","Data":"64908969d19b71a3974eeabf4e47002eb2af4a3eeee316c375b203ecfe43212f"} Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.069298 4719 scope.go:117] "RemoveContainer" containerID="64908969d19b71a3974eeabf4e47002eb2af4a3eeee316c375b203ecfe43212f" Oct 09 15:27:56 crc kubenswrapper[4719]: E1009 15:27:56.069690 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-kmbvp_openshift-multus(6a7f4c67-0335-4c58-896a-b3059d9a9a3f)\"" pod="openshift-multus/multus-kmbvp" podUID="6a7f4c67-0335-4c58-896a-b3059d9a9a3f" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.141834 4719 scope.go:117] "RemoveContainer" containerID="201751e1a01c1fefb61309835c66a89743c507dff1e0d6e75a5ecf3447831840" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.147407 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zv8jk_fea6a48c-769c-41bf-95ce-649cc31eb4e5/ovn-acl-logging/0.log" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.147875 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zv8jk_fea6a48c-769c-41bf-95ce-649cc31eb4e5/ovn-controller/0.log" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.148269 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.189919 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-var-lib-openvswitch\") pod \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.190220 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-run-ovn\") pod \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.190478 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-node-log\") pod \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.190075 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "fea6a48c-769c-41bf-95ce-649cc31eb4e5" (UID: "fea6a48c-769c-41bf-95ce-649cc31eb4e5"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.190339 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "fea6a48c-769c-41bf-95ce-649cc31eb4e5" (UID: "fea6a48c-769c-41bf-95ce-649cc31eb4e5"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.190571 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-node-log" (OuterVolumeSpecName: "node-log") pod "fea6a48c-769c-41bf-95ce-649cc31eb4e5" (UID: "fea6a48c-769c-41bf-95ce-649cc31eb4e5"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.190599 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-run-systemd\") pod \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.190907 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-run-ovn-kubernetes\") pod \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.191017 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-run-openvswitch\") pod \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.190985 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "fea6a48c-769c-41bf-95ce-649cc31eb4e5" (UID: "fea6a48c-769c-41bf-95ce-649cc31eb4e5"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.191115 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-cni-bin\") pod \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.191373 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84cf8\" (UniqueName: \"kubernetes.io/projected/fea6a48c-769c-41bf-95ce-649cc31eb4e5-kube-api-access-84cf8\") pod \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.191222 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "fea6a48c-769c-41bf-95ce-649cc31eb4e5" (UID: "fea6a48c-769c-41bf-95ce-649cc31eb4e5"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.191280 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "fea6a48c-769c-41bf-95ce-649cc31eb4e5" (UID: "fea6a48c-769c-41bf-95ce-649cc31eb4e5"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.194109 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-cni-netd\") pod \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.194151 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-systemd-units\") pod \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.194175 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "fea6a48c-769c-41bf-95ce-649cc31eb4e5" (UID: "fea6a48c-769c-41bf-95ce-649cc31eb4e5"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.194195 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-log-socket\") pod \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.194221 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-kubelet\") pod \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.194259 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fea6a48c-769c-41bf-95ce-649cc31eb4e5-ovn-node-metrics-cert\") pod \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.194257 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "fea6a48c-769c-41bf-95ce-649cc31eb4e5" (UID: "fea6a48c-769c-41bf-95ce-649cc31eb4e5"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.194294 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fea6a48c-769c-41bf-95ce-649cc31eb4e5-ovnkube-config\") pod \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.194264 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-log-socket" (OuterVolumeSpecName: "log-socket") pod "fea6a48c-769c-41bf-95ce-649cc31eb4e5" (UID: "fea6a48c-769c-41bf-95ce-649cc31eb4e5"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.194319 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "fea6a48c-769c-41bf-95ce-649cc31eb4e5" (UID: "fea6a48c-769c-41bf-95ce-649cc31eb4e5"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.194399 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-slash\") pod \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.194432 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-run-netns\") pod \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.194490 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fea6a48c-769c-41bf-95ce-649cc31eb4e5-ovnkube-script-lib\") pod \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.194527 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-etc-openvswitch\") pod \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.194562 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-slash" (OuterVolumeSpecName: "host-slash") pod "fea6a48c-769c-41bf-95ce-649cc31eb4e5" (UID: "fea6a48c-769c-41bf-95ce-649cc31eb4e5"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.194596 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.194635 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "fea6a48c-769c-41bf-95ce-649cc31eb4e5" (UID: "fea6a48c-769c-41bf-95ce-649cc31eb4e5"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.194750 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "fea6a48c-769c-41bf-95ce-649cc31eb4e5" (UID: "fea6a48c-769c-41bf-95ce-649cc31eb4e5"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.194767 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fea6a48c-769c-41bf-95ce-649cc31eb4e5-env-overrides\") pod \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\" (UID: \"fea6a48c-769c-41bf-95ce-649cc31eb4e5\") " Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.194797 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "fea6a48c-769c-41bf-95ce-649cc31eb4e5" (UID: "fea6a48c-769c-41bf-95ce-649cc31eb4e5"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.195068 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fea6a48c-769c-41bf-95ce-649cc31eb4e5-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "fea6a48c-769c-41bf-95ce-649cc31eb4e5" (UID: "fea6a48c-769c-41bf-95ce-649cc31eb4e5"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.195123 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fea6a48c-769c-41bf-95ce-649cc31eb4e5-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "fea6a48c-769c-41bf-95ce-649cc31eb4e5" (UID: "fea6a48c-769c-41bf-95ce-649cc31eb4e5"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.195674 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fea6a48c-769c-41bf-95ce-649cc31eb4e5-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "fea6a48c-769c-41bf-95ce-649cc31eb4e5" (UID: "fea6a48c-769c-41bf-95ce-649cc31eb4e5"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.198819 4719 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.198860 4719 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.198878 4719 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fea6a48c-769c-41bf-95ce-649cc31eb4e5-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.198894 4719 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.198909 4719 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.198925 4719 reconciler_common.go:293] "Volume detached 
for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-node-log\") on node \"crc\" DevicePath \"\"" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.198939 4719 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.198958 4719 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.198970 4719 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.198982 4719 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.198993 4719 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.199004 4719 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-log-socket\") on node \"crc\" DevicePath \"\"" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.199016 4719 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-kubelet\") on node 
\"crc\" DevicePath \"\"" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.199030 4719 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fea6a48c-769c-41bf-95ce-649cc31eb4e5-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.199041 4719 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-slash\") on node \"crc\" DevicePath \"\"" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.199052 4719 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.199064 4719 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fea6a48c-769c-41bf-95ce-649cc31eb4e5-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.200053 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fea6a48c-769c-41bf-95ce-649cc31eb4e5-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "fea6a48c-769c-41bf-95ce-649cc31eb4e5" (UID: "fea6a48c-769c-41bf-95ce-649cc31eb4e5"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.205612 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nq24q"] Oct 09 15:27:56 crc kubenswrapper[4719]: E1009 15:27:56.205890 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="kube-rbac-proxy-ovn-metrics" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.205912 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="kube-rbac-proxy-ovn-metrics" Oct 09 15:27:56 crc kubenswrapper[4719]: E1009 15:27:56.205922 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="nbdb" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.205930 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="nbdb" Oct 09 15:27:56 crc kubenswrapper[4719]: E1009 15:27:56.205937 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="ovnkube-controller" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.205947 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="ovnkube-controller" Oct 09 15:27:56 crc kubenswrapper[4719]: E1009 15:27:56.205956 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="sbdb" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.205963 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="sbdb" Oct 09 15:27:56 crc kubenswrapper[4719]: E1009 15:27:56.205974 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="ovn-controller" 
Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.205982 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="ovn-controller" Oct 09 15:27:56 crc kubenswrapper[4719]: E1009 15:27:56.205993 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="kubecfg-setup" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.205999 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="kubecfg-setup" Oct 09 15:27:56 crc kubenswrapper[4719]: E1009 15:27:56.206010 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="ovnkube-controller" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.206017 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="ovnkube-controller" Oct 09 15:27:56 crc kubenswrapper[4719]: E1009 15:27:56.206026 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="ovnkube-controller" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.206033 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="ovnkube-controller" Oct 09 15:27:56 crc kubenswrapper[4719]: E1009 15:27:56.206043 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="ovn-acl-logging" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.206049 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="ovn-acl-logging" Oct 09 15:27:56 crc kubenswrapper[4719]: E1009 15:27:56.206058 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="ovnkube-controller" Oct 
09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.206064 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="ovnkube-controller" Oct 09 15:27:56 crc kubenswrapper[4719]: E1009 15:27:56.206078 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="kube-rbac-proxy-node" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.206084 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="kube-rbac-proxy-node" Oct 09 15:27:56 crc kubenswrapper[4719]: E1009 15:27:56.206094 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="northd" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.206100 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="northd" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.206236 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="ovn-acl-logging" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.206248 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="ovnkube-controller" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.206257 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="ovn-controller" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.206265 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="ovnkube-controller" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.206272 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="northd" Oct 09 
15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.206281 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="ovnkube-controller" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.206291 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="kube-rbac-proxy-node" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.206298 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="kube-rbac-proxy-ovn-metrics" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.206307 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="ovnkube-controller" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.206315 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="nbdb" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.206338 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="ovnkube-controller" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.206364 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="sbdb" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.206932 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fea6a48c-769c-41bf-95ce-649cc31eb4e5-kube-api-access-84cf8" (OuterVolumeSpecName: "kube-api-access-84cf8") pod "fea6a48c-769c-41bf-95ce-649cc31eb4e5" (UID: "fea6a48c-769c-41bf-95ce-649cc31eb4e5"). InnerVolumeSpecName "kube-api-access-84cf8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:27:56 crc kubenswrapper[4719]: E1009 15:27:56.207304 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="ovnkube-controller" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.207323 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" containerName="ovnkube-controller" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.210994 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "fea6a48c-769c-41bf-95ce-649cc31eb4e5" (UID: "fea6a48c-769c-41bf-95ce-649cc31eb4e5"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.211974 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.299683 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-host-cni-bin\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.299945 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-etc-openvswitch\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.300041 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-ovn-node-metrics-cert\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.300142 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-node-log\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.300245 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-host-run-ovn-kubernetes\") pod 
\"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.300327 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-systemd-units\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.300439 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-run-ovn\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.300532 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-log-socket\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.300618 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-host-cni-netd\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.300710 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-ovnkube-config\") pod 
\"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.300803 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-host-slash\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.300872 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-ovnkube-script-lib\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.300954 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-var-lib-openvswitch\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.301033 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-host-kubelet\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.301105 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-487xn\" (UniqueName: 
\"kubernetes.io/projected/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-kube-api-access-487xn\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.301173 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-run-systemd\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.301242 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.301310 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-env-overrides\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.301408 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-host-run-netns\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.301633 4719 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-run-openvswitch\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.301726 4719 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fea6a48c-769c-41bf-95ce-649cc31eb4e5-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.301787 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84cf8\" (UniqueName: \"kubernetes.io/projected/fea6a48c-769c-41bf-95ce-649cc31eb4e5-kube-api-access-84cf8\") on node \"crc\" DevicePath \"\"" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.301839 4719 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fea6a48c-769c-41bf-95ce-649cc31eb4e5-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.404468 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-log-socket\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.404544 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-host-cni-netd\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.404580 4719 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-ovnkube-config\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.404605 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-host-slash\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.404623 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-ovnkube-script-lib\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.404652 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-var-lib-openvswitch\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.404665 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-log-socket\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.404735 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-host-kubelet\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.404691 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-host-cni-netd\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.404689 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-host-kubelet\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.404794 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-487xn\" (UniqueName: \"kubernetes.io/projected/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-kube-api-access-487xn\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.404815 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-run-systemd\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.404832 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.404851 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-env-overrides\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.404879 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-host-run-netns\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.404896 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-run-openvswitch\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.404918 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-host-cni-bin\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.404934 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-etc-openvswitch\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.404948 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-ovn-node-metrics-cert\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.404965 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-node-log\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.404985 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-host-run-ovn-kubernetes\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.405004 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-systemd-units\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.405022 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-run-ovn\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.405072 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-run-ovn\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.405115 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-host-run-netns\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.405168 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-run-systemd\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.405196 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.405467 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-var-lib-openvswitch\") pod 
\"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.404771 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-host-slash\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.405630 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-ovnkube-config\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.405686 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-host-cni-bin\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.405761 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-run-openvswitch\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.405797 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-host-run-ovn-kubernetes\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.405829 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-node-log\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.405842 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-env-overrides\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.405859 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-etc-openvswitch\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.405887 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-systemd-units\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.405927 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-ovnkube-script-lib\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.412160 4719 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-ovn-node-metrics-cert\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.427570 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-487xn\" (UniqueName: \"kubernetes.io/projected/222f07c6-bf03-47a8-91ff-f4d9b50e7aef-kube-api-access-487xn\") pod \"ovnkube-node-nq24q\" (UID: \"222f07c6-bf03-47a8-91ff-f4d9b50e7aef\") " pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:56 crc kubenswrapper[4719]: I1009 15:27:56.532038 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:27:57 crc kubenswrapper[4719]: I1009 15:27:57.077639 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zv8jk_fea6a48c-769c-41bf-95ce-649cc31eb4e5/ovn-acl-logging/0.log" Oct 09 15:27:57 crc kubenswrapper[4719]: I1009 15:27:57.078279 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zv8jk_fea6a48c-769c-41bf-95ce-649cc31eb4e5/ovn-controller/0.log" Oct 09 15:27:57 crc kubenswrapper[4719]: I1009 15:27:57.078813 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zv8jk" Oct 09 15:27:57 crc kubenswrapper[4719]: I1009 15:27:57.081873 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kmbvp_6a7f4c67-0335-4c58-896a-b3059d9a9a3f/kube-multus/2.log" Oct 09 15:27:57 crc kubenswrapper[4719]: I1009 15:27:57.083571 4719 generic.go:334] "Generic (PLEG): container finished" podID="222f07c6-bf03-47a8-91ff-f4d9b50e7aef" containerID="a4344cb8e5f7014ba8a55620166d9c8b67bea20dccfdd3be1c556909346b2444" exitCode=0 Oct 09 15:27:57 crc kubenswrapper[4719]: I1009 15:27:57.083610 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" event={"ID":"222f07c6-bf03-47a8-91ff-f4d9b50e7aef","Type":"ContainerDied","Data":"a4344cb8e5f7014ba8a55620166d9c8b67bea20dccfdd3be1c556909346b2444"} Oct 09 15:27:57 crc kubenswrapper[4719]: I1009 15:27:57.083640 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" event={"ID":"222f07c6-bf03-47a8-91ff-f4d9b50e7aef","Type":"ContainerStarted","Data":"a38096aa760c32a28b32ac941edd9d805a2f7f39a8464c167ab2fbbce5c763f1"} Oct 09 15:27:57 crc kubenswrapper[4719]: I1009 15:27:57.140592 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zv8jk"] Oct 09 15:27:57 crc kubenswrapper[4719]: I1009 15:27:57.144937 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zv8jk"] Oct 09 15:27:57 crc kubenswrapper[4719]: I1009 15:27:57.172626 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fea6a48c-769c-41bf-95ce-649cc31eb4e5" path="/var/lib/kubelet/pods/fea6a48c-769c-41bf-95ce-649cc31eb4e5/volumes" Oct 09 15:27:58 crc kubenswrapper[4719]: I1009 15:27:58.100068 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" 
event={"ID":"222f07c6-bf03-47a8-91ff-f4d9b50e7aef","Type":"ContainerStarted","Data":"faa9a40b1022350a8298ba636887fa74427626568decf1d2f6a30bef1eab09cb"} Oct 09 15:27:58 crc kubenswrapper[4719]: I1009 15:27:58.100106 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" event={"ID":"222f07c6-bf03-47a8-91ff-f4d9b50e7aef","Type":"ContainerStarted","Data":"75a8bba78d42554f516c3f5fe2aef7f4d6a7c769ee4252416f662c31a3365b37"} Oct 09 15:27:58 crc kubenswrapper[4719]: I1009 15:27:58.100116 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" event={"ID":"222f07c6-bf03-47a8-91ff-f4d9b50e7aef","Type":"ContainerStarted","Data":"dc7aba1d38d22d70e81d791db809fe5f44b9d3bd979483c6a10b4ce473c1d343"} Oct 09 15:27:58 crc kubenswrapper[4719]: I1009 15:27:58.100125 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" event={"ID":"222f07c6-bf03-47a8-91ff-f4d9b50e7aef","Type":"ContainerStarted","Data":"08b3833c2de8c77516cadd86a889e35aae99e3b4e0f645110903dba92eb98982"} Oct 09 15:27:58 crc kubenswrapper[4719]: I1009 15:27:58.100132 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" event={"ID":"222f07c6-bf03-47a8-91ff-f4d9b50e7aef","Type":"ContainerStarted","Data":"9e2a4e0e0b4999ed3564c771899e85cbe80be447400d4988dbc21da28598b99e"} Oct 09 15:27:58 crc kubenswrapper[4719]: I1009 15:27:58.100139 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" event={"ID":"222f07c6-bf03-47a8-91ff-f4d9b50e7aef","Type":"ContainerStarted","Data":"01f02044e8a0332cd2b6911ec386c275372a21ace5e2f85a1712dd1879fe15ec"} Oct 09 15:28:00 crc kubenswrapper[4719]: I1009 15:28:00.120041 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" 
event={"ID":"222f07c6-bf03-47a8-91ff-f4d9b50e7aef","Type":"ContainerStarted","Data":"4a85c46e90ad97e2cb8a1e0dc0763bef76809f25514d913a496893b5a4f4afb8"} Oct 09 15:28:02 crc kubenswrapper[4719]: I1009 15:28:02.132838 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" event={"ID":"222f07c6-bf03-47a8-91ff-f4d9b50e7aef","Type":"ContainerStarted","Data":"3c1ea7a01ea72f3851390cf081320d0c09745581f70a0482bd7dae077c09db49"} Oct 09 15:28:02 crc kubenswrapper[4719]: I1009 15:28:02.133504 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:28:02 crc kubenswrapper[4719]: I1009 15:28:02.134057 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:28:02 crc kubenswrapper[4719]: I1009 15:28:02.134329 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:28:02 crc kubenswrapper[4719]: I1009 15:28:02.165075 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:28:02 crc kubenswrapper[4719]: I1009 15:28:02.168250 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" podStartSLOduration=6.168239154 podStartE2EDuration="6.168239154s" podCreationTimestamp="2025-10-09 15:27:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:28:02.16560459 +0000 UTC m=+587.675315885" watchObservedRunningTime="2025-10-09 15:28:02.168239154 +0000 UTC m=+587.677950439" Oct 09 15:28:02 crc kubenswrapper[4719]: I1009 15:28:02.194016 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:28:06 crc 
kubenswrapper[4719]: I1009 15:28:06.976624 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 15:28:06 crc kubenswrapper[4719]: I1009 15:28:06.977113 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 15:28:08 crc kubenswrapper[4719]: I1009 15:28:08.160419 4719 scope.go:117] "RemoveContainer" containerID="64908969d19b71a3974eeabf4e47002eb2af4a3eeee316c375b203ecfe43212f" Oct 09 15:28:08 crc kubenswrapper[4719]: E1009 15:28:08.160631 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-kmbvp_openshift-multus(6a7f4c67-0335-4c58-896a-b3059d9a9a3f)\"" pod="openshift-multus/multus-kmbvp" podUID="6a7f4c67-0335-4c58-896a-b3059d9a9a3f" Oct 09 15:28:15 crc kubenswrapper[4719]: I1009 15:28:15.294871 4719 scope.go:117] "RemoveContainer" containerID="59a6c607affaa28a2c8af16a995f53baf008a1efd42061bb5e3c01b5acac636a" Oct 09 15:28:15 crc kubenswrapper[4719]: I1009 15:28:15.324419 4719 scope.go:117] "RemoveContainer" containerID="f682329c6f1662ef1c3d1654d5d65f347ebb1061a2e011ba9e36bbd51b862d22" Oct 09 15:28:15 crc kubenswrapper[4719]: I1009 15:28:15.351142 4719 scope.go:117] "RemoveContainer" containerID="7b7a2de376156624c9692e083f0ca2e7cca8782bac2e18f9a0de3a67a084094f" Oct 09 15:28:15 crc kubenswrapper[4719]: I1009 15:28:15.365306 4719 scope.go:117] "RemoveContainer" 
containerID="f2246a5642d4fa1b9e182af8a19980e6a76aea32cc9669e7d30185d6672435b0" Oct 09 15:28:15 crc kubenswrapper[4719]: I1009 15:28:15.388319 4719 scope.go:117] "RemoveContainer" containerID="e5228008f4bbd33c0b6ea86640368c02b6cdf301b43494a232b37fa73ea72e47" Oct 09 15:28:15 crc kubenswrapper[4719]: I1009 15:28:15.402279 4719 scope.go:117] "RemoveContainer" containerID="65b32ef1116f7849b70aa3607bb4fc7b4bff9f58843c24742fc94aed9bb9a68e" Oct 09 15:28:15 crc kubenswrapper[4719]: I1009 15:28:15.413191 4719 scope.go:117] "RemoveContainer" containerID="4c0cb44eacc810e970c6b32e259ae1841fb312f20576d34ac183089a91000337" Oct 09 15:28:15 crc kubenswrapper[4719]: I1009 15:28:15.424898 4719 scope.go:117] "RemoveContainer" containerID="d1a911f9dd87ad57268bacc90fd4b3821f54d4ad91fcdde7066d3706aa8feb4b" Oct 09 15:28:15 crc kubenswrapper[4719]: I1009 15:28:15.439911 4719 scope.go:117] "RemoveContainer" containerID="80fe00a302db3a637794464b7cccf806ad3fa8efbdaea15f965ea41276188d1e" Oct 09 15:28:20 crc kubenswrapper[4719]: I1009 15:28:20.161248 4719 scope.go:117] "RemoveContainer" containerID="64908969d19b71a3974eeabf4e47002eb2af4a3eeee316c375b203ecfe43212f" Oct 09 15:28:21 crc kubenswrapper[4719]: I1009 15:28:21.232417 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kmbvp_6a7f4c67-0335-4c58-896a-b3059d9a9a3f/kube-multus/2.log" Oct 09 15:28:21 crc kubenswrapper[4719]: I1009 15:28:21.232768 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kmbvp" event={"ID":"6a7f4c67-0335-4c58-896a-b3059d9a9a3f","Type":"ContainerStarted","Data":"c400a4f6b1e888c0e34abc52ab43558a336072a4b4fd25379adee65a72c8e219"} Oct 09 15:28:24 crc kubenswrapper[4719]: I1009 15:28:24.572642 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4"] Oct 09 15:28:24 crc kubenswrapper[4719]: I1009 15:28:24.574702 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4" Oct 09 15:28:24 crc kubenswrapper[4719]: I1009 15:28:24.578090 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 09 15:28:24 crc kubenswrapper[4719]: I1009 15:28:24.580328 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4"] Oct 09 15:28:24 crc kubenswrapper[4719]: I1009 15:28:24.641611 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c118d4a-6a5b-4138-90ff-2270ea2dabd9-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4\" (UID: \"2c118d4a-6a5b-4138-90ff-2270ea2dabd9\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4" Oct 09 15:28:24 crc kubenswrapper[4719]: I1009 15:28:24.641698 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c118d4a-6a5b-4138-90ff-2270ea2dabd9-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4\" (UID: \"2c118d4a-6a5b-4138-90ff-2270ea2dabd9\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4" Oct 09 15:28:24 crc kubenswrapper[4719]: I1009 15:28:24.642004 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvfgx\" (UniqueName: \"kubernetes.io/projected/2c118d4a-6a5b-4138-90ff-2270ea2dabd9-kube-api-access-lvfgx\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4\" (UID: \"2c118d4a-6a5b-4138-90ff-2270ea2dabd9\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4" Oct 09 15:28:24 crc kubenswrapper[4719]: 
I1009 15:28:24.743565 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c118d4a-6a5b-4138-90ff-2270ea2dabd9-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4\" (UID: \"2c118d4a-6a5b-4138-90ff-2270ea2dabd9\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4" Oct 09 15:28:24 crc kubenswrapper[4719]: I1009 15:28:24.743634 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c118d4a-6a5b-4138-90ff-2270ea2dabd9-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4\" (UID: \"2c118d4a-6a5b-4138-90ff-2270ea2dabd9\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4" Oct 09 15:28:24 crc kubenswrapper[4719]: I1009 15:28:24.743682 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvfgx\" (UniqueName: \"kubernetes.io/projected/2c118d4a-6a5b-4138-90ff-2270ea2dabd9-kube-api-access-lvfgx\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4\" (UID: \"2c118d4a-6a5b-4138-90ff-2270ea2dabd9\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4" Oct 09 15:28:24 crc kubenswrapper[4719]: I1009 15:28:24.744106 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c118d4a-6a5b-4138-90ff-2270ea2dabd9-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4\" (UID: \"2c118d4a-6a5b-4138-90ff-2270ea2dabd9\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4" Oct 09 15:28:24 crc kubenswrapper[4719]: I1009 15:28:24.744216 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/2c118d4a-6a5b-4138-90ff-2270ea2dabd9-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4\" (UID: \"2c118d4a-6a5b-4138-90ff-2270ea2dabd9\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4" Oct 09 15:28:24 crc kubenswrapper[4719]: I1009 15:28:24.764206 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvfgx\" (UniqueName: \"kubernetes.io/projected/2c118d4a-6a5b-4138-90ff-2270ea2dabd9-kube-api-access-lvfgx\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4\" (UID: \"2c118d4a-6a5b-4138-90ff-2270ea2dabd9\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4" Oct 09 15:28:24 crc kubenswrapper[4719]: I1009 15:28:24.898153 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4" Oct 09 15:28:25 crc kubenswrapper[4719]: I1009 15:28:25.102064 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4"] Oct 09 15:28:25 crc kubenswrapper[4719]: I1009 15:28:25.255974 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4" event={"ID":"2c118d4a-6a5b-4138-90ff-2270ea2dabd9","Type":"ContainerStarted","Data":"ddc7856a1bcef4527820a49c0a58bdbacebc22f712361527f46e0a448fa53810"} Oct 09 15:28:25 crc kubenswrapper[4719]: I1009 15:28:25.256029 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4" event={"ID":"2c118d4a-6a5b-4138-90ff-2270ea2dabd9","Type":"ContainerStarted","Data":"01c8c4ce81a9713debaecec51c559e626b4a731ac4b7bd60bec04d5d93f5ddc4"} Oct 09 15:28:26 crc kubenswrapper[4719]: I1009 15:28:26.559150 4719 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nq24q" Oct 09 15:28:27 crc kubenswrapper[4719]: I1009 15:28:27.266237 4719 generic.go:334] "Generic (PLEG): container finished" podID="2c118d4a-6a5b-4138-90ff-2270ea2dabd9" containerID="ddc7856a1bcef4527820a49c0a58bdbacebc22f712361527f46e0a448fa53810" exitCode=0 Oct 09 15:28:27 crc kubenswrapper[4719]: I1009 15:28:27.266313 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4" event={"ID":"2c118d4a-6a5b-4138-90ff-2270ea2dabd9","Type":"ContainerDied","Data":"ddc7856a1bcef4527820a49c0a58bdbacebc22f712361527f46e0a448fa53810"} Oct 09 15:28:29 crc kubenswrapper[4719]: I1009 15:28:29.288166 4719 generic.go:334] "Generic (PLEG): container finished" podID="2c118d4a-6a5b-4138-90ff-2270ea2dabd9" containerID="b190f41f13ae0443ab5aabfa90c4cd8cb9ded4b4f570703bdbcb6334632d1a5f" exitCode=0 Oct 09 15:28:29 crc kubenswrapper[4719]: I1009 15:28:29.288211 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4" event={"ID":"2c118d4a-6a5b-4138-90ff-2270ea2dabd9","Type":"ContainerDied","Data":"b190f41f13ae0443ab5aabfa90c4cd8cb9ded4b4f570703bdbcb6334632d1a5f"} Oct 09 15:28:30 crc kubenswrapper[4719]: I1009 15:28:30.295283 4719 generic.go:334] "Generic (PLEG): container finished" podID="2c118d4a-6a5b-4138-90ff-2270ea2dabd9" containerID="1512ab39e6d2b6fae31d4ea6f615679fd8f42829ac2c3536f809afb753422341" exitCode=0 Oct 09 15:28:30 crc kubenswrapper[4719]: I1009 15:28:30.295330 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4" event={"ID":"2c118d4a-6a5b-4138-90ff-2270ea2dabd9","Type":"ContainerDied","Data":"1512ab39e6d2b6fae31d4ea6f615679fd8f42829ac2c3536f809afb753422341"} Oct 09 15:28:31 crc 
kubenswrapper[4719]: I1009 15:28:31.499284 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4" Oct 09 15:28:31 crc kubenswrapper[4719]: I1009 15:28:31.625757 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c118d4a-6a5b-4138-90ff-2270ea2dabd9-bundle\") pod \"2c118d4a-6a5b-4138-90ff-2270ea2dabd9\" (UID: \"2c118d4a-6a5b-4138-90ff-2270ea2dabd9\") " Oct 09 15:28:31 crc kubenswrapper[4719]: I1009 15:28:31.627520 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvfgx\" (UniqueName: \"kubernetes.io/projected/2c118d4a-6a5b-4138-90ff-2270ea2dabd9-kube-api-access-lvfgx\") pod \"2c118d4a-6a5b-4138-90ff-2270ea2dabd9\" (UID: \"2c118d4a-6a5b-4138-90ff-2270ea2dabd9\") " Oct 09 15:28:31 crc kubenswrapper[4719]: I1009 15:28:31.627568 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c118d4a-6a5b-4138-90ff-2270ea2dabd9-util\") pod \"2c118d4a-6a5b-4138-90ff-2270ea2dabd9\" (UID: \"2c118d4a-6a5b-4138-90ff-2270ea2dabd9\") " Oct 09 15:28:31 crc kubenswrapper[4719]: I1009 15:28:31.628989 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c118d4a-6a5b-4138-90ff-2270ea2dabd9-bundle" (OuterVolumeSpecName: "bundle") pod "2c118d4a-6a5b-4138-90ff-2270ea2dabd9" (UID: "2c118d4a-6a5b-4138-90ff-2270ea2dabd9"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:28:31 crc kubenswrapper[4719]: I1009 15:28:31.633934 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c118d4a-6a5b-4138-90ff-2270ea2dabd9-kube-api-access-lvfgx" (OuterVolumeSpecName: "kube-api-access-lvfgx") pod "2c118d4a-6a5b-4138-90ff-2270ea2dabd9" (UID: "2c118d4a-6a5b-4138-90ff-2270ea2dabd9"). InnerVolumeSpecName "kube-api-access-lvfgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:28:31 crc kubenswrapper[4719]: I1009 15:28:31.641510 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c118d4a-6a5b-4138-90ff-2270ea2dabd9-util" (OuterVolumeSpecName: "util") pod "2c118d4a-6a5b-4138-90ff-2270ea2dabd9" (UID: "2c118d4a-6a5b-4138-90ff-2270ea2dabd9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:28:31 crc kubenswrapper[4719]: I1009 15:28:31.729626 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvfgx\" (UniqueName: \"kubernetes.io/projected/2c118d4a-6a5b-4138-90ff-2270ea2dabd9-kube-api-access-lvfgx\") on node \"crc\" DevicePath \"\"" Oct 09 15:28:31 crc kubenswrapper[4719]: I1009 15:28:31.729665 4719 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c118d4a-6a5b-4138-90ff-2270ea2dabd9-util\") on node \"crc\" DevicePath \"\"" Oct 09 15:28:31 crc kubenswrapper[4719]: I1009 15:28:31.729700 4719 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c118d4a-6a5b-4138-90ff-2270ea2dabd9-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:28:32 crc kubenswrapper[4719]: I1009 15:28:32.307880 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4" 
event={"ID":"2c118d4a-6a5b-4138-90ff-2270ea2dabd9","Type":"ContainerDied","Data":"01c8c4ce81a9713debaecec51c559e626b4a731ac4b7bd60bec04d5d93f5ddc4"} Oct 09 15:28:32 crc kubenswrapper[4719]: I1009 15:28:32.307928 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01c8c4ce81a9713debaecec51c559e626b4a731ac4b7bd60bec04d5d93f5ddc4" Oct 09 15:28:32 crc kubenswrapper[4719]: I1009 15:28:32.307998 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4" Oct 09 15:28:36 crc kubenswrapper[4719]: I1009 15:28:36.976934 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 15:28:36 crc kubenswrapper[4719]: I1009 15:28:36.977519 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 15:28:36 crc kubenswrapper[4719]: I1009 15:28:36.977575 4719 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" Oct 09 15:28:36 crc kubenswrapper[4719]: I1009 15:28:36.978236 4719 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8f170c1640fe33d3a488ada64feda2ff2ffd3c8d5fb1e790430375c1ffcc2527"} pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 
15:28:36 crc kubenswrapper[4719]: I1009 15:28:36.978300 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" containerID="cri-o://8f170c1640fe33d3a488ada64feda2ff2ffd3c8d5fb1e790430375c1ffcc2527" gracePeriod=600 Oct 09 15:28:37 crc kubenswrapper[4719]: I1009 15:28:37.341399 4719 generic.go:334] "Generic (PLEG): container finished" podID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerID="8f170c1640fe33d3a488ada64feda2ff2ffd3c8d5fb1e790430375c1ffcc2527" exitCode=0 Oct 09 15:28:37 crc kubenswrapper[4719]: I1009 15:28:37.341477 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerDied","Data":"8f170c1640fe33d3a488ada64feda2ff2ffd3c8d5fb1e790430375c1ffcc2527"} Oct 09 15:28:37 crc kubenswrapper[4719]: I1009 15:28:37.341737 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerStarted","Data":"68d8ab72b367a09fd501bf52a95e52e96b2dd8454309c2056f29b2264d60dcdd"} Oct 09 15:28:37 crc kubenswrapper[4719]: I1009 15:28:37.341777 4719 scope.go:117] "RemoveContainer" containerID="58630cc589d6ba8e40a40e1e3c93cc21531a1b6e5470575e2e8a4d654789d22a" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.339717 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-z6j76"] Oct 09 15:28:41 crc kubenswrapper[4719]: E1009 15:28:41.340416 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c118d4a-6a5b-4138-90ff-2270ea2dabd9" containerName="util" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.340428 4719 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2c118d4a-6a5b-4138-90ff-2270ea2dabd9" containerName="util" Oct 09 15:28:41 crc kubenswrapper[4719]: E1009 15:28:41.340438 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c118d4a-6a5b-4138-90ff-2270ea2dabd9" containerName="extract" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.340443 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c118d4a-6a5b-4138-90ff-2270ea2dabd9" containerName="extract" Oct 09 15:28:41 crc kubenswrapper[4719]: E1009 15:28:41.340459 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c118d4a-6a5b-4138-90ff-2270ea2dabd9" containerName="pull" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.340466 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c118d4a-6a5b-4138-90ff-2270ea2dabd9" containerName="pull" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.340567 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c118d4a-6a5b-4138-90ff-2270ea2dabd9" containerName="extract" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.340932 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-z6j76" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.342560 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.342910 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.343969 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-zkgmj" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.351697 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-z6j76"] Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.385678 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5bfc464c98-2f97q"] Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.387041 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bfc464c98-2f97q" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.402998 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-6mr8v" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.407343 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.413782 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5bfc464c98-2f97q"] Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.416392 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5bfc464c98-6px5r"] Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.417064 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bfc464c98-6px5r" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.448385 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1ae73acd-0d93-4281-b807-4798a207506b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5bfc464c98-2f97q\" (UID: \"1ae73acd-0d93-4281-b807-4798a207506b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bfc464c98-2f97q" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.448464 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1ae73acd-0d93-4281-b807-4798a207506b-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5bfc464c98-2f97q\" (UID: \"1ae73acd-0d93-4281-b807-4798a207506b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bfc464c98-2f97q" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.448494 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmlr8\" (UniqueName: \"kubernetes.io/projected/50a5cc44-22d4-4ef1-a265-800ebc36afd4-kube-api-access-jmlr8\") pod \"obo-prometheus-operator-7c8cf85677-z6j76\" (UID: \"50a5cc44-22d4-4ef1-a265-800ebc36afd4\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-z6j76" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.456201 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5bfc464c98-6px5r"] Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.549149 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1ae73acd-0d93-4281-b807-4798a207506b-apiservice-cert\") pod 
\"obo-prometheus-operator-admission-webhook-5bfc464c98-2f97q\" (UID: \"1ae73acd-0d93-4281-b807-4798a207506b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bfc464c98-2f97q" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.549212 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmlr8\" (UniqueName: \"kubernetes.io/projected/50a5cc44-22d4-4ef1-a265-800ebc36afd4-kube-api-access-jmlr8\") pod \"obo-prometheus-operator-7c8cf85677-z6j76\" (UID: \"50a5cc44-22d4-4ef1-a265-800ebc36afd4\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-z6j76" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.549260 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1ae73acd-0d93-4281-b807-4798a207506b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5bfc464c98-2f97q\" (UID: \"1ae73acd-0d93-4281-b807-4798a207506b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bfc464c98-2f97q" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.549317 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/db3811fa-89b7-44a6-94e8-92ca398d8d2c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5bfc464c98-6px5r\" (UID: \"db3811fa-89b7-44a6-94e8-92ca398d8d2c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bfc464c98-6px5r" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.549366 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/db3811fa-89b7-44a6-94e8-92ca398d8d2c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5bfc464c98-6px5r\" (UID: \"db3811fa-89b7-44a6-94e8-92ca398d8d2c\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bfc464c98-6px5r" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.556070 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1ae73acd-0d93-4281-b807-4798a207506b-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5bfc464c98-2f97q\" (UID: \"1ae73acd-0d93-4281-b807-4798a207506b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bfc464c98-2f97q" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.556736 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1ae73acd-0d93-4281-b807-4798a207506b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5bfc464c98-2f97q\" (UID: \"1ae73acd-0d93-4281-b807-4798a207506b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bfc464c98-2f97q" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.576117 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmlr8\" (UniqueName: \"kubernetes.io/projected/50a5cc44-22d4-4ef1-a265-800ebc36afd4-kube-api-access-jmlr8\") pod \"obo-prometheus-operator-7c8cf85677-z6j76\" (UID: \"50a5cc44-22d4-4ef1-a265-800ebc36afd4\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-z6j76" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.586418 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-ff5rt"] Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.587450 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-ff5rt" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.590323 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.590706 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-f6rqk" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.598890 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-ff5rt"] Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.650613 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjhzf\" (UniqueName: \"kubernetes.io/projected/efab9597-d673-43a0-bedd-f1ec483ae194-kube-api-access-mjhzf\") pod \"observability-operator-cc5f78dfc-ff5rt\" (UID: \"efab9597-d673-43a0-bedd-f1ec483ae194\") " pod="openshift-operators/observability-operator-cc5f78dfc-ff5rt" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.650782 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/db3811fa-89b7-44a6-94e8-92ca398d8d2c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5bfc464c98-6px5r\" (UID: \"db3811fa-89b7-44a6-94e8-92ca398d8d2c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bfc464c98-6px5r" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.650849 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/db3811fa-89b7-44a6-94e8-92ca398d8d2c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5bfc464c98-6px5r\" (UID: \"db3811fa-89b7-44a6-94e8-92ca398d8d2c\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bfc464c98-6px5r" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.650950 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/efab9597-d673-43a0-bedd-f1ec483ae194-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-ff5rt\" (UID: \"efab9597-d673-43a0-bedd-f1ec483ae194\") " pod="openshift-operators/observability-operator-cc5f78dfc-ff5rt" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.655099 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/db3811fa-89b7-44a6-94e8-92ca398d8d2c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5bfc464c98-6px5r\" (UID: \"db3811fa-89b7-44a6-94e8-92ca398d8d2c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bfc464c98-6px5r" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.659774 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-z6j76" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.664515 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/db3811fa-89b7-44a6-94e8-92ca398d8d2c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5bfc464c98-6px5r\" (UID: \"db3811fa-89b7-44a6-94e8-92ca398d8d2c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bfc464c98-6px5r" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.697025 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-kc8zw"] Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.697774 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-kc8zw" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.702919 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-tzv67" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.713418 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-kc8zw"] Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.716618 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bfc464c98-2f97q" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.735299 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bfc464c98-6px5r" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.752881 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8fxw\" (UniqueName: \"kubernetes.io/projected/4eb8b96a-c47f-424d-bcbc-20ff193b8d7f-kube-api-access-n8fxw\") pod \"perses-operator-54bc95c9fb-kc8zw\" (UID: \"4eb8b96a-c47f-424d-bcbc-20ff193b8d7f\") " pod="openshift-operators/perses-operator-54bc95c9fb-kc8zw" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.752950 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/4eb8b96a-c47f-424d-bcbc-20ff193b8d7f-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-kc8zw\" (UID: \"4eb8b96a-c47f-424d-bcbc-20ff193b8d7f\") " pod="openshift-operators/perses-operator-54bc95c9fb-kc8zw" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.752975 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/efab9597-d673-43a0-bedd-f1ec483ae194-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-ff5rt\" (UID: \"efab9597-d673-43a0-bedd-f1ec483ae194\") " pod="openshift-operators/observability-operator-cc5f78dfc-ff5rt" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.753021 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjhzf\" (UniqueName: \"kubernetes.io/projected/efab9597-d673-43a0-bedd-f1ec483ae194-kube-api-access-mjhzf\") pod \"observability-operator-cc5f78dfc-ff5rt\" (UID: \"efab9597-d673-43a0-bedd-f1ec483ae194\") " pod="openshift-operators/observability-operator-cc5f78dfc-ff5rt" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.759542 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/efab9597-d673-43a0-bedd-f1ec483ae194-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-ff5rt\" (UID: \"efab9597-d673-43a0-bedd-f1ec483ae194\") " pod="openshift-operators/observability-operator-cc5f78dfc-ff5rt" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.776135 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjhzf\" (UniqueName: \"kubernetes.io/projected/efab9597-d673-43a0-bedd-f1ec483ae194-kube-api-access-mjhzf\") pod \"observability-operator-cc5f78dfc-ff5rt\" (UID: \"efab9597-d673-43a0-bedd-f1ec483ae194\") " pod="openshift-operators/observability-operator-cc5f78dfc-ff5rt" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.858984 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8fxw\" (UniqueName: \"kubernetes.io/projected/4eb8b96a-c47f-424d-bcbc-20ff193b8d7f-kube-api-access-n8fxw\") pod \"perses-operator-54bc95c9fb-kc8zw\" (UID: \"4eb8b96a-c47f-424d-bcbc-20ff193b8d7f\") " pod="openshift-operators/perses-operator-54bc95c9fb-kc8zw" Oct 09 15:28:41 crc 
kubenswrapper[4719]: I1009 15:28:41.859390 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/4eb8b96a-c47f-424d-bcbc-20ff193b8d7f-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-kc8zw\" (UID: \"4eb8b96a-c47f-424d-bcbc-20ff193b8d7f\") " pod="openshift-operators/perses-operator-54bc95c9fb-kc8zw" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.861584 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/4eb8b96a-c47f-424d-bcbc-20ff193b8d7f-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-kc8zw\" (UID: \"4eb8b96a-c47f-424d-bcbc-20ff193b8d7f\") " pod="openshift-operators/perses-operator-54bc95c9fb-kc8zw" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.884582 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8fxw\" (UniqueName: \"kubernetes.io/projected/4eb8b96a-c47f-424d-bcbc-20ff193b8d7f-kube-api-access-n8fxw\") pod \"perses-operator-54bc95c9fb-kc8zw\" (UID: \"4eb8b96a-c47f-424d-bcbc-20ff193b8d7f\") " pod="openshift-operators/perses-operator-54bc95c9fb-kc8zw" Oct 09 15:28:41 crc kubenswrapper[4719]: I1009 15:28:41.919257 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-ff5rt" Oct 09 15:28:42 crc kubenswrapper[4719]: I1009 15:28:42.025643 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-kc8zw" Oct 09 15:28:42 crc kubenswrapper[4719]: I1009 15:28:42.083264 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-z6j76"] Oct 09 15:28:42 crc kubenswrapper[4719]: W1009 15:28:42.134256 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50a5cc44_22d4_4ef1_a265_800ebc36afd4.slice/crio-bdf3449235f084049ad30e5fabc4dff4dce5339eca28e69ff7c3114a623c7391 WatchSource:0}: Error finding container bdf3449235f084049ad30e5fabc4dff4dce5339eca28e69ff7c3114a623c7391: Status 404 returned error can't find the container with id bdf3449235f084049ad30e5fabc4dff4dce5339eca28e69ff7c3114a623c7391 Oct 09 15:28:42 crc kubenswrapper[4719]: I1009 15:28:42.182748 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5bfc464c98-2f97q"] Oct 09 15:28:42 crc kubenswrapper[4719]: I1009 15:28:42.300150 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5bfc464c98-6px5r"] Oct 09 15:28:42 crc kubenswrapper[4719]: I1009 15:28:42.374626 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bfc464c98-2f97q" event={"ID":"1ae73acd-0d93-4281-b807-4798a207506b","Type":"ContainerStarted","Data":"96bb6fff3346d82a5b9400416e656a3e3c662441f17f268b44ea359075a5de9b"} Oct 09 15:28:42 crc kubenswrapper[4719]: I1009 15:28:42.376146 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-z6j76" event={"ID":"50a5cc44-22d4-4ef1-a265-800ebc36afd4","Type":"ContainerStarted","Data":"bdf3449235f084049ad30e5fabc4dff4dce5339eca28e69ff7c3114a623c7391"} Oct 09 15:28:42 crc kubenswrapper[4719]: I1009 15:28:42.379285 4719 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bfc464c98-6px5r" event={"ID":"db3811fa-89b7-44a6-94e8-92ca398d8d2c","Type":"ContainerStarted","Data":"65e6dcc065912a44df0829049a6e69015302fa98d166afc55b1ce03b1953c5a8"} Oct 09 15:28:42 crc kubenswrapper[4719]: I1009 15:28:42.400647 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-ff5rt"] Oct 09 15:28:42 crc kubenswrapper[4719]: W1009 15:28:42.405044 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefab9597_d673_43a0_bedd_f1ec483ae194.slice/crio-fccd7eea777d36bb1bf1640b65e4f76e3dc6450add39ce00eec02c09417a81ad WatchSource:0}: Error finding container fccd7eea777d36bb1bf1640b65e4f76e3dc6450add39ce00eec02c09417a81ad: Status 404 returned error can't find the container with id fccd7eea777d36bb1bf1640b65e4f76e3dc6450add39ce00eec02c09417a81ad Oct 09 15:28:42 crc kubenswrapper[4719]: I1009 15:28:42.559298 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-kc8zw"] Oct 09 15:28:42 crc kubenswrapper[4719]: W1009 15:28:42.569769 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4eb8b96a_c47f_424d_bcbc_20ff193b8d7f.slice/crio-0ab2adb3bfb654a221c1556973e8837bb456e0547f382a6ae90fbc762a4f529d WatchSource:0}: Error finding container 0ab2adb3bfb654a221c1556973e8837bb456e0547f382a6ae90fbc762a4f529d: Status 404 returned error can't find the container with id 0ab2adb3bfb654a221c1556973e8837bb456e0547f382a6ae90fbc762a4f529d Oct 09 15:28:43 crc kubenswrapper[4719]: I1009 15:28:43.384856 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-ff5rt" 
event={"ID":"efab9597-d673-43a0-bedd-f1ec483ae194","Type":"ContainerStarted","Data":"fccd7eea777d36bb1bf1640b65e4f76e3dc6450add39ce00eec02c09417a81ad"} Oct 09 15:28:43 crc kubenswrapper[4719]: I1009 15:28:43.385900 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-kc8zw" event={"ID":"4eb8b96a-c47f-424d-bcbc-20ff193b8d7f","Type":"ContainerStarted","Data":"0ab2adb3bfb654a221c1556973e8837bb456e0547f382a6ae90fbc762a4f529d"} Oct 09 15:28:56 crc kubenswrapper[4719]: I1009 15:28:56.496159 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bfc464c98-6px5r" event={"ID":"db3811fa-89b7-44a6-94e8-92ca398d8d2c","Type":"ContainerStarted","Data":"dc98bfe33c726d485d09b1b4e1775d85cd9041b2e61f8bf3b7ca4531789c7563"} Oct 09 15:28:56 crc kubenswrapper[4719]: I1009 15:28:56.497823 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-kc8zw" event={"ID":"4eb8b96a-c47f-424d-bcbc-20ff193b8d7f","Type":"ContainerStarted","Data":"7406c72f8def13e4a0683752b39f146ff56b7632aa00b7df19e5f48f1900ef16"} Oct 09 15:28:56 crc kubenswrapper[4719]: I1009 15:28:56.497947 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-54bc95c9fb-kc8zw" Oct 09 15:28:56 crc kubenswrapper[4719]: I1009 15:28:56.499500 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-ff5rt" event={"ID":"efab9597-d673-43a0-bedd-f1ec483ae194","Type":"ContainerStarted","Data":"bbe7907c9aff54f8e066e2e68e16e2276d6cac23d12d506bdd6c4a6b1c0d407b"} Oct 09 15:28:56 crc kubenswrapper[4719]: I1009 15:28:56.499688 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-cc5f78dfc-ff5rt" Oct 09 15:28:56 crc kubenswrapper[4719]: I1009 15:28:56.501479 4719 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-operators/observability-operator-cc5f78dfc-ff5rt" Oct 09 15:28:56 crc kubenswrapper[4719]: I1009 15:28:56.503093 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bfc464c98-2f97q" event={"ID":"1ae73acd-0d93-4281-b807-4798a207506b","Type":"ContainerStarted","Data":"a75197df401e06893a81659489d17f1950d84dc076b54fbacdf7ffaa6e0a2f8c"} Oct 09 15:28:56 crc kubenswrapper[4719]: I1009 15:28:56.504740 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-z6j76" event={"ID":"50a5cc44-22d4-4ef1-a265-800ebc36afd4","Type":"ContainerStarted","Data":"6a931eb65d9f97119cad98414a86fc2014656b55457defb9421c5578ddb21567"} Oct 09 15:28:56 crc kubenswrapper[4719]: I1009 15:28:56.560999 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bfc464c98-6px5r" podStartSLOduration=2.466502929 podStartE2EDuration="15.560981322s" podCreationTimestamp="2025-10-09 15:28:41 +0000 UTC" firstStartedPulling="2025-10-09 15:28:42.312564363 +0000 UTC m=+627.822275648" lastFinishedPulling="2025-10-09 15:28:55.407042746 +0000 UTC m=+640.916754041" observedRunningTime="2025-10-09 15:28:56.526719899 +0000 UTC m=+642.036431224" watchObservedRunningTime="2025-10-09 15:28:56.560981322 +0000 UTC m=+642.070692607" Oct 09 15:28:56 crc kubenswrapper[4719]: I1009 15:28:56.562156 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-54bc95c9fb-kc8zw" podStartSLOduration=2.6969388050000003 podStartE2EDuration="15.56215064s" podCreationTimestamp="2025-10-09 15:28:41 +0000 UTC" firstStartedPulling="2025-10-09 15:28:42.572649685 +0000 UTC m=+628.082360970" lastFinishedPulling="2025-10-09 15:28:55.43786152 +0000 UTC m=+640.947572805" observedRunningTime="2025-10-09 15:28:56.561555591 +0000 UTC 
m=+642.071266886" watchObservedRunningTime="2025-10-09 15:28:56.56215064 +0000 UTC m=+642.071861925" Oct 09 15:28:56 crc kubenswrapper[4719]: I1009 15:28:56.589382 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bfc464c98-2f97q" podStartSLOduration=2.377066512 podStartE2EDuration="15.589358818s" podCreationTimestamp="2025-10-09 15:28:41 +0000 UTC" firstStartedPulling="2025-10-09 15:28:42.202905442 +0000 UTC m=+627.712616727" lastFinishedPulling="2025-10-09 15:28:55.415197758 +0000 UTC m=+640.924909033" observedRunningTime="2025-10-09 15:28:56.585846086 +0000 UTC m=+642.095557371" watchObservedRunningTime="2025-10-09 15:28:56.589358818 +0000 UTC m=+642.099070113" Oct 09 15:28:56 crc kubenswrapper[4719]: I1009 15:28:56.622864 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-z6j76" podStartSLOduration=2.269162129 podStartE2EDuration="15.622840947s" podCreationTimestamp="2025-10-09 15:28:41 +0000 UTC" firstStartedPulling="2025-10-09 15:28:42.137540395 +0000 UTC m=+627.647251670" lastFinishedPulling="2025-10-09 15:28:55.491219203 +0000 UTC m=+641.000930488" observedRunningTime="2025-10-09 15:28:56.618390625 +0000 UTC m=+642.128101920" watchObservedRunningTime="2025-10-09 15:28:56.622840947 +0000 UTC m=+642.132552232" Oct 09 15:28:56 crc kubenswrapper[4719]: I1009 15:28:56.645414 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-cc5f78dfc-ff5rt" podStartSLOduration=2.562448201 podStartE2EDuration="15.645397457s" podCreationTimestamp="2025-10-09 15:28:41 +0000 UTC" firstStartedPulling="2025-10-09 15:28:42.411280354 +0000 UTC m=+627.920991639" lastFinishedPulling="2025-10-09 15:28:55.49422961 +0000 UTC m=+641.003940895" observedRunningTime="2025-10-09 15:28:56.643940581 +0000 UTC m=+642.153651886" 
watchObservedRunningTime="2025-10-09 15:28:56.645397457 +0000 UTC m=+642.155108742" Oct 09 15:29:02 crc kubenswrapper[4719]: I1009 15:29:02.034461 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-54bc95c9fb-kc8zw" Oct 09 15:29:19 crc kubenswrapper[4719]: I1009 15:29:19.012189 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz"] Oct 09 15:29:19 crc kubenswrapper[4719]: I1009 15:29:19.014062 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz" Oct 09 15:29:19 crc kubenswrapper[4719]: I1009 15:29:19.016105 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 09 15:29:19 crc kubenswrapper[4719]: I1009 15:29:19.023105 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz"] Oct 09 15:29:19 crc kubenswrapper[4719]: I1009 15:29:19.081529 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57298b91-7d64-40fd-be0e-c400cdfd9b93-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz\" (UID: \"57298b91-7d64-40fd-be0e-c400cdfd9b93\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz" Oct 09 15:29:19 crc kubenswrapper[4719]: I1009 15:29:19.081605 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdsmw\" (UniqueName: \"kubernetes.io/projected/57298b91-7d64-40fd-be0e-c400cdfd9b93-kube-api-access-hdsmw\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz\" (UID: \"57298b91-7d64-40fd-be0e-c400cdfd9b93\") " 
pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz" Oct 09 15:29:19 crc kubenswrapper[4719]: I1009 15:29:19.081686 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57298b91-7d64-40fd-be0e-c400cdfd9b93-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz\" (UID: \"57298b91-7d64-40fd-be0e-c400cdfd9b93\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz" Oct 09 15:29:19 crc kubenswrapper[4719]: I1009 15:29:19.182999 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57298b91-7d64-40fd-be0e-c400cdfd9b93-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz\" (UID: \"57298b91-7d64-40fd-be0e-c400cdfd9b93\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz" Oct 09 15:29:19 crc kubenswrapper[4719]: I1009 15:29:19.183085 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57298b91-7d64-40fd-be0e-c400cdfd9b93-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz\" (UID: \"57298b91-7d64-40fd-be0e-c400cdfd9b93\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz" Oct 09 15:29:19 crc kubenswrapper[4719]: I1009 15:29:19.183119 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdsmw\" (UniqueName: \"kubernetes.io/projected/57298b91-7d64-40fd-be0e-c400cdfd9b93-kube-api-access-hdsmw\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz\" (UID: \"57298b91-7d64-40fd-be0e-c400cdfd9b93\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz" Oct 09 15:29:19 crc kubenswrapper[4719]: 
I1009 15:29:19.183622 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57298b91-7d64-40fd-be0e-c400cdfd9b93-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz\" (UID: \"57298b91-7d64-40fd-be0e-c400cdfd9b93\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz" Oct 09 15:29:19 crc kubenswrapper[4719]: I1009 15:29:19.183661 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57298b91-7d64-40fd-be0e-c400cdfd9b93-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz\" (UID: \"57298b91-7d64-40fd-be0e-c400cdfd9b93\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz" Oct 09 15:29:19 crc kubenswrapper[4719]: I1009 15:29:19.206708 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdsmw\" (UniqueName: \"kubernetes.io/projected/57298b91-7d64-40fd-be0e-c400cdfd9b93-kube-api-access-hdsmw\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz\" (UID: \"57298b91-7d64-40fd-be0e-c400cdfd9b93\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz" Oct 09 15:29:19 crc kubenswrapper[4719]: I1009 15:29:19.333378 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz" Oct 09 15:29:19 crc kubenswrapper[4719]: I1009 15:29:19.613274 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz"] Oct 09 15:29:20 crc kubenswrapper[4719]: I1009 15:29:20.624778 4719 generic.go:334] "Generic (PLEG): container finished" podID="57298b91-7d64-40fd-be0e-c400cdfd9b93" containerID="438fbbdc9e7d4e6cf03b26159d708f34b6a76848315d6e68ffa6fbaa9d54d5d7" exitCode=0 Oct 09 15:29:20 crc kubenswrapper[4719]: I1009 15:29:20.624821 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz" event={"ID":"57298b91-7d64-40fd-be0e-c400cdfd9b93","Type":"ContainerDied","Data":"438fbbdc9e7d4e6cf03b26159d708f34b6a76848315d6e68ffa6fbaa9d54d5d7"} Oct 09 15:29:20 crc kubenswrapper[4719]: I1009 15:29:20.625053 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz" event={"ID":"57298b91-7d64-40fd-be0e-c400cdfd9b93","Type":"ContainerStarted","Data":"c4cd2ac7808ff7215ac7079183bf4c17ec925f34d3cff87d86b3b7aa260ac5d7"} Oct 09 15:29:23 crc kubenswrapper[4719]: I1009 15:29:23.642787 4719 generic.go:334] "Generic (PLEG): container finished" podID="57298b91-7d64-40fd-be0e-c400cdfd9b93" containerID="666830ba36364a1f3d10b61433dda9f89c9892d4d08f23112645dad943580e19" exitCode=0 Oct 09 15:29:23 crc kubenswrapper[4719]: I1009 15:29:23.642861 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz" event={"ID":"57298b91-7d64-40fd-be0e-c400cdfd9b93","Type":"ContainerDied","Data":"666830ba36364a1f3d10b61433dda9f89c9892d4d08f23112645dad943580e19"} Oct 09 15:29:24 crc kubenswrapper[4719]: I1009 15:29:24.648966 4719 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz" event={"ID":"57298b91-7d64-40fd-be0e-c400cdfd9b93","Type":"ContainerStarted","Data":"1d98a0c5b61695439dd38b7c0a32c05b963c6110dc97a65e4090db6bebe93539"} Oct 09 15:29:24 crc kubenswrapper[4719]: I1009 15:29:24.668109 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz" podStartSLOduration=4.589210157 podStartE2EDuration="6.668090301s" podCreationTimestamp="2025-10-09 15:29:18 +0000 UTC" firstStartedPulling="2025-10-09 15:29:20.626294671 +0000 UTC m=+666.136005956" lastFinishedPulling="2025-10-09 15:29:22.705174815 +0000 UTC m=+668.214886100" observedRunningTime="2025-10-09 15:29:24.667888505 +0000 UTC m=+670.177599810" watchObservedRunningTime="2025-10-09 15:29:24.668090301 +0000 UTC m=+670.177801586" Oct 09 15:29:25 crc kubenswrapper[4719]: I1009 15:29:25.655901 4719 generic.go:334] "Generic (PLEG): container finished" podID="57298b91-7d64-40fd-be0e-c400cdfd9b93" containerID="1d98a0c5b61695439dd38b7c0a32c05b963c6110dc97a65e4090db6bebe93539" exitCode=0 Oct 09 15:29:25 crc kubenswrapper[4719]: I1009 15:29:25.655938 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz" event={"ID":"57298b91-7d64-40fd-be0e-c400cdfd9b93","Type":"ContainerDied","Data":"1d98a0c5b61695439dd38b7c0a32c05b963c6110dc97a65e4090db6bebe93539"} Oct 09 15:29:27 crc kubenswrapper[4719]: I1009 15:29:27.052284 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz" Oct 09 15:29:27 crc kubenswrapper[4719]: I1009 15:29:27.179932 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57298b91-7d64-40fd-be0e-c400cdfd9b93-bundle\") pod \"57298b91-7d64-40fd-be0e-c400cdfd9b93\" (UID: \"57298b91-7d64-40fd-be0e-c400cdfd9b93\") " Oct 09 15:29:27 crc kubenswrapper[4719]: I1009 15:29:27.180050 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57298b91-7d64-40fd-be0e-c400cdfd9b93-util\") pod \"57298b91-7d64-40fd-be0e-c400cdfd9b93\" (UID: \"57298b91-7d64-40fd-be0e-c400cdfd9b93\") " Oct 09 15:29:27 crc kubenswrapper[4719]: I1009 15:29:27.180412 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57298b91-7d64-40fd-be0e-c400cdfd9b93-bundle" (OuterVolumeSpecName: "bundle") pod "57298b91-7d64-40fd-be0e-c400cdfd9b93" (UID: "57298b91-7d64-40fd-be0e-c400cdfd9b93"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:29:27 crc kubenswrapper[4719]: I1009 15:29:27.180911 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdsmw\" (UniqueName: \"kubernetes.io/projected/57298b91-7d64-40fd-be0e-c400cdfd9b93-kube-api-access-hdsmw\") pod \"57298b91-7d64-40fd-be0e-c400cdfd9b93\" (UID: \"57298b91-7d64-40fd-be0e-c400cdfd9b93\") " Oct 09 15:29:27 crc kubenswrapper[4719]: I1009 15:29:27.182140 4719 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57298b91-7d64-40fd-be0e-c400cdfd9b93-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:29:27 crc kubenswrapper[4719]: I1009 15:29:27.186414 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57298b91-7d64-40fd-be0e-c400cdfd9b93-kube-api-access-hdsmw" (OuterVolumeSpecName: "kube-api-access-hdsmw") pod "57298b91-7d64-40fd-be0e-c400cdfd9b93" (UID: "57298b91-7d64-40fd-be0e-c400cdfd9b93"). InnerVolumeSpecName "kube-api-access-hdsmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:29:27 crc kubenswrapper[4719]: I1009 15:29:27.191059 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57298b91-7d64-40fd-be0e-c400cdfd9b93-util" (OuterVolumeSpecName: "util") pod "57298b91-7d64-40fd-be0e-c400cdfd9b93" (UID: "57298b91-7d64-40fd-be0e-c400cdfd9b93"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:29:27 crc kubenswrapper[4719]: I1009 15:29:27.284627 4719 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57298b91-7d64-40fd-be0e-c400cdfd9b93-util\") on node \"crc\" DevicePath \"\"" Oct 09 15:29:27 crc kubenswrapper[4719]: I1009 15:29:27.284743 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdsmw\" (UniqueName: \"kubernetes.io/projected/57298b91-7d64-40fd-be0e-c400cdfd9b93-kube-api-access-hdsmw\") on node \"crc\" DevicePath \"\"" Oct 09 15:29:27 crc kubenswrapper[4719]: I1009 15:29:27.667775 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz" event={"ID":"57298b91-7d64-40fd-be0e-c400cdfd9b93","Type":"ContainerDied","Data":"c4cd2ac7808ff7215ac7079183bf4c17ec925f34d3cff87d86b3b7aa260ac5d7"} Oct 09 15:29:27 crc kubenswrapper[4719]: I1009 15:29:27.667822 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4cd2ac7808ff7215ac7079183bf4c17ec925f34d3cff87d86b3b7aa260ac5d7" Oct 09 15:29:27 crc kubenswrapper[4719]: I1009 15:29:27.667862 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz" Oct 09 15:29:30 crc kubenswrapper[4719]: I1009 15:29:30.626566 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-l4s9n"] Oct 09 15:29:30 crc kubenswrapper[4719]: E1009 15:29:30.627117 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57298b91-7d64-40fd-be0e-c400cdfd9b93" containerName="util" Oct 09 15:29:30 crc kubenswrapper[4719]: I1009 15:29:30.627132 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="57298b91-7d64-40fd-be0e-c400cdfd9b93" containerName="util" Oct 09 15:29:30 crc kubenswrapper[4719]: E1009 15:29:30.627148 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57298b91-7d64-40fd-be0e-c400cdfd9b93" containerName="extract" Oct 09 15:29:30 crc kubenswrapper[4719]: I1009 15:29:30.627156 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="57298b91-7d64-40fd-be0e-c400cdfd9b93" containerName="extract" Oct 09 15:29:30 crc kubenswrapper[4719]: E1009 15:29:30.627170 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57298b91-7d64-40fd-be0e-c400cdfd9b93" containerName="pull" Oct 09 15:29:30 crc kubenswrapper[4719]: I1009 15:29:30.627178 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="57298b91-7d64-40fd-be0e-c400cdfd9b93" containerName="pull" Oct 09 15:29:30 crc kubenswrapper[4719]: I1009 15:29:30.627314 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="57298b91-7d64-40fd-be0e-c400cdfd9b93" containerName="extract" Oct 09 15:29:30 crc kubenswrapper[4719]: I1009 15:29:30.627762 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-l4s9n" Oct 09 15:29:30 crc kubenswrapper[4719]: I1009 15:29:30.629471 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-59vx6" Oct 09 15:29:30 crc kubenswrapper[4719]: I1009 15:29:30.629510 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 09 15:29:30 crc kubenswrapper[4719]: I1009 15:29:30.632288 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 09 15:29:30 crc kubenswrapper[4719]: I1009 15:29:30.634334 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-l4s9n"] Oct 09 15:29:30 crc kubenswrapper[4719]: I1009 15:29:30.762425 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blnst\" (UniqueName: \"kubernetes.io/projected/15fe7687-ec6e-42eb-9131-980871159a78-kube-api-access-blnst\") pod \"nmstate-operator-858ddd8f98-l4s9n\" (UID: \"15fe7687-ec6e-42eb-9131-980871159a78\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-l4s9n" Oct 09 15:29:30 crc kubenswrapper[4719]: I1009 15:29:30.863732 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blnst\" (UniqueName: \"kubernetes.io/projected/15fe7687-ec6e-42eb-9131-980871159a78-kube-api-access-blnst\") pod \"nmstate-operator-858ddd8f98-l4s9n\" (UID: \"15fe7687-ec6e-42eb-9131-980871159a78\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-l4s9n" Oct 09 15:29:30 crc kubenswrapper[4719]: I1009 15:29:30.888973 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blnst\" (UniqueName: \"kubernetes.io/projected/15fe7687-ec6e-42eb-9131-980871159a78-kube-api-access-blnst\") pod \"nmstate-operator-858ddd8f98-l4s9n\" (UID: 
\"15fe7687-ec6e-42eb-9131-980871159a78\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-l4s9n" Oct 09 15:29:30 crc kubenswrapper[4719]: I1009 15:29:30.942868 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-l4s9n" Oct 09 15:29:31 crc kubenswrapper[4719]: I1009 15:29:31.199577 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-l4s9n"] Oct 09 15:29:31 crc kubenswrapper[4719]: I1009 15:29:31.688310 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-l4s9n" event={"ID":"15fe7687-ec6e-42eb-9131-980871159a78","Type":"ContainerStarted","Data":"7d98ffc71f3ecf09ca099710c77a543e0f8c75eee155f5c4b2d0aa1fcd7965f0"} Oct 09 15:29:34 crc kubenswrapper[4719]: I1009 15:29:34.705713 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-l4s9n" event={"ID":"15fe7687-ec6e-42eb-9131-980871159a78","Type":"ContainerStarted","Data":"9f69ccdb2af18f02d070ecdb83054d7869deb7b261bb471527db334ccc12467c"} Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.137785 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-l4s9n" podStartSLOduration=7.520714649 podStartE2EDuration="10.137764798s" podCreationTimestamp="2025-10-09 15:29:30 +0000 UTC" firstStartedPulling="2025-10-09 15:29:31.226511669 +0000 UTC m=+676.736222954" lastFinishedPulling="2025-10-09 15:29:33.843561818 +0000 UTC m=+679.353273103" observedRunningTime="2025-10-09 15:29:34.727764886 +0000 UTC m=+680.237476171" watchObservedRunningTime="2025-10-09 15:29:40.137764798 +0000 UTC m=+685.647476083" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.139523 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-wgq7f"] Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 
15:29:40.140422 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-wgq7f" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.142203 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-mns4p" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.149501 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-wgq7f"] Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.172110 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-m6lsq"] Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.172990 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-m6lsq" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.174253 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.184086 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-xwghc"] Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.185031 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-xwghc" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.189478 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-m6lsq"] Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.278586 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/feb926a1-9332-41f0-80b7-b100b62f8664-nmstate-lock\") pod \"nmstate-handler-xwghc\" (UID: \"feb926a1-9332-41f0-80b7-b100b62f8664\") " pod="openshift-nmstate/nmstate-handler-xwghc" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.278910 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hcpk\" (UniqueName: \"kubernetes.io/projected/036064c5-e3a3-49a7-b457-5e64df820401-kube-api-access-6hcpk\") pod \"nmstate-webhook-6cdbc54649-m6lsq\" (UID: \"036064c5-e3a3-49a7-b457-5e64df820401\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-m6lsq" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.278944 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/036064c5-e3a3-49a7-b457-5e64df820401-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-m6lsq\" (UID: \"036064c5-e3a3-49a7-b457-5e64df820401\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-m6lsq" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.278969 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/feb926a1-9332-41f0-80b7-b100b62f8664-dbus-socket\") pod \"nmstate-handler-xwghc\" (UID: \"feb926a1-9332-41f0-80b7-b100b62f8664\") " pod="openshift-nmstate/nmstate-handler-xwghc" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.279001 4719 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9ttp\" (UniqueName: \"kubernetes.io/projected/feb926a1-9332-41f0-80b7-b100b62f8664-kube-api-access-v9ttp\") pod \"nmstate-handler-xwghc\" (UID: \"feb926a1-9332-41f0-80b7-b100b62f8664\") " pod="openshift-nmstate/nmstate-handler-xwghc" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.279019 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s8nz\" (UniqueName: \"kubernetes.io/projected/07cbbe5f-3176-4cfa-97e0-a7b3e6613c7b-kube-api-access-4s8nz\") pod \"nmstate-metrics-fdff9cb8d-wgq7f\" (UID: \"07cbbe5f-3176-4cfa-97e0-a7b3e6613c7b\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-wgq7f" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.279037 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/feb926a1-9332-41f0-80b7-b100b62f8664-ovs-socket\") pod \"nmstate-handler-xwghc\" (UID: \"feb926a1-9332-41f0-80b7-b100b62f8664\") " pod="openshift-nmstate/nmstate-handler-xwghc" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.278787 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-jz9ct"] Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.280191 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-jz9ct" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.281797 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.282078 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-g84nd" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.291476 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.295395 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-jz9ct"] Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.380422 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/feb926a1-9332-41f0-80b7-b100b62f8664-ovs-socket\") pod \"nmstate-handler-xwghc\" (UID: \"feb926a1-9332-41f0-80b7-b100b62f8664\") " pod="openshift-nmstate/nmstate-handler-xwghc" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.380476 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/feb926a1-9332-41f0-80b7-b100b62f8664-nmstate-lock\") pod \"nmstate-handler-xwghc\" (UID: \"feb926a1-9332-41f0-80b7-b100b62f8664\") " pod="openshift-nmstate/nmstate-handler-xwghc" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.380519 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hcpk\" (UniqueName: \"kubernetes.io/projected/036064c5-e3a3-49a7-b457-5e64df820401-kube-api-access-6hcpk\") pod \"nmstate-webhook-6cdbc54649-m6lsq\" (UID: \"036064c5-e3a3-49a7-b457-5e64df820401\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-m6lsq" Oct 09 15:29:40 crc kubenswrapper[4719]: 
I1009 15:29:40.380542 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/db938ad9-d041-4874-855a-83d6fa385b3e-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-jz9ct\" (UID: \"db938ad9-d041-4874-855a-83d6fa385b3e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-jz9ct" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.380571 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/036064c5-e3a3-49a7-b457-5e64df820401-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-m6lsq\" (UID: \"036064c5-e3a3-49a7-b457-5e64df820401\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-m6lsq" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.380589 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/feb926a1-9332-41f0-80b7-b100b62f8664-nmstate-lock\") pod \"nmstate-handler-xwghc\" (UID: \"feb926a1-9332-41f0-80b7-b100b62f8664\") " pod="openshift-nmstate/nmstate-handler-xwghc" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.380596 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/feb926a1-9332-41f0-80b7-b100b62f8664-ovs-socket\") pod \"nmstate-handler-xwghc\" (UID: \"feb926a1-9332-41f0-80b7-b100b62f8664\") " pod="openshift-nmstate/nmstate-handler-xwghc" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.380598 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/db938ad9-d041-4874-855a-83d6fa385b3e-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-jz9ct\" (UID: \"db938ad9-d041-4874-855a-83d6fa385b3e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-jz9ct" Oct 09 15:29:40 crc 
kubenswrapper[4719]: E1009 15:29:40.380701 4719 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.380703 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/feb926a1-9332-41f0-80b7-b100b62f8664-dbus-socket\") pod \"nmstate-handler-xwghc\" (UID: \"feb926a1-9332-41f0-80b7-b100b62f8664\") " pod="openshift-nmstate/nmstate-handler-xwghc" Oct 09 15:29:40 crc kubenswrapper[4719]: E1009 15:29:40.380760 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/036064c5-e3a3-49a7-b457-5e64df820401-tls-key-pair podName:036064c5-e3a3-49a7-b457-5e64df820401 nodeName:}" failed. No retries permitted until 2025-10-09 15:29:40.880741086 +0000 UTC m=+686.390452361 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/036064c5-e3a3-49a7-b457-5e64df820401-tls-key-pair") pod "nmstate-webhook-6cdbc54649-m6lsq" (UID: "036064c5-e3a3-49a7-b457-5e64df820401") : secret "openshift-nmstate-webhook" not found Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.380790 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9ttp\" (UniqueName: \"kubernetes.io/projected/feb926a1-9332-41f0-80b7-b100b62f8664-kube-api-access-v9ttp\") pod \"nmstate-handler-xwghc\" (UID: \"feb926a1-9332-41f0-80b7-b100b62f8664\") " pod="openshift-nmstate/nmstate-handler-xwghc" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.380828 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4s8nz\" (UniqueName: \"kubernetes.io/projected/07cbbe5f-3176-4cfa-97e0-a7b3e6613c7b-kube-api-access-4s8nz\") pod \"nmstate-metrics-fdff9cb8d-wgq7f\" (UID: \"07cbbe5f-3176-4cfa-97e0-a7b3e6613c7b\") " 
pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-wgq7f" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.380857 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pr5p\" (UniqueName: \"kubernetes.io/projected/db938ad9-d041-4874-855a-83d6fa385b3e-kube-api-access-7pr5p\") pod \"nmstate-console-plugin-6b874cbd85-jz9ct\" (UID: \"db938ad9-d041-4874-855a-83d6fa385b3e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-jz9ct" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.381001 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/feb926a1-9332-41f0-80b7-b100b62f8664-dbus-socket\") pod \"nmstate-handler-xwghc\" (UID: \"feb926a1-9332-41f0-80b7-b100b62f8664\") " pod="openshift-nmstate/nmstate-handler-xwghc" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.403018 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hcpk\" (UniqueName: \"kubernetes.io/projected/036064c5-e3a3-49a7-b457-5e64df820401-kube-api-access-6hcpk\") pod \"nmstate-webhook-6cdbc54649-m6lsq\" (UID: \"036064c5-e3a3-49a7-b457-5e64df820401\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-m6lsq" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.405018 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s8nz\" (UniqueName: \"kubernetes.io/projected/07cbbe5f-3176-4cfa-97e0-a7b3e6613c7b-kube-api-access-4s8nz\") pod \"nmstate-metrics-fdff9cb8d-wgq7f\" (UID: \"07cbbe5f-3176-4cfa-97e0-a7b3e6613c7b\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-wgq7f" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.409538 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9ttp\" (UniqueName: \"kubernetes.io/projected/feb926a1-9332-41f0-80b7-b100b62f8664-kube-api-access-v9ttp\") pod \"nmstate-handler-xwghc\" (UID: 
\"feb926a1-9332-41f0-80b7-b100b62f8664\") " pod="openshift-nmstate/nmstate-handler-xwghc" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.481497 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pr5p\" (UniqueName: \"kubernetes.io/projected/db938ad9-d041-4874-855a-83d6fa385b3e-kube-api-access-7pr5p\") pod \"nmstate-console-plugin-6b874cbd85-jz9ct\" (UID: \"db938ad9-d041-4874-855a-83d6fa385b3e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-jz9ct" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.481578 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/db938ad9-d041-4874-855a-83d6fa385b3e-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-jz9ct\" (UID: \"db938ad9-d041-4874-855a-83d6fa385b3e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-jz9ct" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.481636 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/db938ad9-d041-4874-855a-83d6fa385b3e-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-jz9ct\" (UID: \"db938ad9-d041-4874-855a-83d6fa385b3e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-jz9ct" Oct 09 15:29:40 crc kubenswrapper[4719]: E1009 15:29:40.481743 4719 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Oct 09 15:29:40 crc kubenswrapper[4719]: E1009 15:29:40.481790 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db938ad9-d041-4874-855a-83d6fa385b3e-plugin-serving-cert podName:db938ad9-d041-4874-855a-83d6fa385b3e nodeName:}" failed. No retries permitted until 2025-10-09 15:29:40.981776737 +0000 UTC m=+686.491488022 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/db938ad9-d041-4874-855a-83d6fa385b3e-plugin-serving-cert") pod "nmstate-console-plugin-6b874cbd85-jz9ct" (UID: "db938ad9-d041-4874-855a-83d6fa385b3e") : secret "plugin-serving-cert" not found Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.482753 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/db938ad9-d041-4874-855a-83d6fa385b3e-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-jz9ct\" (UID: \"db938ad9-d041-4874-855a-83d6fa385b3e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-jz9ct" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.487433 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-8657cd4df-krc5j"] Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.488177 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8657cd4df-krc5j" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.498059 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8657cd4df-krc5j"] Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.507564 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-wgq7f" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.513252 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pr5p\" (UniqueName: \"kubernetes.io/projected/db938ad9-d041-4874-855a-83d6fa385b3e-kube-api-access-7pr5p\") pod \"nmstate-console-plugin-6b874cbd85-jz9ct\" (UID: \"db938ad9-d041-4874-855a-83d6fa385b3e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-jz9ct" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.532042 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-xwghc" Oct 09 15:29:40 crc kubenswrapper[4719]: W1009 15:29:40.570193 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfeb926a1_9332_41f0_80b7_b100b62f8664.slice/crio-ec5f50ac2ef34bd3a6558317c2d0d1b7478d76b0b9e6b6f350ca62b50de81c7b WatchSource:0}: Error finding container ec5f50ac2ef34bd3a6558317c2d0d1b7478d76b0b9e6b6f350ca62b50de81c7b: Status 404 returned error can't find the container with id ec5f50ac2ef34bd3a6558317c2d0d1b7478d76b0b9e6b6f350ca62b50de81c7b Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.582431 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b53c0e13-bb5e-41ef-b5f3-754a02b284be-console-oauth-config\") pod \"console-8657cd4df-krc5j\" (UID: \"b53c0e13-bb5e-41ef-b5f3-754a02b284be\") " pod="openshift-console/console-8657cd4df-krc5j" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.582479 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b53c0e13-bb5e-41ef-b5f3-754a02b284be-console-serving-cert\") pod \"console-8657cd4df-krc5j\" (UID: \"b53c0e13-bb5e-41ef-b5f3-754a02b284be\") " pod="openshift-console/console-8657cd4df-krc5j" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.582536 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b53c0e13-bb5e-41ef-b5f3-754a02b284be-trusted-ca-bundle\") pod \"console-8657cd4df-krc5j\" (UID: \"b53c0e13-bb5e-41ef-b5f3-754a02b284be\") " pod="openshift-console/console-8657cd4df-krc5j" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.582593 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-nnmxk\" (UniqueName: \"kubernetes.io/projected/b53c0e13-bb5e-41ef-b5f3-754a02b284be-kube-api-access-nnmxk\") pod \"console-8657cd4df-krc5j\" (UID: \"b53c0e13-bb5e-41ef-b5f3-754a02b284be\") " pod="openshift-console/console-8657cd4df-krc5j" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.582618 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b53c0e13-bb5e-41ef-b5f3-754a02b284be-console-config\") pod \"console-8657cd4df-krc5j\" (UID: \"b53c0e13-bb5e-41ef-b5f3-754a02b284be\") " pod="openshift-console/console-8657cd4df-krc5j" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.582665 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b53c0e13-bb5e-41ef-b5f3-754a02b284be-service-ca\") pod \"console-8657cd4df-krc5j\" (UID: \"b53c0e13-bb5e-41ef-b5f3-754a02b284be\") " pod="openshift-console/console-8657cd4df-krc5j" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.582687 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b53c0e13-bb5e-41ef-b5f3-754a02b284be-oauth-serving-cert\") pod \"console-8657cd4df-krc5j\" (UID: \"b53c0e13-bb5e-41ef-b5f3-754a02b284be\") " pod="openshift-console/console-8657cd4df-krc5j" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.684333 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b53c0e13-bb5e-41ef-b5f3-754a02b284be-oauth-serving-cert\") pod \"console-8657cd4df-krc5j\" (UID: \"b53c0e13-bb5e-41ef-b5f3-754a02b284be\") " pod="openshift-console/console-8657cd4df-krc5j" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.684674 4719 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b53c0e13-bb5e-41ef-b5f3-754a02b284be-console-oauth-config\") pod \"console-8657cd4df-krc5j\" (UID: \"b53c0e13-bb5e-41ef-b5f3-754a02b284be\") " pod="openshift-console/console-8657cd4df-krc5j" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.684702 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b53c0e13-bb5e-41ef-b5f3-754a02b284be-console-serving-cert\") pod \"console-8657cd4df-krc5j\" (UID: \"b53c0e13-bb5e-41ef-b5f3-754a02b284be\") " pod="openshift-console/console-8657cd4df-krc5j" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.684743 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b53c0e13-bb5e-41ef-b5f3-754a02b284be-trusted-ca-bundle\") pod \"console-8657cd4df-krc5j\" (UID: \"b53c0e13-bb5e-41ef-b5f3-754a02b284be\") " pod="openshift-console/console-8657cd4df-krc5j" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.684822 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnmxk\" (UniqueName: \"kubernetes.io/projected/b53c0e13-bb5e-41ef-b5f3-754a02b284be-kube-api-access-nnmxk\") pod \"console-8657cd4df-krc5j\" (UID: \"b53c0e13-bb5e-41ef-b5f3-754a02b284be\") " pod="openshift-console/console-8657cd4df-krc5j" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.684843 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b53c0e13-bb5e-41ef-b5f3-754a02b284be-console-config\") pod \"console-8657cd4df-krc5j\" (UID: \"b53c0e13-bb5e-41ef-b5f3-754a02b284be\") " pod="openshift-console/console-8657cd4df-krc5j" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.684863 4719 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b53c0e13-bb5e-41ef-b5f3-754a02b284be-service-ca\") pod \"console-8657cd4df-krc5j\" (UID: \"b53c0e13-bb5e-41ef-b5f3-754a02b284be\") " pod="openshift-console/console-8657cd4df-krc5j" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.685337 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b53c0e13-bb5e-41ef-b5f3-754a02b284be-oauth-serving-cert\") pod \"console-8657cd4df-krc5j\" (UID: \"b53c0e13-bb5e-41ef-b5f3-754a02b284be\") " pod="openshift-console/console-8657cd4df-krc5j" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.685540 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b53c0e13-bb5e-41ef-b5f3-754a02b284be-service-ca\") pod \"console-8657cd4df-krc5j\" (UID: \"b53c0e13-bb5e-41ef-b5f3-754a02b284be\") " pod="openshift-console/console-8657cd4df-krc5j" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.685897 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b53c0e13-bb5e-41ef-b5f3-754a02b284be-console-config\") pod \"console-8657cd4df-krc5j\" (UID: \"b53c0e13-bb5e-41ef-b5f3-754a02b284be\") " pod="openshift-console/console-8657cd4df-krc5j" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.686333 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b53c0e13-bb5e-41ef-b5f3-754a02b284be-trusted-ca-bundle\") pod \"console-8657cd4df-krc5j\" (UID: \"b53c0e13-bb5e-41ef-b5f3-754a02b284be\") " pod="openshift-console/console-8657cd4df-krc5j" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.691324 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/b53c0e13-bb5e-41ef-b5f3-754a02b284be-console-oauth-config\") pod \"console-8657cd4df-krc5j\" (UID: \"b53c0e13-bb5e-41ef-b5f3-754a02b284be\") " pod="openshift-console/console-8657cd4df-krc5j" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.691884 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b53c0e13-bb5e-41ef-b5f3-754a02b284be-console-serving-cert\") pod \"console-8657cd4df-krc5j\" (UID: \"b53c0e13-bb5e-41ef-b5f3-754a02b284be\") " pod="openshift-console/console-8657cd4df-krc5j" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.701538 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnmxk\" (UniqueName: \"kubernetes.io/projected/b53c0e13-bb5e-41ef-b5f3-754a02b284be-kube-api-access-nnmxk\") pod \"console-8657cd4df-krc5j\" (UID: \"b53c0e13-bb5e-41ef-b5f3-754a02b284be\") " pod="openshift-console/console-8657cd4df-krc5j" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.726736 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-wgq7f"] Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.741992 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-xwghc" event={"ID":"feb926a1-9332-41f0-80b7-b100b62f8664","Type":"ContainerStarted","Data":"ec5f50ac2ef34bd3a6558317c2d0d1b7478d76b0b9e6b6f350ca62b50de81c7b"} Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.743203 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-wgq7f" event={"ID":"07cbbe5f-3176-4cfa-97e0-a7b3e6613c7b","Type":"ContainerStarted","Data":"408377ec465747a2e7ddef64a029ff4c29b0f19476df007e728095925e749139"} Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.804216 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8657cd4df-krc5j" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.887125 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/036064c5-e3a3-49a7-b457-5e64df820401-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-m6lsq\" (UID: \"036064c5-e3a3-49a7-b457-5e64df820401\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-m6lsq" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.891170 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/036064c5-e3a3-49a7-b457-5e64df820401-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-m6lsq\" (UID: \"036064c5-e3a3-49a7-b457-5e64df820401\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-m6lsq" Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.965297 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8657cd4df-krc5j"] Oct 09 15:29:40 crc kubenswrapper[4719]: W1009 15:29:40.970373 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb53c0e13_bb5e_41ef_b5f3_754a02b284be.slice/crio-c94cba640f6b16d5589ea8758da468c903e980efe447c59fb01624236d6963fe WatchSource:0}: Error finding container c94cba640f6b16d5589ea8758da468c903e980efe447c59fb01624236d6963fe: Status 404 returned error can't find the container with id c94cba640f6b16d5589ea8758da468c903e980efe447c59fb01624236d6963fe Oct 09 15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.988856 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/db938ad9-d041-4874-855a-83d6fa385b3e-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-jz9ct\" (UID: \"db938ad9-d041-4874-855a-83d6fa385b3e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-jz9ct" Oct 09 
15:29:40 crc kubenswrapper[4719]: I1009 15:29:40.992696 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/db938ad9-d041-4874-855a-83d6fa385b3e-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-jz9ct\" (UID: \"db938ad9-d041-4874-855a-83d6fa385b3e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-jz9ct" Oct 09 15:29:41 crc kubenswrapper[4719]: I1009 15:29:41.121612 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-m6lsq" Oct 09 15:29:41 crc kubenswrapper[4719]: I1009 15:29:41.195177 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-jz9ct" Oct 09 15:29:41 crc kubenswrapper[4719]: I1009 15:29:41.295484 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-m6lsq"] Oct 09 15:29:41 crc kubenswrapper[4719]: W1009 15:29:41.303205 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod036064c5_e3a3_49a7_b457_5e64df820401.slice/crio-c2cd7f648617e8479364d75688969ff77d100fe907309f57eb147f43feb09c06 WatchSource:0}: Error finding container c2cd7f648617e8479364d75688969ff77d100fe907309f57eb147f43feb09c06: Status 404 returned error can't find the container with id c2cd7f648617e8479364d75688969ff77d100fe907309f57eb147f43feb09c06 Oct 09 15:29:41 crc kubenswrapper[4719]: I1009 15:29:41.583317 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-jz9ct"] Oct 09 15:29:41 crc kubenswrapper[4719]: I1009 15:29:41.753125 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-m6lsq" 
event={"ID":"036064c5-e3a3-49a7-b457-5e64df820401","Type":"ContainerStarted","Data":"c2cd7f648617e8479364d75688969ff77d100fe907309f57eb147f43feb09c06"} Oct 09 15:29:41 crc kubenswrapper[4719]: I1009 15:29:41.754747 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8657cd4df-krc5j" event={"ID":"b53c0e13-bb5e-41ef-b5f3-754a02b284be","Type":"ContainerStarted","Data":"c30fca7a722fd71a574d2126fa2c24617557364dddfc52a1918306249f002b05"} Oct 09 15:29:41 crc kubenswrapper[4719]: I1009 15:29:41.754776 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8657cd4df-krc5j" event={"ID":"b53c0e13-bb5e-41ef-b5f3-754a02b284be","Type":"ContainerStarted","Data":"c94cba640f6b16d5589ea8758da468c903e980efe447c59fb01624236d6963fe"} Oct 09 15:29:41 crc kubenswrapper[4719]: I1009 15:29:41.759517 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-jz9ct" event={"ID":"db938ad9-d041-4874-855a-83d6fa385b3e","Type":"ContainerStarted","Data":"898019fd4a9511d5df98538220ff09c664fda77f1f5ddee70c979f144ce4a398"} Oct 09 15:29:41 crc kubenswrapper[4719]: I1009 15:29:41.776888 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-8657cd4df-krc5j" podStartSLOduration=1.7768647309999999 podStartE2EDuration="1.776864731s" podCreationTimestamp="2025-10-09 15:29:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:29:41.772826543 +0000 UTC m=+687.282537848" watchObservedRunningTime="2025-10-09 15:29:41.776864731 +0000 UTC m=+687.286576016" Oct 09 15:29:43 crc kubenswrapper[4719]: I1009 15:29:43.774187 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-xwghc" 
event={"ID":"feb926a1-9332-41f0-80b7-b100b62f8664","Type":"ContainerStarted","Data":"410ae5a559a6bf032badb83cefe9ed8de6c7a6511df4eb94568bcba0d874d939"} Oct 09 15:29:43 crc kubenswrapper[4719]: I1009 15:29:43.776271 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-xwghc" Oct 09 15:29:43 crc kubenswrapper[4719]: I1009 15:29:43.782444 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-wgq7f" event={"ID":"07cbbe5f-3176-4cfa-97e0-a7b3e6613c7b","Type":"ContainerStarted","Data":"a42ab2d56d740c69fb444d0132ab4819d72f314aa36ffb09187f2e9adeba40fc"} Oct 09 15:29:43 crc kubenswrapper[4719]: I1009 15:29:43.783883 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-m6lsq" event={"ID":"036064c5-e3a3-49a7-b457-5e64df820401","Type":"ContainerStarted","Data":"78c88061cd5f013603e6295bcaf66319e62f3d8e49a331eb4fcfd6aa7c0fdd2b"} Oct 09 15:29:43 crc kubenswrapper[4719]: I1009 15:29:43.784441 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-m6lsq" Oct 09 15:29:43 crc kubenswrapper[4719]: I1009 15:29:43.790486 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-xwghc" podStartSLOduration=1.394349372 podStartE2EDuration="3.790465078s" podCreationTimestamp="2025-10-09 15:29:40 +0000 UTC" firstStartedPulling="2025-10-09 15:29:40.576290248 +0000 UTC m=+686.086001533" lastFinishedPulling="2025-10-09 15:29:42.972405954 +0000 UTC m=+688.482117239" observedRunningTime="2025-10-09 15:29:43.786664316 +0000 UTC m=+689.296375601" watchObservedRunningTime="2025-10-09 15:29:43.790465078 +0000 UTC m=+689.300176363" Oct 09 15:29:43 crc kubenswrapper[4719]: I1009 15:29:43.812432 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-m6lsq" 
podStartSLOduration=2.133323637 podStartE2EDuration="3.812413869s" podCreationTimestamp="2025-10-09 15:29:40 +0000 UTC" firstStartedPulling="2025-10-09 15:29:41.305012476 +0000 UTC m=+686.814723751" lastFinishedPulling="2025-10-09 15:29:42.984102698 +0000 UTC m=+688.493813983" observedRunningTime="2025-10-09 15:29:43.807131901 +0000 UTC m=+689.316843206" watchObservedRunningTime="2025-10-09 15:29:43.812413869 +0000 UTC m=+689.322125154" Oct 09 15:29:44 crc kubenswrapper[4719]: I1009 15:29:44.792667 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-jz9ct" event={"ID":"db938ad9-d041-4874-855a-83d6fa385b3e","Type":"ContainerStarted","Data":"a1e75954649d63037a8d38ceb76f5d1a5da7f042515172a985d3e3ee3c3a2fc3"} Oct 09 15:29:44 crc kubenswrapper[4719]: I1009 15:29:44.808162 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-jz9ct" podStartSLOduration=2.180134534 podStartE2EDuration="4.808144164s" podCreationTimestamp="2025-10-09 15:29:40 +0000 UTC" firstStartedPulling="2025-10-09 15:29:41.594387857 +0000 UTC m=+687.104099142" lastFinishedPulling="2025-10-09 15:29:44.222397487 +0000 UTC m=+689.732108772" observedRunningTime="2025-10-09 15:29:44.804746645 +0000 UTC m=+690.314457940" watchObservedRunningTime="2025-10-09 15:29:44.808144164 +0000 UTC m=+690.317855449" Oct 09 15:29:46 crc kubenswrapper[4719]: I1009 15:29:46.804006 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-wgq7f" event={"ID":"07cbbe5f-3176-4cfa-97e0-a7b3e6613c7b","Type":"ContainerStarted","Data":"c27340068a76cadb70c2fe3533c1165073c8909a98068312d5932351304cd4eb"} Oct 09 15:29:46 crc kubenswrapper[4719]: I1009 15:29:46.821072 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-wgq7f" podStartSLOduration=1.500845627 
podStartE2EDuration="6.821045888s" podCreationTimestamp="2025-10-09 15:29:40 +0000 UTC" firstStartedPulling="2025-10-09 15:29:40.732073578 +0000 UTC m=+686.241784863" lastFinishedPulling="2025-10-09 15:29:46.052273839 +0000 UTC m=+691.561985124" observedRunningTime="2025-10-09 15:29:46.817816475 +0000 UTC m=+692.327527770" watchObservedRunningTime="2025-10-09 15:29:46.821045888 +0000 UTC m=+692.330757233" Oct 09 15:29:50 crc kubenswrapper[4719]: I1009 15:29:50.563214 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-xwghc" Oct 09 15:29:50 crc kubenswrapper[4719]: I1009 15:29:50.805345 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-8657cd4df-krc5j" Oct 09 15:29:50 crc kubenswrapper[4719]: I1009 15:29:50.805404 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-8657cd4df-krc5j" Oct 09 15:29:50 crc kubenswrapper[4719]: I1009 15:29:50.811148 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-8657cd4df-krc5j" Oct 09 15:29:50 crc kubenswrapper[4719]: I1009 15:29:50.828615 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-8657cd4df-krc5j" Oct 09 15:29:50 crc kubenswrapper[4719]: I1009 15:29:50.892959 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-j74ct"] Oct 09 15:30:00 crc kubenswrapper[4719]: I1009 15:30:00.140223 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333730-5vlr5"] Oct 09 15:30:00 crc kubenswrapper[4719]: I1009 15:30:00.142399 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333730-5vlr5" Oct 09 15:30:00 crc kubenswrapper[4719]: I1009 15:30:00.144649 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 09 15:30:00 crc kubenswrapper[4719]: I1009 15:30:00.147737 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 09 15:30:00 crc kubenswrapper[4719]: I1009 15:30:00.149279 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333730-5vlr5"] Oct 09 15:30:00 crc kubenswrapper[4719]: I1009 15:30:00.248188 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8fd57e39-ad27-4ca6-89c0-01c9278f6c86-secret-volume\") pod \"collect-profiles-29333730-5vlr5\" (UID: \"8fd57e39-ad27-4ca6-89c0-01c9278f6c86\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333730-5vlr5" Oct 09 15:30:00 crc kubenswrapper[4719]: I1009 15:30:00.248245 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8fd57e39-ad27-4ca6-89c0-01c9278f6c86-config-volume\") pod \"collect-profiles-29333730-5vlr5\" (UID: \"8fd57e39-ad27-4ca6-89c0-01c9278f6c86\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333730-5vlr5" Oct 09 15:30:00 crc kubenswrapper[4719]: I1009 15:30:00.248326 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ttj5\" (UniqueName: \"kubernetes.io/projected/8fd57e39-ad27-4ca6-89c0-01c9278f6c86-kube-api-access-6ttj5\") pod \"collect-profiles-29333730-5vlr5\" (UID: \"8fd57e39-ad27-4ca6-89c0-01c9278f6c86\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29333730-5vlr5" Oct 09 15:30:00 crc kubenswrapper[4719]: I1009 15:30:00.349689 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8fd57e39-ad27-4ca6-89c0-01c9278f6c86-secret-volume\") pod \"collect-profiles-29333730-5vlr5\" (UID: \"8fd57e39-ad27-4ca6-89c0-01c9278f6c86\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333730-5vlr5" Oct 09 15:30:00 crc kubenswrapper[4719]: I1009 15:30:00.350029 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8fd57e39-ad27-4ca6-89c0-01c9278f6c86-config-volume\") pod \"collect-profiles-29333730-5vlr5\" (UID: \"8fd57e39-ad27-4ca6-89c0-01c9278f6c86\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333730-5vlr5" Oct 09 15:30:00 crc kubenswrapper[4719]: I1009 15:30:00.350077 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ttj5\" (UniqueName: \"kubernetes.io/projected/8fd57e39-ad27-4ca6-89c0-01c9278f6c86-kube-api-access-6ttj5\") pod \"collect-profiles-29333730-5vlr5\" (UID: \"8fd57e39-ad27-4ca6-89c0-01c9278f6c86\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333730-5vlr5" Oct 09 15:30:00 crc kubenswrapper[4719]: I1009 15:30:00.351212 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8fd57e39-ad27-4ca6-89c0-01c9278f6c86-config-volume\") pod \"collect-profiles-29333730-5vlr5\" (UID: \"8fd57e39-ad27-4ca6-89c0-01c9278f6c86\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333730-5vlr5" Oct 09 15:30:00 crc kubenswrapper[4719]: I1009 15:30:00.355370 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/8fd57e39-ad27-4ca6-89c0-01c9278f6c86-secret-volume\") pod \"collect-profiles-29333730-5vlr5\" (UID: \"8fd57e39-ad27-4ca6-89c0-01c9278f6c86\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333730-5vlr5" Oct 09 15:30:00 crc kubenswrapper[4719]: I1009 15:30:00.365869 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ttj5\" (UniqueName: \"kubernetes.io/projected/8fd57e39-ad27-4ca6-89c0-01c9278f6c86-kube-api-access-6ttj5\") pod \"collect-profiles-29333730-5vlr5\" (UID: \"8fd57e39-ad27-4ca6-89c0-01c9278f6c86\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333730-5vlr5" Oct 09 15:30:00 crc kubenswrapper[4719]: I1009 15:30:00.461282 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333730-5vlr5" Oct 09 15:30:00 crc kubenswrapper[4719]: I1009 15:30:00.860241 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333730-5vlr5"] Oct 09 15:30:00 crc kubenswrapper[4719]: W1009 15:30:00.870018 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fd57e39_ad27_4ca6_89c0_01c9278f6c86.slice/crio-c4b7c73e9cc46b3f895e607ec1db48a21316f8ea1a0aae3ed33aaf7ed56dddb0 WatchSource:0}: Error finding container c4b7c73e9cc46b3f895e607ec1db48a21316f8ea1a0aae3ed33aaf7ed56dddb0: Status 404 returned error can't find the container with id c4b7c73e9cc46b3f895e607ec1db48a21316f8ea1a0aae3ed33aaf7ed56dddb0 Oct 09 15:30:00 crc kubenswrapper[4719]: I1009 15:30:00.883313 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333730-5vlr5" event={"ID":"8fd57e39-ad27-4ca6-89c0-01c9278f6c86","Type":"ContainerStarted","Data":"c4b7c73e9cc46b3f895e607ec1db48a21316f8ea1a0aae3ed33aaf7ed56dddb0"} Oct 09 15:30:01 crc 
kubenswrapper[4719]: I1009 15:30:01.127936 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-m6lsq" Oct 09 15:30:01 crc kubenswrapper[4719]: I1009 15:30:01.890434 4719 generic.go:334] "Generic (PLEG): container finished" podID="8fd57e39-ad27-4ca6-89c0-01c9278f6c86" containerID="e55d433f1889fccde78745ae4d13a88764e8e2361557eb5a7a815e3990b280a2" exitCode=0 Oct 09 15:30:01 crc kubenswrapper[4719]: I1009 15:30:01.890544 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333730-5vlr5" event={"ID":"8fd57e39-ad27-4ca6-89c0-01c9278f6c86","Type":"ContainerDied","Data":"e55d433f1889fccde78745ae4d13a88764e8e2361557eb5a7a815e3990b280a2"} Oct 09 15:30:03 crc kubenswrapper[4719]: I1009 15:30:03.108414 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333730-5vlr5" Oct 09 15:30:03 crc kubenswrapper[4719]: I1009 15:30:03.287680 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ttj5\" (UniqueName: \"kubernetes.io/projected/8fd57e39-ad27-4ca6-89c0-01c9278f6c86-kube-api-access-6ttj5\") pod \"8fd57e39-ad27-4ca6-89c0-01c9278f6c86\" (UID: \"8fd57e39-ad27-4ca6-89c0-01c9278f6c86\") " Oct 09 15:30:03 crc kubenswrapper[4719]: I1009 15:30:03.287830 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8fd57e39-ad27-4ca6-89c0-01c9278f6c86-secret-volume\") pod \"8fd57e39-ad27-4ca6-89c0-01c9278f6c86\" (UID: \"8fd57e39-ad27-4ca6-89c0-01c9278f6c86\") " Oct 09 15:30:03 crc kubenswrapper[4719]: I1009 15:30:03.288011 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8fd57e39-ad27-4ca6-89c0-01c9278f6c86-config-volume\") pod 
\"8fd57e39-ad27-4ca6-89c0-01c9278f6c86\" (UID: \"8fd57e39-ad27-4ca6-89c0-01c9278f6c86\") " Oct 09 15:30:03 crc kubenswrapper[4719]: I1009 15:30:03.288769 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fd57e39-ad27-4ca6-89c0-01c9278f6c86-config-volume" (OuterVolumeSpecName: "config-volume") pod "8fd57e39-ad27-4ca6-89c0-01c9278f6c86" (UID: "8fd57e39-ad27-4ca6-89c0-01c9278f6c86"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:30:03 crc kubenswrapper[4719]: I1009 15:30:03.293014 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fd57e39-ad27-4ca6-89c0-01c9278f6c86-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8fd57e39-ad27-4ca6-89c0-01c9278f6c86" (UID: "8fd57e39-ad27-4ca6-89c0-01c9278f6c86"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:30:03 crc kubenswrapper[4719]: I1009 15:30:03.293132 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fd57e39-ad27-4ca6-89c0-01c9278f6c86-kube-api-access-6ttj5" (OuterVolumeSpecName: "kube-api-access-6ttj5") pod "8fd57e39-ad27-4ca6-89c0-01c9278f6c86" (UID: "8fd57e39-ad27-4ca6-89c0-01c9278f6c86"). InnerVolumeSpecName "kube-api-access-6ttj5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:30:03 crc kubenswrapper[4719]: I1009 15:30:03.389657 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ttj5\" (UniqueName: \"kubernetes.io/projected/8fd57e39-ad27-4ca6-89c0-01c9278f6c86-kube-api-access-6ttj5\") on node \"crc\" DevicePath \"\"" Oct 09 15:30:03 crc kubenswrapper[4719]: I1009 15:30:03.389693 4719 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8fd57e39-ad27-4ca6-89c0-01c9278f6c86-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 09 15:30:03 crc kubenswrapper[4719]: I1009 15:30:03.389778 4719 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8fd57e39-ad27-4ca6-89c0-01c9278f6c86-config-volume\") on node \"crc\" DevicePath \"\"" Oct 09 15:30:03 crc kubenswrapper[4719]: I1009 15:30:03.901833 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333730-5vlr5" event={"ID":"8fd57e39-ad27-4ca6-89c0-01c9278f6c86","Type":"ContainerDied","Data":"c4b7c73e9cc46b3f895e607ec1db48a21316f8ea1a0aae3ed33aaf7ed56dddb0"} Oct 09 15:30:03 crc kubenswrapper[4719]: I1009 15:30:03.901875 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4b7c73e9cc46b3f895e607ec1db48a21316f8ea1a0aae3ed33aaf7ed56dddb0" Oct 09 15:30:03 crc kubenswrapper[4719]: I1009 15:30:03.901889 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333730-5vlr5" Oct 09 15:30:15 crc kubenswrapper[4719]: I1009 15:30:15.944649 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46"] Oct 09 15:30:15 crc kubenswrapper[4719]: E1009 15:30:15.945516 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fd57e39-ad27-4ca6-89c0-01c9278f6c86" containerName="collect-profiles" Oct 09 15:30:15 crc kubenswrapper[4719]: I1009 15:30:15.945533 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fd57e39-ad27-4ca6-89c0-01c9278f6c86" containerName="collect-profiles" Oct 09 15:30:15 crc kubenswrapper[4719]: I1009 15:30:15.945669 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fd57e39-ad27-4ca6-89c0-01c9278f6c86" containerName="collect-profiles" Oct 09 15:30:15 crc kubenswrapper[4719]: I1009 15:30:15.946723 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46" Oct 09 15:30:15 crc kubenswrapper[4719]: I1009 15:30:15.949853 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 09 15:30:15 crc kubenswrapper[4719]: I1009 15:30:15.950819 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-j74ct" podUID="c895d97a-7287-49a8-9ac5-bc87e8bcf297" containerName="console" containerID="cri-o://51924a57cc9932369b7f474ecdb0e3d189eaaa35d8c93809d09ba0a50706fc03" gracePeriod=15 Oct 09 15:30:15 crc kubenswrapper[4719]: I1009 15:30:15.965213 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46"] Oct 09 15:30:15 crc kubenswrapper[4719]: I1009 15:30:15.967996 4719 patch_prober.go:28] interesting pod/console-f9d7485db-j74ct container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/health\": read tcp 10.217.0.2:35930->10.217.0.9:8443: read: connection reset by peer" start-of-body= Oct 09 15:30:15 crc kubenswrapper[4719]: I1009 15:30:15.968059 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-f9d7485db-j74ct" podUID="c895d97a-7287-49a8-9ac5-bc87e8bcf297" containerName="console" probeResult="failure" output="Get \"https://10.217.0.9:8443/health\": read tcp 10.217.0.2:35930->10.217.0.9:8443: read: connection reset by peer" Oct 09 15:30:16 crc kubenswrapper[4719]: I1009 15:30:16.061292 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b460c47-e24c-46c7-bf23-0e5b5d6819bd-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46\" (UID: \"4b460c47-e24c-46c7-bf23-0e5b5d6819bd\") " 
pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46" Oct 09 15:30:16 crc kubenswrapper[4719]: I1009 15:30:16.061360 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b460c47-e24c-46c7-bf23-0e5b5d6819bd-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46\" (UID: \"4b460c47-e24c-46c7-bf23-0e5b5d6819bd\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46" Oct 09 15:30:16 crc kubenswrapper[4719]: I1009 15:30:16.061417 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrzt7\" (UniqueName: \"kubernetes.io/projected/4b460c47-e24c-46c7-bf23-0e5b5d6819bd-kube-api-access-nrzt7\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46\" (UID: \"4b460c47-e24c-46c7-bf23-0e5b5d6819bd\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46" Oct 09 15:30:16 crc kubenswrapper[4719]: I1009 15:30:16.162268 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b460c47-e24c-46c7-bf23-0e5b5d6819bd-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46\" (UID: \"4b460c47-e24c-46c7-bf23-0e5b5d6819bd\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46" Oct 09 15:30:16 crc kubenswrapper[4719]: I1009 15:30:16.162414 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrzt7\" (UniqueName: \"kubernetes.io/projected/4b460c47-e24c-46c7-bf23-0e5b5d6819bd-kube-api-access-nrzt7\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46\" (UID: \"4b460c47-e24c-46c7-bf23-0e5b5d6819bd\") " 
pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46" Oct 09 15:30:16 crc kubenswrapper[4719]: I1009 15:30:16.162507 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b460c47-e24c-46c7-bf23-0e5b5d6819bd-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46\" (UID: \"4b460c47-e24c-46c7-bf23-0e5b5d6819bd\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46" Oct 09 15:30:16 crc kubenswrapper[4719]: I1009 15:30:16.162745 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b460c47-e24c-46c7-bf23-0e5b5d6819bd-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46\" (UID: \"4b460c47-e24c-46c7-bf23-0e5b5d6819bd\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46" Oct 09 15:30:16 crc kubenswrapper[4719]: I1009 15:30:16.162837 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b460c47-e24c-46c7-bf23-0e5b5d6819bd-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46\" (UID: \"4b460c47-e24c-46c7-bf23-0e5b5d6819bd\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46" Oct 09 15:30:16 crc kubenswrapper[4719]: I1009 15:30:16.200308 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrzt7\" (UniqueName: \"kubernetes.io/projected/4b460c47-e24c-46c7-bf23-0e5b5d6819bd-kube-api-access-nrzt7\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46\" (UID: \"4b460c47-e24c-46c7-bf23-0e5b5d6819bd\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46" Oct 09 15:30:16 crc kubenswrapper[4719]: I1009 15:30:16.261643 4719 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46" Oct 09 15:30:16 crc kubenswrapper[4719]: I1009 15:30:16.356480 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-j74ct_c895d97a-7287-49a8-9ac5-bc87e8bcf297/console/0.log" Oct 09 15:30:16 crc kubenswrapper[4719]: I1009 15:30:16.356893 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-j74ct" Oct 09 15:30:16 crc kubenswrapper[4719]: I1009 15:30:16.468003 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c895d97a-7287-49a8-9ac5-bc87e8bcf297-oauth-serving-cert\") pod \"c895d97a-7287-49a8-9ac5-bc87e8bcf297\" (UID: \"c895d97a-7287-49a8-9ac5-bc87e8bcf297\") " Oct 09 15:30:16 crc kubenswrapper[4719]: I1009 15:30:16.468057 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c895d97a-7287-49a8-9ac5-bc87e8bcf297-console-config\") pod \"c895d97a-7287-49a8-9ac5-bc87e8bcf297\" (UID: \"c895d97a-7287-49a8-9ac5-bc87e8bcf297\") " Oct 09 15:30:16 crc kubenswrapper[4719]: I1009 15:30:16.468081 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c895d97a-7287-49a8-9ac5-bc87e8bcf297-console-serving-cert\") pod \"c895d97a-7287-49a8-9ac5-bc87e8bcf297\" (UID: \"c895d97a-7287-49a8-9ac5-bc87e8bcf297\") " Oct 09 15:30:16 crc kubenswrapper[4719]: I1009 15:30:16.468117 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c895d97a-7287-49a8-9ac5-bc87e8bcf297-trusted-ca-bundle\") pod \"c895d97a-7287-49a8-9ac5-bc87e8bcf297\" (UID: \"c895d97a-7287-49a8-9ac5-bc87e8bcf297\") " 
Oct 09 15:30:16 crc kubenswrapper[4719]: I1009 15:30:16.468190 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c895d97a-7287-49a8-9ac5-bc87e8bcf297-console-oauth-config\") pod \"c895d97a-7287-49a8-9ac5-bc87e8bcf297\" (UID: \"c895d97a-7287-49a8-9ac5-bc87e8bcf297\") " Oct 09 15:30:16 crc kubenswrapper[4719]: I1009 15:30:16.468221 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lpbj\" (UniqueName: \"kubernetes.io/projected/c895d97a-7287-49a8-9ac5-bc87e8bcf297-kube-api-access-4lpbj\") pod \"c895d97a-7287-49a8-9ac5-bc87e8bcf297\" (UID: \"c895d97a-7287-49a8-9ac5-bc87e8bcf297\") " Oct 09 15:30:16 crc kubenswrapper[4719]: I1009 15:30:16.468259 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c895d97a-7287-49a8-9ac5-bc87e8bcf297-service-ca\") pod \"c895d97a-7287-49a8-9ac5-bc87e8bcf297\" (UID: \"c895d97a-7287-49a8-9ac5-bc87e8bcf297\") " Oct 09 15:30:16 crc kubenswrapper[4719]: I1009 15:30:16.470028 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c895d97a-7287-49a8-9ac5-bc87e8bcf297-service-ca" (OuterVolumeSpecName: "service-ca") pod "c895d97a-7287-49a8-9ac5-bc87e8bcf297" (UID: "c895d97a-7287-49a8-9ac5-bc87e8bcf297"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:30:16 crc kubenswrapper[4719]: I1009 15:30:16.470608 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c895d97a-7287-49a8-9ac5-bc87e8bcf297-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c895d97a-7287-49a8-9ac5-bc87e8bcf297" (UID: "c895d97a-7287-49a8-9ac5-bc87e8bcf297"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:30:16 crc kubenswrapper[4719]: I1009 15:30:16.471064 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c895d97a-7287-49a8-9ac5-bc87e8bcf297-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c895d97a-7287-49a8-9ac5-bc87e8bcf297" (UID: "c895d97a-7287-49a8-9ac5-bc87e8bcf297"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:30:16 crc kubenswrapper[4719]: I1009 15:30:16.474866 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c895d97a-7287-49a8-9ac5-bc87e8bcf297-console-config" (OuterVolumeSpecName: "console-config") pod "c895d97a-7287-49a8-9ac5-bc87e8bcf297" (UID: "c895d97a-7287-49a8-9ac5-bc87e8bcf297"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:30:16 crc kubenswrapper[4719]: I1009 15:30:16.474984 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c895d97a-7287-49a8-9ac5-bc87e8bcf297-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c895d97a-7287-49a8-9ac5-bc87e8bcf297" (UID: "c895d97a-7287-49a8-9ac5-bc87e8bcf297"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:30:16 crc kubenswrapper[4719]: I1009 15:30:16.475788 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c895d97a-7287-49a8-9ac5-bc87e8bcf297-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c895d97a-7287-49a8-9ac5-bc87e8bcf297" (UID: "c895d97a-7287-49a8-9ac5-bc87e8bcf297"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:30:16 crc kubenswrapper[4719]: I1009 15:30:16.475885 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46"] Oct 09 15:30:16 crc kubenswrapper[4719]: I1009 15:30:16.477474 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c895d97a-7287-49a8-9ac5-bc87e8bcf297-kube-api-access-4lpbj" (OuterVolumeSpecName: "kube-api-access-4lpbj") pod "c895d97a-7287-49a8-9ac5-bc87e8bcf297" (UID: "c895d97a-7287-49a8-9ac5-bc87e8bcf297"). InnerVolumeSpecName "kube-api-access-4lpbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:30:16 crc kubenswrapper[4719]: I1009 15:30:16.569212 4719 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c895d97a-7287-49a8-9ac5-bc87e8bcf297-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:30:16 crc kubenswrapper[4719]: I1009 15:30:16.569240 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lpbj\" (UniqueName: \"kubernetes.io/projected/c895d97a-7287-49a8-9ac5-bc87e8bcf297-kube-api-access-4lpbj\") on node \"crc\" DevicePath \"\"" Oct 09 15:30:16 crc kubenswrapper[4719]: I1009 15:30:16.569249 4719 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c895d97a-7287-49a8-9ac5-bc87e8bcf297-service-ca\") on node \"crc\" DevicePath \"\"" Oct 09 15:30:16 crc kubenswrapper[4719]: I1009 15:30:16.569258 4719 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c895d97a-7287-49a8-9ac5-bc87e8bcf297-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 15:30:16 crc kubenswrapper[4719]: I1009 15:30:16.569266 4719 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/c895d97a-7287-49a8-9ac5-bc87e8bcf297-console-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:30:16 crc kubenswrapper[4719]: I1009 15:30:16.569273 4719 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c895d97a-7287-49a8-9ac5-bc87e8bcf297-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 15:30:16 crc kubenswrapper[4719]: I1009 15:30:16.569281 4719 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c895d97a-7287-49a8-9ac5-bc87e8bcf297-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:30:17 crc kubenswrapper[4719]: I1009 15:30:17.001447 4719 generic.go:334] "Generic (PLEG): container finished" podID="4b460c47-e24c-46c7-bf23-0e5b5d6819bd" containerID="5d8160d81a2242ee797859d8dc3bd250edc1c318dd57bb5b4fbf31d30a3e9f83" exitCode=0 Oct 09 15:30:17 crc kubenswrapper[4719]: I1009 15:30:17.001629 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46" event={"ID":"4b460c47-e24c-46c7-bf23-0e5b5d6819bd","Type":"ContainerDied","Data":"5d8160d81a2242ee797859d8dc3bd250edc1c318dd57bb5b4fbf31d30a3e9f83"} Oct 09 15:30:17 crc kubenswrapper[4719]: I1009 15:30:17.001846 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46" event={"ID":"4b460c47-e24c-46c7-bf23-0e5b5d6819bd","Type":"ContainerStarted","Data":"581c7ab9a9bb878539730b4843ffeaf95b18ccf9b45181270851c2eb8e6a13f2"} Oct 09 15:30:17 crc kubenswrapper[4719]: I1009 15:30:17.004198 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-j74ct_c895d97a-7287-49a8-9ac5-bc87e8bcf297/console/0.log" Oct 09 15:30:17 crc kubenswrapper[4719]: I1009 15:30:17.004238 4719 generic.go:334] "Generic (PLEG): container finished" 
podID="c895d97a-7287-49a8-9ac5-bc87e8bcf297" containerID="51924a57cc9932369b7f474ecdb0e3d189eaaa35d8c93809d09ba0a50706fc03" exitCode=2 Oct 09 15:30:17 crc kubenswrapper[4719]: I1009 15:30:17.004259 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-j74ct" event={"ID":"c895d97a-7287-49a8-9ac5-bc87e8bcf297","Type":"ContainerDied","Data":"51924a57cc9932369b7f474ecdb0e3d189eaaa35d8c93809d09ba0a50706fc03"} Oct 09 15:30:17 crc kubenswrapper[4719]: I1009 15:30:17.004277 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-j74ct" event={"ID":"c895d97a-7287-49a8-9ac5-bc87e8bcf297","Type":"ContainerDied","Data":"4cfe9df5f6dbdd6ffea299620b878b3dd4a1e62cb037248a1a19f5e51acb1db0"} Oct 09 15:30:17 crc kubenswrapper[4719]: I1009 15:30:17.004297 4719 scope.go:117] "RemoveContainer" containerID="51924a57cc9932369b7f474ecdb0e3d189eaaa35d8c93809d09ba0a50706fc03" Oct 09 15:30:17 crc kubenswrapper[4719]: I1009 15:30:17.004436 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-j74ct" Oct 09 15:30:17 crc kubenswrapper[4719]: I1009 15:30:17.037717 4719 scope.go:117] "RemoveContainer" containerID="51924a57cc9932369b7f474ecdb0e3d189eaaa35d8c93809d09ba0a50706fc03" Oct 09 15:30:17 crc kubenswrapper[4719]: I1009 15:30:17.042329 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-j74ct"] Oct 09 15:30:17 crc kubenswrapper[4719]: E1009 15:30:17.045065 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51924a57cc9932369b7f474ecdb0e3d189eaaa35d8c93809d09ba0a50706fc03\": container with ID starting with 51924a57cc9932369b7f474ecdb0e3d189eaaa35d8c93809d09ba0a50706fc03 not found: ID does not exist" containerID="51924a57cc9932369b7f474ecdb0e3d189eaaa35d8c93809d09ba0a50706fc03" Oct 09 15:30:17 crc kubenswrapper[4719]: I1009 15:30:17.045126 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51924a57cc9932369b7f474ecdb0e3d189eaaa35d8c93809d09ba0a50706fc03"} err="failed to get container status \"51924a57cc9932369b7f474ecdb0e3d189eaaa35d8c93809d09ba0a50706fc03\": rpc error: code = NotFound desc = could not find container \"51924a57cc9932369b7f474ecdb0e3d189eaaa35d8c93809d09ba0a50706fc03\": container with ID starting with 51924a57cc9932369b7f474ecdb0e3d189eaaa35d8c93809d09ba0a50706fc03 not found: ID does not exist" Oct 09 15:30:17 crc kubenswrapper[4719]: I1009 15:30:17.046206 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-j74ct"] Oct 09 15:30:17 crc kubenswrapper[4719]: I1009 15:30:17.171282 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c895d97a-7287-49a8-9ac5-bc87e8bcf297" path="/var/lib/kubelet/pods/c895d97a-7287-49a8-9ac5-bc87e8bcf297/volumes" Oct 09 15:30:19 crc kubenswrapper[4719]: I1009 15:30:19.016526 4719 generic.go:334] "Generic (PLEG): 
container finished" podID="4b460c47-e24c-46c7-bf23-0e5b5d6819bd" containerID="5674774a3991f282f7c32c2333d1f9ad608c39e9598c92a0bd7290813166a2b7" exitCode=0 Oct 09 15:30:19 crc kubenswrapper[4719]: I1009 15:30:19.016567 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46" event={"ID":"4b460c47-e24c-46c7-bf23-0e5b5d6819bd","Type":"ContainerDied","Data":"5674774a3991f282f7c32c2333d1f9ad608c39e9598c92a0bd7290813166a2b7"} Oct 09 15:30:20 crc kubenswrapper[4719]: I1009 15:30:20.025860 4719 generic.go:334] "Generic (PLEG): container finished" podID="4b460c47-e24c-46c7-bf23-0e5b5d6819bd" containerID="5e4e4b6d85c546c03ea4fa02dd8a9f5a713293d5b9c39c3457a69a00fca3f856" exitCode=0 Oct 09 15:30:20 crc kubenswrapper[4719]: I1009 15:30:20.025914 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46" event={"ID":"4b460c47-e24c-46c7-bf23-0e5b5d6819bd","Type":"ContainerDied","Data":"5e4e4b6d85c546c03ea4fa02dd8a9f5a713293d5b9c39c3457a69a00fca3f856"} Oct 09 15:30:21 crc kubenswrapper[4719]: I1009 15:30:21.234980 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46" Oct 09 15:30:21 crc kubenswrapper[4719]: I1009 15:30:21.430799 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b460c47-e24c-46c7-bf23-0e5b5d6819bd-bundle\") pod \"4b460c47-e24c-46c7-bf23-0e5b5d6819bd\" (UID: \"4b460c47-e24c-46c7-bf23-0e5b5d6819bd\") " Oct 09 15:30:21 crc kubenswrapper[4719]: I1009 15:30:21.430946 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrzt7\" (UniqueName: \"kubernetes.io/projected/4b460c47-e24c-46c7-bf23-0e5b5d6819bd-kube-api-access-nrzt7\") pod \"4b460c47-e24c-46c7-bf23-0e5b5d6819bd\" (UID: \"4b460c47-e24c-46c7-bf23-0e5b5d6819bd\") " Oct 09 15:30:21 crc kubenswrapper[4719]: I1009 15:30:21.431010 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b460c47-e24c-46c7-bf23-0e5b5d6819bd-util\") pod \"4b460c47-e24c-46c7-bf23-0e5b5d6819bd\" (UID: \"4b460c47-e24c-46c7-bf23-0e5b5d6819bd\") " Oct 09 15:30:21 crc kubenswrapper[4719]: I1009 15:30:21.432085 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b460c47-e24c-46c7-bf23-0e5b5d6819bd-bundle" (OuterVolumeSpecName: "bundle") pod "4b460c47-e24c-46c7-bf23-0e5b5d6819bd" (UID: "4b460c47-e24c-46c7-bf23-0e5b5d6819bd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:30:21 crc kubenswrapper[4719]: I1009 15:30:21.436069 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b460c47-e24c-46c7-bf23-0e5b5d6819bd-kube-api-access-nrzt7" (OuterVolumeSpecName: "kube-api-access-nrzt7") pod "4b460c47-e24c-46c7-bf23-0e5b5d6819bd" (UID: "4b460c47-e24c-46c7-bf23-0e5b5d6819bd"). InnerVolumeSpecName "kube-api-access-nrzt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:30:21 crc kubenswrapper[4719]: I1009 15:30:21.449878 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b460c47-e24c-46c7-bf23-0e5b5d6819bd-util" (OuterVolumeSpecName: "util") pod "4b460c47-e24c-46c7-bf23-0e5b5d6819bd" (UID: "4b460c47-e24c-46c7-bf23-0e5b5d6819bd"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:30:21 crc kubenswrapper[4719]: I1009 15:30:21.532180 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrzt7\" (UniqueName: \"kubernetes.io/projected/4b460c47-e24c-46c7-bf23-0e5b5d6819bd-kube-api-access-nrzt7\") on node \"crc\" DevicePath \"\"" Oct 09 15:30:21 crc kubenswrapper[4719]: I1009 15:30:21.532250 4719 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b460c47-e24c-46c7-bf23-0e5b5d6819bd-util\") on node \"crc\" DevicePath \"\"" Oct 09 15:30:21 crc kubenswrapper[4719]: I1009 15:30:21.532260 4719 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b460c47-e24c-46c7-bf23-0e5b5d6819bd-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:30:22 crc kubenswrapper[4719]: I1009 15:30:22.037979 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46" event={"ID":"4b460c47-e24c-46c7-bf23-0e5b5d6819bd","Type":"ContainerDied","Data":"581c7ab9a9bb878539730b4843ffeaf95b18ccf9b45181270851c2eb8e6a13f2"} Oct 09 15:30:22 crc kubenswrapper[4719]: I1009 15:30:22.038018 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="581c7ab9a9bb878539730b4843ffeaf95b18ccf9b45181270851c2eb8e6a13f2" Oct 09 15:30:22 crc kubenswrapper[4719]: I1009 15:30:22.038026 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46" Oct 09 15:30:32 crc kubenswrapper[4719]: I1009 15:30:32.741001 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-54fb4b8cc7-c4j9j"] Oct 09 15:30:32 crc kubenswrapper[4719]: E1009 15:30:32.741576 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b460c47-e24c-46c7-bf23-0e5b5d6819bd" containerName="pull" Oct 09 15:30:32 crc kubenswrapper[4719]: I1009 15:30:32.741593 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b460c47-e24c-46c7-bf23-0e5b5d6819bd" containerName="pull" Oct 09 15:30:32 crc kubenswrapper[4719]: E1009 15:30:32.741613 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b460c47-e24c-46c7-bf23-0e5b5d6819bd" containerName="extract" Oct 09 15:30:32 crc kubenswrapper[4719]: I1009 15:30:32.741640 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b460c47-e24c-46c7-bf23-0e5b5d6819bd" containerName="extract" Oct 09 15:30:32 crc kubenswrapper[4719]: E1009 15:30:32.741651 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b460c47-e24c-46c7-bf23-0e5b5d6819bd" containerName="util" Oct 09 15:30:32 crc kubenswrapper[4719]: I1009 15:30:32.741660 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b460c47-e24c-46c7-bf23-0e5b5d6819bd" containerName="util" Oct 09 15:30:32 crc kubenswrapper[4719]: E1009 15:30:32.741679 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c895d97a-7287-49a8-9ac5-bc87e8bcf297" containerName="console" Oct 09 15:30:32 crc kubenswrapper[4719]: I1009 15:30:32.741686 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="c895d97a-7287-49a8-9ac5-bc87e8bcf297" containerName="console" Oct 09 15:30:32 crc kubenswrapper[4719]: I1009 15:30:32.741824 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b460c47-e24c-46c7-bf23-0e5b5d6819bd" 
containerName="extract" Oct 09 15:30:32 crc kubenswrapper[4719]: I1009 15:30:32.741838 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="c895d97a-7287-49a8-9ac5-bc87e8bcf297" containerName="console" Oct 09 15:30:32 crc kubenswrapper[4719]: I1009 15:30:32.742311 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-54fb4b8cc7-c4j9j" Oct 09 15:30:32 crc kubenswrapper[4719]: I1009 15:30:32.744434 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 09 15:30:32 crc kubenswrapper[4719]: I1009 15:30:32.744878 4719 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 09 15:30:32 crc kubenswrapper[4719]: I1009 15:30:32.745166 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 09 15:30:32 crc kubenswrapper[4719]: I1009 15:30:32.745643 4719 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 09 15:30:32 crc kubenswrapper[4719]: I1009 15:30:32.746700 4719 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-kfmth" Oct 09 15:30:32 crc kubenswrapper[4719]: I1009 15:30:32.755833 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-54fb4b8cc7-c4j9j"] Oct 09 15:30:32 crc kubenswrapper[4719]: I1009 15:30:32.866541 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69mjq\" (UniqueName: \"kubernetes.io/projected/7d280e60-020e-43c0-a430-fb220b1d8354-kube-api-access-69mjq\") pod \"metallb-operator-controller-manager-54fb4b8cc7-c4j9j\" (UID: \"7d280e60-020e-43c0-a430-fb220b1d8354\") " 
pod="metallb-system/metallb-operator-controller-manager-54fb4b8cc7-c4j9j" Oct 09 15:30:32 crc kubenswrapper[4719]: I1009 15:30:32.867404 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7d280e60-020e-43c0-a430-fb220b1d8354-apiservice-cert\") pod \"metallb-operator-controller-manager-54fb4b8cc7-c4j9j\" (UID: \"7d280e60-020e-43c0-a430-fb220b1d8354\") " pod="metallb-system/metallb-operator-controller-manager-54fb4b8cc7-c4j9j" Oct 09 15:30:32 crc kubenswrapper[4719]: I1009 15:30:32.867518 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7d280e60-020e-43c0-a430-fb220b1d8354-webhook-cert\") pod \"metallb-operator-controller-manager-54fb4b8cc7-c4j9j\" (UID: \"7d280e60-020e-43c0-a430-fb220b1d8354\") " pod="metallb-system/metallb-operator-controller-manager-54fb4b8cc7-c4j9j" Oct 09 15:30:32 crc kubenswrapper[4719]: I1009 15:30:32.968961 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7d280e60-020e-43c0-a430-fb220b1d8354-apiservice-cert\") pod \"metallb-operator-controller-manager-54fb4b8cc7-c4j9j\" (UID: \"7d280e60-020e-43c0-a430-fb220b1d8354\") " pod="metallb-system/metallb-operator-controller-manager-54fb4b8cc7-c4j9j" Oct 09 15:30:32 crc kubenswrapper[4719]: I1009 15:30:32.969405 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7d280e60-020e-43c0-a430-fb220b1d8354-webhook-cert\") pod \"metallb-operator-controller-manager-54fb4b8cc7-c4j9j\" (UID: \"7d280e60-020e-43c0-a430-fb220b1d8354\") " pod="metallb-system/metallb-operator-controller-manager-54fb4b8cc7-c4j9j" Oct 09 15:30:32 crc kubenswrapper[4719]: I1009 15:30:32.970163 4719 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-69mjq\" (UniqueName: \"kubernetes.io/projected/7d280e60-020e-43c0-a430-fb220b1d8354-kube-api-access-69mjq\") pod \"metallb-operator-controller-manager-54fb4b8cc7-c4j9j\" (UID: \"7d280e60-020e-43c0-a430-fb220b1d8354\") " pod="metallb-system/metallb-operator-controller-manager-54fb4b8cc7-c4j9j" Oct 09 15:30:32 crc kubenswrapper[4719]: I1009 15:30:32.978307 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7d280e60-020e-43c0-a430-fb220b1d8354-apiservice-cert\") pod \"metallb-operator-controller-manager-54fb4b8cc7-c4j9j\" (UID: \"7d280e60-020e-43c0-a430-fb220b1d8354\") " pod="metallb-system/metallb-operator-controller-manager-54fb4b8cc7-c4j9j" Oct 09 15:30:32 crc kubenswrapper[4719]: I1009 15:30:32.979427 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7d280e60-020e-43c0-a430-fb220b1d8354-webhook-cert\") pod \"metallb-operator-controller-manager-54fb4b8cc7-c4j9j\" (UID: \"7d280e60-020e-43c0-a430-fb220b1d8354\") " pod="metallb-system/metallb-operator-controller-manager-54fb4b8cc7-c4j9j" Oct 09 15:30:32 crc kubenswrapper[4719]: I1009 15:30:32.988119 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-645dbdc857-kl4lw"] Oct 09 15:30:32 crc kubenswrapper[4719]: I1009 15:30:32.989284 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-645dbdc857-kl4lw" Oct 09 15:30:32 crc kubenswrapper[4719]: I1009 15:30:32.990075 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69mjq\" (UniqueName: \"kubernetes.io/projected/7d280e60-020e-43c0-a430-fb220b1d8354-kube-api-access-69mjq\") pod \"metallb-operator-controller-manager-54fb4b8cc7-c4j9j\" (UID: \"7d280e60-020e-43c0-a430-fb220b1d8354\") " pod="metallb-system/metallb-operator-controller-manager-54fb4b8cc7-c4j9j" Oct 09 15:30:32 crc kubenswrapper[4719]: I1009 15:30:32.993378 4719 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 09 15:30:32 crc kubenswrapper[4719]: I1009 15:30:32.994404 4719 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-zkcq4" Oct 09 15:30:32 crc kubenswrapper[4719]: I1009 15:30:32.993399 4719 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 09 15:30:33 crc kubenswrapper[4719]: I1009 15:30:33.051670 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-645dbdc857-kl4lw"] Oct 09 15:30:33 crc kubenswrapper[4719]: I1009 15:30:33.057891 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-54fb4b8cc7-c4j9j" Oct 09 15:30:33 crc kubenswrapper[4719]: I1009 15:30:33.071256 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gv5v\" (UniqueName: \"kubernetes.io/projected/e3a6e576-c324-4600-b0e9-4a83cd64d478-kube-api-access-9gv5v\") pod \"metallb-operator-webhook-server-645dbdc857-kl4lw\" (UID: \"e3a6e576-c324-4600-b0e9-4a83cd64d478\") " pod="metallb-system/metallb-operator-webhook-server-645dbdc857-kl4lw" Oct 09 15:30:33 crc kubenswrapper[4719]: I1009 15:30:33.071407 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e3a6e576-c324-4600-b0e9-4a83cd64d478-webhook-cert\") pod \"metallb-operator-webhook-server-645dbdc857-kl4lw\" (UID: \"e3a6e576-c324-4600-b0e9-4a83cd64d478\") " pod="metallb-system/metallb-operator-webhook-server-645dbdc857-kl4lw" Oct 09 15:30:33 crc kubenswrapper[4719]: I1009 15:30:33.071447 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e3a6e576-c324-4600-b0e9-4a83cd64d478-apiservice-cert\") pod \"metallb-operator-webhook-server-645dbdc857-kl4lw\" (UID: \"e3a6e576-c324-4600-b0e9-4a83cd64d478\") " pod="metallb-system/metallb-operator-webhook-server-645dbdc857-kl4lw" Oct 09 15:30:33 crc kubenswrapper[4719]: I1009 15:30:33.177222 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e3a6e576-c324-4600-b0e9-4a83cd64d478-webhook-cert\") pod \"metallb-operator-webhook-server-645dbdc857-kl4lw\" (UID: \"e3a6e576-c324-4600-b0e9-4a83cd64d478\") " pod="metallb-system/metallb-operator-webhook-server-645dbdc857-kl4lw" Oct 09 15:30:33 crc kubenswrapper[4719]: I1009 15:30:33.177560 4719 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e3a6e576-c324-4600-b0e9-4a83cd64d478-apiservice-cert\") pod \"metallb-operator-webhook-server-645dbdc857-kl4lw\" (UID: \"e3a6e576-c324-4600-b0e9-4a83cd64d478\") " pod="metallb-system/metallb-operator-webhook-server-645dbdc857-kl4lw" Oct 09 15:30:33 crc kubenswrapper[4719]: I1009 15:30:33.177591 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gv5v\" (UniqueName: \"kubernetes.io/projected/e3a6e576-c324-4600-b0e9-4a83cd64d478-kube-api-access-9gv5v\") pod \"metallb-operator-webhook-server-645dbdc857-kl4lw\" (UID: \"e3a6e576-c324-4600-b0e9-4a83cd64d478\") " pod="metallb-system/metallb-operator-webhook-server-645dbdc857-kl4lw" Oct 09 15:30:33 crc kubenswrapper[4719]: I1009 15:30:33.182177 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e3a6e576-c324-4600-b0e9-4a83cd64d478-webhook-cert\") pod \"metallb-operator-webhook-server-645dbdc857-kl4lw\" (UID: \"e3a6e576-c324-4600-b0e9-4a83cd64d478\") " pod="metallb-system/metallb-operator-webhook-server-645dbdc857-kl4lw" Oct 09 15:30:33 crc kubenswrapper[4719]: I1009 15:30:33.187960 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e3a6e576-c324-4600-b0e9-4a83cd64d478-apiservice-cert\") pod \"metallb-operator-webhook-server-645dbdc857-kl4lw\" (UID: \"e3a6e576-c324-4600-b0e9-4a83cd64d478\") " pod="metallb-system/metallb-operator-webhook-server-645dbdc857-kl4lw" Oct 09 15:30:33 crc kubenswrapper[4719]: I1009 15:30:33.209343 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gv5v\" (UniqueName: \"kubernetes.io/projected/e3a6e576-c324-4600-b0e9-4a83cd64d478-kube-api-access-9gv5v\") pod \"metallb-operator-webhook-server-645dbdc857-kl4lw\" (UID: \"e3a6e576-c324-4600-b0e9-4a83cd64d478\") 
" pod="metallb-system/metallb-operator-webhook-server-645dbdc857-kl4lw" Oct 09 15:30:33 crc kubenswrapper[4719]: I1009 15:30:33.347278 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-645dbdc857-kl4lw" Oct 09 15:30:33 crc kubenswrapper[4719]: I1009 15:30:33.395605 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-54fb4b8cc7-c4j9j"] Oct 09 15:30:33 crc kubenswrapper[4719]: W1009 15:30:33.419932 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d280e60_020e_43c0_a430_fb220b1d8354.slice/crio-52748f80bf43fd0ed3d510ad792c66074befe03e2db907164f198b05c143defe WatchSource:0}: Error finding container 52748f80bf43fd0ed3d510ad792c66074befe03e2db907164f198b05c143defe: Status 404 returned error can't find the container with id 52748f80bf43fd0ed3d510ad792c66074befe03e2db907164f198b05c143defe Oct 09 15:30:33 crc kubenswrapper[4719]: I1009 15:30:33.625975 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-645dbdc857-kl4lw"] Oct 09 15:30:33 crc kubenswrapper[4719]: W1009 15:30:33.633913 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3a6e576_c324_4600_b0e9_4a83cd64d478.slice/crio-63e3be4d1f72846bd496a84c7a00f82fa2a8ba75cff4e00dc69700b617e3ccee WatchSource:0}: Error finding container 63e3be4d1f72846bd496a84c7a00f82fa2a8ba75cff4e00dc69700b617e3ccee: Status 404 returned error can't find the container with id 63e3be4d1f72846bd496a84c7a00f82fa2a8ba75cff4e00dc69700b617e3ccee Oct 09 15:30:34 crc kubenswrapper[4719]: I1009 15:30:34.112688 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-645dbdc857-kl4lw" 
event={"ID":"e3a6e576-c324-4600-b0e9-4a83cd64d478","Type":"ContainerStarted","Data":"63e3be4d1f72846bd496a84c7a00f82fa2a8ba75cff4e00dc69700b617e3ccee"} Oct 09 15:30:34 crc kubenswrapper[4719]: I1009 15:30:34.114403 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-54fb4b8cc7-c4j9j" event={"ID":"7d280e60-020e-43c0-a430-fb220b1d8354","Type":"ContainerStarted","Data":"52748f80bf43fd0ed3d510ad792c66074befe03e2db907164f198b05c143defe"} Oct 09 15:30:39 crc kubenswrapper[4719]: I1009 15:30:39.150693 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-645dbdc857-kl4lw" event={"ID":"e3a6e576-c324-4600-b0e9-4a83cd64d478","Type":"ContainerStarted","Data":"2ca7426a820109f933cdd787747b7f9c6020f2392c654aa590182090d444d4aa"} Oct 09 15:30:39 crc kubenswrapper[4719]: I1009 15:30:39.151303 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-645dbdc857-kl4lw" Oct 09 15:30:39 crc kubenswrapper[4719]: I1009 15:30:39.152447 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-54fb4b8cc7-c4j9j" event={"ID":"7d280e60-020e-43c0-a430-fb220b1d8354","Type":"ContainerStarted","Data":"c75c4bc952891698230983c2bfc30fc0fb52eb57b7ed11ed5494e270c3628cb2"} Oct 09 15:30:39 crc kubenswrapper[4719]: I1009 15:30:39.152592 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-54fb4b8cc7-c4j9j" Oct 09 15:30:39 crc kubenswrapper[4719]: I1009 15:30:39.177966 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-645dbdc857-kl4lw" podStartSLOduration=2.287370449 podStartE2EDuration="7.177944054s" podCreationTimestamp="2025-10-09 15:30:32 +0000 UTC" firstStartedPulling="2025-10-09 15:30:33.637566334 +0000 UTC m=+739.147277619" 
lastFinishedPulling="2025-10-09 15:30:38.528139939 +0000 UTC m=+744.037851224" observedRunningTime="2025-10-09 15:30:39.172623364 +0000 UTC m=+744.682334659" watchObservedRunningTime="2025-10-09 15:30:39.177944054 +0000 UTC m=+744.687655339" Oct 09 15:30:39 crc kubenswrapper[4719]: I1009 15:30:39.211465 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-54fb4b8cc7-c4j9j" podStartSLOduration=2.12161712 podStartE2EDuration="7.211443415s" podCreationTimestamp="2025-10-09 15:30:32 +0000 UTC" firstStartedPulling="2025-10-09 15:30:33.422330353 +0000 UTC m=+738.932041638" lastFinishedPulling="2025-10-09 15:30:38.512156648 +0000 UTC m=+744.021867933" observedRunningTime="2025-10-09 15:30:39.203751679 +0000 UTC m=+744.713462964" watchObservedRunningTime="2025-10-09 15:30:39.211443415 +0000 UTC m=+744.721154700" Oct 09 15:30:42 crc kubenswrapper[4719]: I1009 15:30:42.439379 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lblwv"] Oct 09 15:30:42 crc kubenswrapper[4719]: I1009 15:30:42.439952 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-lblwv" podUID="35e1fbe6-e210-4329-a27b-3341136d7dcd" containerName="controller-manager" containerID="cri-o://970b351ff620528e1ba8050c7be20f902b2ed86d83bdf2734e36a85ec2becd21" gracePeriod=30 Oct 09 15:30:42 crc kubenswrapper[4719]: I1009 15:30:42.567974 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rwhcb"] Oct 09 15:30:42 crc kubenswrapper[4719]: I1009 15:30:42.581196 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rwhcb" podUID="c06c731d-cb94-49f2-8afb-899c7c6e7724" containerName="route-controller-manager" 
containerID="cri-o://61d007c207535c18c24bb726f5a501553f4dfc85b9e0e1b2812a8247af687744" gracePeriod=30 Oct 09 15:30:42 crc kubenswrapper[4719]: I1009 15:30:42.891932 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-lblwv" Oct 09 15:30:42 crc kubenswrapper[4719]: I1009 15:30:42.905620 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4hgc\" (UniqueName: \"kubernetes.io/projected/35e1fbe6-e210-4329-a27b-3341136d7dcd-kube-api-access-x4hgc\") pod \"35e1fbe6-e210-4329-a27b-3341136d7dcd\" (UID: \"35e1fbe6-e210-4329-a27b-3341136d7dcd\") " Oct 09 15:30:42 crc kubenswrapper[4719]: I1009 15:30:42.905685 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/35e1fbe6-e210-4329-a27b-3341136d7dcd-proxy-ca-bundles\") pod \"35e1fbe6-e210-4329-a27b-3341136d7dcd\" (UID: \"35e1fbe6-e210-4329-a27b-3341136d7dcd\") " Oct 09 15:30:42 crc kubenswrapper[4719]: I1009 15:30:42.905730 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35e1fbe6-e210-4329-a27b-3341136d7dcd-serving-cert\") pod \"35e1fbe6-e210-4329-a27b-3341136d7dcd\" (UID: \"35e1fbe6-e210-4329-a27b-3341136d7dcd\") " Oct 09 15:30:42 crc kubenswrapper[4719]: I1009 15:30:42.905752 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/35e1fbe6-e210-4329-a27b-3341136d7dcd-client-ca\") pod \"35e1fbe6-e210-4329-a27b-3341136d7dcd\" (UID: \"35e1fbe6-e210-4329-a27b-3341136d7dcd\") " Oct 09 15:30:42 crc kubenswrapper[4719]: I1009 15:30:42.905790 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35e1fbe6-e210-4329-a27b-3341136d7dcd-config\") pod 
\"35e1fbe6-e210-4329-a27b-3341136d7dcd\" (UID: \"35e1fbe6-e210-4329-a27b-3341136d7dcd\") " Oct 09 15:30:42 crc kubenswrapper[4719]: I1009 15:30:42.907412 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35e1fbe6-e210-4329-a27b-3341136d7dcd-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "35e1fbe6-e210-4329-a27b-3341136d7dcd" (UID: "35e1fbe6-e210-4329-a27b-3341136d7dcd"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:30:42 crc kubenswrapper[4719]: I1009 15:30:42.907575 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35e1fbe6-e210-4329-a27b-3341136d7dcd-client-ca" (OuterVolumeSpecName: "client-ca") pod "35e1fbe6-e210-4329-a27b-3341136d7dcd" (UID: "35e1fbe6-e210-4329-a27b-3341136d7dcd"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:30:42 crc kubenswrapper[4719]: I1009 15:30:42.911942 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35e1fbe6-e210-4329-a27b-3341136d7dcd-config" (OuterVolumeSpecName: "config") pod "35e1fbe6-e210-4329-a27b-3341136d7dcd" (UID: "35e1fbe6-e210-4329-a27b-3341136d7dcd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:30:42 crc kubenswrapper[4719]: I1009 15:30:42.913854 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35e1fbe6-e210-4329-a27b-3341136d7dcd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "35e1fbe6-e210-4329-a27b-3341136d7dcd" (UID: "35e1fbe6-e210-4329-a27b-3341136d7dcd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:30:42 crc kubenswrapper[4719]: I1009 15:30:42.916399 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35e1fbe6-e210-4329-a27b-3341136d7dcd-kube-api-access-x4hgc" (OuterVolumeSpecName: "kube-api-access-x4hgc") pod "35e1fbe6-e210-4329-a27b-3341136d7dcd" (UID: "35e1fbe6-e210-4329-a27b-3341136d7dcd"). InnerVolumeSpecName "kube-api-access-x4hgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.007114 4719 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35e1fbe6-e210-4329-a27b-3341136d7dcd-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.007160 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4hgc\" (UniqueName: \"kubernetes.io/projected/35e1fbe6-e210-4329-a27b-3341136d7dcd-kube-api-access-x4hgc\") on node \"crc\" DevicePath \"\"" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.007172 4719 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/35e1fbe6-e210-4329-a27b-3341136d7dcd-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.007180 4719 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35e1fbe6-e210-4329-a27b-3341136d7dcd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.007190 4719 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/35e1fbe6-e210-4329-a27b-3341136d7dcd-client-ca\") on node \"crc\" DevicePath \"\"" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.025192 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rwhcb" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.174659 4719 generic.go:334] "Generic (PLEG): container finished" podID="c06c731d-cb94-49f2-8afb-899c7c6e7724" containerID="61d007c207535c18c24bb726f5a501553f4dfc85b9e0e1b2812a8247af687744" exitCode=0 Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.174687 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rwhcb" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.174736 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rwhcb" event={"ID":"c06c731d-cb94-49f2-8afb-899c7c6e7724","Type":"ContainerDied","Data":"61d007c207535c18c24bb726f5a501553f4dfc85b9e0e1b2812a8247af687744"} Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.174772 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rwhcb" event={"ID":"c06c731d-cb94-49f2-8afb-899c7c6e7724","Type":"ContainerDied","Data":"35c40da366289aeeac5eaa006789c2e45207c7f7b4fc3a562482ddd68a22d2b8"} Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.174796 4719 scope.go:117] "RemoveContainer" containerID="61d007c207535c18c24bb726f5a501553f4dfc85b9e0e1b2812a8247af687744" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.177021 4719 generic.go:334] "Generic (PLEG): container finished" podID="35e1fbe6-e210-4329-a27b-3341136d7dcd" containerID="970b351ff620528e1ba8050c7be20f902b2ed86d83bdf2734e36a85ec2becd21" exitCode=0 Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.177048 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-lblwv" 
event={"ID":"35e1fbe6-e210-4329-a27b-3341136d7dcd","Type":"ContainerDied","Data":"970b351ff620528e1ba8050c7be20f902b2ed86d83bdf2734e36a85ec2becd21"} Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.177068 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-lblwv" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.177084 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-lblwv" event={"ID":"35e1fbe6-e210-4329-a27b-3341136d7dcd","Type":"ContainerDied","Data":"4de81b59bd054f4d846c8ca65fb5dc7618e65c430bf7042f693f7c1d6b8d9e15"} Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.190377 4719 scope.go:117] "RemoveContainer" containerID="61d007c207535c18c24bb726f5a501553f4dfc85b9e0e1b2812a8247af687744" Oct 09 15:30:43 crc kubenswrapper[4719]: E1009 15:30:43.190786 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61d007c207535c18c24bb726f5a501553f4dfc85b9e0e1b2812a8247af687744\": container with ID starting with 61d007c207535c18c24bb726f5a501553f4dfc85b9e0e1b2812a8247af687744 not found: ID does not exist" containerID="61d007c207535c18c24bb726f5a501553f4dfc85b9e0e1b2812a8247af687744" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.190811 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61d007c207535c18c24bb726f5a501553f4dfc85b9e0e1b2812a8247af687744"} err="failed to get container status \"61d007c207535c18c24bb726f5a501553f4dfc85b9e0e1b2812a8247af687744\": rpc error: code = NotFound desc = could not find container \"61d007c207535c18c24bb726f5a501553f4dfc85b9e0e1b2812a8247af687744\": container with ID starting with 61d007c207535c18c24bb726f5a501553f4dfc85b9e0e1b2812a8247af687744 not found: ID does not exist" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 
15:30:43.190830 4719 scope.go:117] "RemoveContainer" containerID="970b351ff620528e1ba8050c7be20f902b2ed86d83bdf2734e36a85ec2becd21" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.207271 4719 scope.go:117] "RemoveContainer" containerID="970b351ff620528e1ba8050c7be20f902b2ed86d83bdf2734e36a85ec2becd21" Oct 09 15:30:43 crc kubenswrapper[4719]: E1009 15:30:43.207767 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"970b351ff620528e1ba8050c7be20f902b2ed86d83bdf2734e36a85ec2becd21\": container with ID starting with 970b351ff620528e1ba8050c7be20f902b2ed86d83bdf2734e36a85ec2becd21 not found: ID does not exist" containerID="970b351ff620528e1ba8050c7be20f902b2ed86d83bdf2734e36a85ec2becd21" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.207883 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lblwv"] Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.207878 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"970b351ff620528e1ba8050c7be20f902b2ed86d83bdf2734e36a85ec2becd21"} err="failed to get container status \"970b351ff620528e1ba8050c7be20f902b2ed86d83bdf2734e36a85ec2becd21\": rpc error: code = NotFound desc = could not find container \"970b351ff620528e1ba8050c7be20f902b2ed86d83bdf2734e36a85ec2becd21\": container with ID starting with 970b351ff620528e1ba8050c7be20f902b2ed86d83bdf2734e36a85ec2becd21 not found: ID does not exist" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.208579 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c06c731d-cb94-49f2-8afb-899c7c6e7724-client-ca\") pod \"c06c731d-cb94-49f2-8afb-899c7c6e7724\" (UID: \"c06c731d-cb94-49f2-8afb-899c7c6e7724\") " Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.208700 4719 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c06c731d-cb94-49f2-8afb-899c7c6e7724-serving-cert\") pod \"c06c731d-cb94-49f2-8afb-899c7c6e7724\" (UID: \"c06c731d-cb94-49f2-8afb-899c7c6e7724\") " Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.208918 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c06c731d-cb94-49f2-8afb-899c7c6e7724-config\") pod \"c06c731d-cb94-49f2-8afb-899c7c6e7724\" (UID: \"c06c731d-cb94-49f2-8afb-899c7c6e7724\") " Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.209020 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8nch\" (UniqueName: \"kubernetes.io/projected/c06c731d-cb94-49f2-8afb-899c7c6e7724-kube-api-access-x8nch\") pod \"c06c731d-cb94-49f2-8afb-899c7c6e7724\" (UID: \"c06c731d-cb94-49f2-8afb-899c7c6e7724\") " Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.209675 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c06c731d-cb94-49f2-8afb-899c7c6e7724-config" (OuterVolumeSpecName: "config") pod "c06c731d-cb94-49f2-8afb-899c7c6e7724" (UID: "c06c731d-cb94-49f2-8afb-899c7c6e7724"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.210012 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c06c731d-cb94-49f2-8afb-899c7c6e7724-client-ca" (OuterVolumeSpecName: "client-ca") pod "c06c731d-cb94-49f2-8afb-899c7c6e7724" (UID: "c06c731d-cb94-49f2-8afb-899c7c6e7724"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.212721 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c06c731d-cb94-49f2-8afb-899c7c6e7724-kube-api-access-x8nch" (OuterVolumeSpecName: "kube-api-access-x8nch") pod "c06c731d-cb94-49f2-8afb-899c7c6e7724" (UID: "c06c731d-cb94-49f2-8afb-899c7c6e7724"). InnerVolumeSpecName "kube-api-access-x8nch". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.213089 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c06c731d-cb94-49f2-8afb-899c7c6e7724-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c06c731d-cb94-49f2-8afb-899c7c6e7724" (UID: "c06c731d-cb94-49f2-8afb-899c7c6e7724"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.213232 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lblwv"] Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.311076 4719 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c06c731d-cb94-49f2-8afb-899c7c6e7724-client-ca\") on node \"crc\" DevicePath \"\"" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.311113 4719 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c06c731d-cb94-49f2-8afb-899c7c6e7724-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.311122 4719 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c06c731d-cb94-49f2-8afb-899c7c6e7724-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.311131 4719 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8nch\" (UniqueName: \"kubernetes.io/projected/c06c731d-cb94-49f2-8afb-899c7c6e7724-kube-api-access-x8nch\") on node \"crc\" DevicePath \"\"" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.501840 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rwhcb"] Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.507681 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rwhcb"] Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.969506 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-67f5bd84b9-w8nrb"] Oct 09 15:30:43 crc kubenswrapper[4719]: E1009 15:30:43.969728 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c06c731d-cb94-49f2-8afb-899c7c6e7724" containerName="route-controller-manager" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.969741 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="c06c731d-cb94-49f2-8afb-899c7c6e7724" containerName="route-controller-manager" Oct 09 15:30:43 crc kubenswrapper[4719]: E1009 15:30:43.969756 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35e1fbe6-e210-4329-a27b-3341136d7dcd" containerName="controller-manager" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.969762 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="35e1fbe6-e210-4329-a27b-3341136d7dcd" containerName="controller-manager" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.969855 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="c06c731d-cb94-49f2-8afb-899c7c6e7724" containerName="route-controller-manager" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.969869 4719 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="35e1fbe6-e210-4329-a27b-3341136d7dcd" containerName="controller-manager" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.970237 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67f5bd84b9-w8nrb" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.976875 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5987579bf8-5k4xq"] Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.977639 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5987579bf8-5k4xq" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.977751 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.977951 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.978259 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.980495 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.980652 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.980935 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.986437 4719 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"client-ca" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.986671 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.987756 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.987764 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.988008 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.988259 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.992364 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-67f5bd84b9-w8nrb"] Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.996226 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 09 15:30:43 crc kubenswrapper[4719]: I1009 15:30:43.997072 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5987579bf8-5k4xq"] Oct 09 15:30:44 crc kubenswrapper[4719]: I1009 15:30:44.121470 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j8t2\" (UniqueName: \"kubernetes.io/projected/a043da44-fb9b-4eb7-85bb-c40a8b8bfd63-kube-api-access-2j8t2\") pod \"route-controller-manager-5987579bf8-5k4xq\" (UID: \"a043da44-fb9b-4eb7-85bb-c40a8b8bfd63\") " 
pod="openshift-route-controller-manager/route-controller-manager-5987579bf8-5k4xq" Oct 09 15:30:44 crc kubenswrapper[4719]: I1009 15:30:44.121611 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b6c1439-3d61-409e-9458-ac74df321d9d-config\") pod \"controller-manager-67f5bd84b9-w8nrb\" (UID: \"3b6c1439-3d61-409e-9458-ac74df321d9d\") " pod="openshift-controller-manager/controller-manager-67f5bd84b9-w8nrb" Oct 09 15:30:44 crc kubenswrapper[4719]: I1009 15:30:44.121676 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzfsb\" (UniqueName: \"kubernetes.io/projected/3b6c1439-3d61-409e-9458-ac74df321d9d-kube-api-access-qzfsb\") pod \"controller-manager-67f5bd84b9-w8nrb\" (UID: \"3b6c1439-3d61-409e-9458-ac74df321d9d\") " pod="openshift-controller-manager/controller-manager-67f5bd84b9-w8nrb" Oct 09 15:30:44 crc kubenswrapper[4719]: I1009 15:30:44.121741 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a043da44-fb9b-4eb7-85bb-c40a8b8bfd63-config\") pod \"route-controller-manager-5987579bf8-5k4xq\" (UID: \"a043da44-fb9b-4eb7-85bb-c40a8b8bfd63\") " pod="openshift-route-controller-manager/route-controller-manager-5987579bf8-5k4xq" Oct 09 15:30:44 crc kubenswrapper[4719]: I1009 15:30:44.121776 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a043da44-fb9b-4eb7-85bb-c40a8b8bfd63-client-ca\") pod \"route-controller-manager-5987579bf8-5k4xq\" (UID: \"a043da44-fb9b-4eb7-85bb-c40a8b8bfd63\") " pod="openshift-route-controller-manager/route-controller-manager-5987579bf8-5k4xq" Oct 09 15:30:44 crc kubenswrapper[4719]: I1009 15:30:44.121803 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3b6c1439-3d61-409e-9458-ac74df321d9d-client-ca\") pod \"controller-manager-67f5bd84b9-w8nrb\" (UID: \"3b6c1439-3d61-409e-9458-ac74df321d9d\") " pod="openshift-controller-manager/controller-manager-67f5bd84b9-w8nrb" Oct 09 15:30:44 crc kubenswrapper[4719]: I1009 15:30:44.121821 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b6c1439-3d61-409e-9458-ac74df321d9d-serving-cert\") pod \"controller-manager-67f5bd84b9-w8nrb\" (UID: \"3b6c1439-3d61-409e-9458-ac74df321d9d\") " pod="openshift-controller-manager/controller-manager-67f5bd84b9-w8nrb" Oct 09 15:30:44 crc kubenswrapper[4719]: I1009 15:30:44.121873 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3b6c1439-3d61-409e-9458-ac74df321d9d-proxy-ca-bundles\") pod \"controller-manager-67f5bd84b9-w8nrb\" (UID: \"3b6c1439-3d61-409e-9458-ac74df321d9d\") " pod="openshift-controller-manager/controller-manager-67f5bd84b9-w8nrb" Oct 09 15:30:44 crc kubenswrapper[4719]: I1009 15:30:44.121896 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a043da44-fb9b-4eb7-85bb-c40a8b8bfd63-serving-cert\") pod \"route-controller-manager-5987579bf8-5k4xq\" (UID: \"a043da44-fb9b-4eb7-85bb-c40a8b8bfd63\") " pod="openshift-route-controller-manager/route-controller-manager-5987579bf8-5k4xq" Oct 09 15:30:44 crc kubenswrapper[4719]: I1009 15:30:44.223330 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j8t2\" (UniqueName: \"kubernetes.io/projected/a043da44-fb9b-4eb7-85bb-c40a8b8bfd63-kube-api-access-2j8t2\") pod \"route-controller-manager-5987579bf8-5k4xq\" (UID: \"a043da44-fb9b-4eb7-85bb-c40a8b8bfd63\") " 
pod="openshift-route-controller-manager/route-controller-manager-5987579bf8-5k4xq" Oct 09 15:30:44 crc kubenswrapper[4719]: I1009 15:30:44.223430 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b6c1439-3d61-409e-9458-ac74df321d9d-config\") pod \"controller-manager-67f5bd84b9-w8nrb\" (UID: \"3b6c1439-3d61-409e-9458-ac74df321d9d\") " pod="openshift-controller-manager/controller-manager-67f5bd84b9-w8nrb" Oct 09 15:30:44 crc kubenswrapper[4719]: I1009 15:30:44.223463 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzfsb\" (UniqueName: \"kubernetes.io/projected/3b6c1439-3d61-409e-9458-ac74df321d9d-kube-api-access-qzfsb\") pod \"controller-manager-67f5bd84b9-w8nrb\" (UID: \"3b6c1439-3d61-409e-9458-ac74df321d9d\") " pod="openshift-controller-manager/controller-manager-67f5bd84b9-w8nrb" Oct 09 15:30:44 crc kubenswrapper[4719]: I1009 15:30:44.223500 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a043da44-fb9b-4eb7-85bb-c40a8b8bfd63-config\") pod \"route-controller-manager-5987579bf8-5k4xq\" (UID: \"a043da44-fb9b-4eb7-85bb-c40a8b8bfd63\") " pod="openshift-route-controller-manager/route-controller-manager-5987579bf8-5k4xq" Oct 09 15:30:44 crc kubenswrapper[4719]: I1009 15:30:44.223527 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a043da44-fb9b-4eb7-85bb-c40a8b8bfd63-client-ca\") pod \"route-controller-manager-5987579bf8-5k4xq\" (UID: \"a043da44-fb9b-4eb7-85bb-c40a8b8bfd63\") " pod="openshift-route-controller-manager/route-controller-manager-5987579bf8-5k4xq" Oct 09 15:30:44 crc kubenswrapper[4719]: I1009 15:30:44.223549 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/3b6c1439-3d61-409e-9458-ac74df321d9d-client-ca\") pod \"controller-manager-67f5bd84b9-w8nrb\" (UID: \"3b6c1439-3d61-409e-9458-ac74df321d9d\") " pod="openshift-controller-manager/controller-manager-67f5bd84b9-w8nrb" Oct 09 15:30:44 crc kubenswrapper[4719]: I1009 15:30:44.223570 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b6c1439-3d61-409e-9458-ac74df321d9d-serving-cert\") pod \"controller-manager-67f5bd84b9-w8nrb\" (UID: \"3b6c1439-3d61-409e-9458-ac74df321d9d\") " pod="openshift-controller-manager/controller-manager-67f5bd84b9-w8nrb" Oct 09 15:30:44 crc kubenswrapper[4719]: I1009 15:30:44.223602 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3b6c1439-3d61-409e-9458-ac74df321d9d-proxy-ca-bundles\") pod \"controller-manager-67f5bd84b9-w8nrb\" (UID: \"3b6c1439-3d61-409e-9458-ac74df321d9d\") " pod="openshift-controller-manager/controller-manager-67f5bd84b9-w8nrb" Oct 09 15:30:44 crc kubenswrapper[4719]: I1009 15:30:44.223624 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a043da44-fb9b-4eb7-85bb-c40a8b8bfd63-serving-cert\") pod \"route-controller-manager-5987579bf8-5k4xq\" (UID: \"a043da44-fb9b-4eb7-85bb-c40a8b8bfd63\") " pod="openshift-route-controller-manager/route-controller-manager-5987579bf8-5k4xq" Oct 09 15:30:44 crc kubenswrapper[4719]: I1009 15:30:44.224505 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a043da44-fb9b-4eb7-85bb-c40a8b8bfd63-client-ca\") pod \"route-controller-manager-5987579bf8-5k4xq\" (UID: \"a043da44-fb9b-4eb7-85bb-c40a8b8bfd63\") " pod="openshift-route-controller-manager/route-controller-manager-5987579bf8-5k4xq" Oct 09 15:30:44 crc kubenswrapper[4719]: I1009 15:30:44.224571 
4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3b6c1439-3d61-409e-9458-ac74df321d9d-client-ca\") pod \"controller-manager-67f5bd84b9-w8nrb\" (UID: \"3b6c1439-3d61-409e-9458-ac74df321d9d\") " pod="openshift-controller-manager/controller-manager-67f5bd84b9-w8nrb" Oct 09 15:30:44 crc kubenswrapper[4719]: I1009 15:30:44.224722 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a043da44-fb9b-4eb7-85bb-c40a8b8bfd63-config\") pod \"route-controller-manager-5987579bf8-5k4xq\" (UID: \"a043da44-fb9b-4eb7-85bb-c40a8b8bfd63\") " pod="openshift-route-controller-manager/route-controller-manager-5987579bf8-5k4xq" Oct 09 15:30:44 crc kubenswrapper[4719]: I1009 15:30:44.225039 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b6c1439-3d61-409e-9458-ac74df321d9d-config\") pod \"controller-manager-67f5bd84b9-w8nrb\" (UID: \"3b6c1439-3d61-409e-9458-ac74df321d9d\") " pod="openshift-controller-manager/controller-manager-67f5bd84b9-w8nrb" Oct 09 15:30:44 crc kubenswrapper[4719]: I1009 15:30:44.225589 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3b6c1439-3d61-409e-9458-ac74df321d9d-proxy-ca-bundles\") pod \"controller-manager-67f5bd84b9-w8nrb\" (UID: \"3b6c1439-3d61-409e-9458-ac74df321d9d\") " pod="openshift-controller-manager/controller-manager-67f5bd84b9-w8nrb" Oct 09 15:30:44 crc kubenswrapper[4719]: I1009 15:30:44.226903 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a043da44-fb9b-4eb7-85bb-c40a8b8bfd63-serving-cert\") pod \"route-controller-manager-5987579bf8-5k4xq\" (UID: \"a043da44-fb9b-4eb7-85bb-c40a8b8bfd63\") " pod="openshift-route-controller-manager/route-controller-manager-5987579bf8-5k4xq" 
Oct 09 15:30:44 crc kubenswrapper[4719]: I1009 15:30:44.233986 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b6c1439-3d61-409e-9458-ac74df321d9d-serving-cert\") pod \"controller-manager-67f5bd84b9-w8nrb\" (UID: \"3b6c1439-3d61-409e-9458-ac74df321d9d\") " pod="openshift-controller-manager/controller-manager-67f5bd84b9-w8nrb" Oct 09 15:30:44 crc kubenswrapper[4719]: I1009 15:30:44.240858 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j8t2\" (UniqueName: \"kubernetes.io/projected/a043da44-fb9b-4eb7-85bb-c40a8b8bfd63-kube-api-access-2j8t2\") pod \"route-controller-manager-5987579bf8-5k4xq\" (UID: \"a043da44-fb9b-4eb7-85bb-c40a8b8bfd63\") " pod="openshift-route-controller-manager/route-controller-manager-5987579bf8-5k4xq" Oct 09 15:30:44 crc kubenswrapper[4719]: I1009 15:30:44.248004 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzfsb\" (UniqueName: \"kubernetes.io/projected/3b6c1439-3d61-409e-9458-ac74df321d9d-kube-api-access-qzfsb\") pod \"controller-manager-67f5bd84b9-w8nrb\" (UID: \"3b6c1439-3d61-409e-9458-ac74df321d9d\") " pod="openshift-controller-manager/controller-manager-67f5bd84b9-w8nrb" Oct 09 15:30:44 crc kubenswrapper[4719]: I1009 15:30:44.284041 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67f5bd84b9-w8nrb" Oct 09 15:30:44 crc kubenswrapper[4719]: I1009 15:30:44.296948 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5987579bf8-5k4xq" Oct 09 15:30:44 crc kubenswrapper[4719]: I1009 15:30:44.515240 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-67f5bd84b9-w8nrb"] Oct 09 15:30:44 crc kubenswrapper[4719]: I1009 15:30:44.563456 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5987579bf8-5k4xq"] Oct 09 15:30:45 crc kubenswrapper[4719]: I1009 15:30:45.168320 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35e1fbe6-e210-4329-a27b-3341136d7dcd" path="/var/lib/kubelet/pods/35e1fbe6-e210-4329-a27b-3341136d7dcd/volumes" Oct 09 15:30:45 crc kubenswrapper[4719]: I1009 15:30:45.169604 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c06c731d-cb94-49f2-8afb-899c7c6e7724" path="/var/lib/kubelet/pods/c06c731d-cb94-49f2-8afb-899c7c6e7724/volumes" Oct 09 15:30:45 crc kubenswrapper[4719]: I1009 15:30:45.200767 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67f5bd84b9-w8nrb" event={"ID":"3b6c1439-3d61-409e-9458-ac74df321d9d","Type":"ContainerStarted","Data":"632e43f9a6e11969ad5f1bfdc7df1e15eea8d83ff485cf7f211580e97e0ef3c3"} Oct 09 15:30:45 crc kubenswrapper[4719]: I1009 15:30:45.200814 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67f5bd84b9-w8nrb" event={"ID":"3b6c1439-3d61-409e-9458-ac74df321d9d","Type":"ContainerStarted","Data":"c95b1a390c208f4765328b4122376188fc54d4a588f11dfc1a0b3ef894166e1f"} Oct 09 15:30:45 crc kubenswrapper[4719]: I1009 15:30:45.201269 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-67f5bd84b9-w8nrb" Oct 09 15:30:45 crc kubenswrapper[4719]: I1009 15:30:45.203934 4719 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-route-controller-manager/route-controller-manager-5987579bf8-5k4xq" event={"ID":"a043da44-fb9b-4eb7-85bb-c40a8b8bfd63","Type":"ContainerStarted","Data":"24e8a2f6510750bc00e16bbdfa00b2853d951e5349334e3529ad1d4971a60a78"} Oct 09 15:30:45 crc kubenswrapper[4719]: I1009 15:30:45.203973 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5987579bf8-5k4xq" event={"ID":"a043da44-fb9b-4eb7-85bb-c40a8b8bfd63","Type":"ContainerStarted","Data":"66d435fa69755971ac0c47e88cb8b16aaf2b418833bf403676076b7d0bbabdfc"} Oct 09 15:30:45 crc kubenswrapper[4719]: I1009 15:30:45.204758 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5987579bf8-5k4xq" Oct 09 15:30:45 crc kubenswrapper[4719]: I1009 15:30:45.213258 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-67f5bd84b9-w8nrb" Oct 09 15:30:45 crc kubenswrapper[4719]: I1009 15:30:45.220990 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-67f5bd84b9-w8nrb" podStartSLOduration=2.220972984 podStartE2EDuration="2.220972984s" podCreationTimestamp="2025-10-09 15:30:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:30:45.21897109 +0000 UTC m=+750.728682395" watchObservedRunningTime="2025-10-09 15:30:45.220972984 +0000 UTC m=+750.730684269" Oct 09 15:30:45 crc kubenswrapper[4719]: I1009 15:30:45.260738 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5987579bf8-5k4xq" podStartSLOduration=2.2607142749999998 podStartE2EDuration="2.260714275s" podCreationTimestamp="2025-10-09 15:30:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:30:45.256001514 +0000 UTC m=+750.765712809" watchObservedRunningTime="2025-10-09 15:30:45.260714275 +0000 UTC m=+750.770425570" Oct 09 15:30:45 crc kubenswrapper[4719]: I1009 15:30:45.438486 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5987579bf8-5k4xq" Oct 09 15:30:49 crc kubenswrapper[4719]: I1009 15:30:49.978830 4719 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 09 15:30:53 crc kubenswrapper[4719]: I1009 15:30:53.354492 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-645dbdc857-kl4lw" Oct 09 15:30:53 crc kubenswrapper[4719]: I1009 15:30:53.966607 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5rsn8"] Oct 09 15:30:53 crc kubenswrapper[4719]: I1009 15:30:53.968104 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5rsn8" Oct 09 15:30:53 crc kubenswrapper[4719]: I1009 15:30:53.982941 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5rsn8"] Oct 09 15:30:54 crc kubenswrapper[4719]: I1009 15:30:54.052335 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twwpp\" (UniqueName: \"kubernetes.io/projected/7a58b3b1-760e-414b-990a-d93989dbc0cd-kube-api-access-twwpp\") pod \"certified-operators-5rsn8\" (UID: \"7a58b3b1-760e-414b-990a-d93989dbc0cd\") " pod="openshift-marketplace/certified-operators-5rsn8" Oct 09 15:30:54 crc kubenswrapper[4719]: I1009 15:30:54.052406 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a58b3b1-760e-414b-990a-d93989dbc0cd-utilities\") pod \"certified-operators-5rsn8\" (UID: \"7a58b3b1-760e-414b-990a-d93989dbc0cd\") " pod="openshift-marketplace/certified-operators-5rsn8" Oct 09 15:30:54 crc kubenswrapper[4719]: I1009 15:30:54.052711 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a58b3b1-760e-414b-990a-d93989dbc0cd-catalog-content\") pod \"certified-operators-5rsn8\" (UID: \"7a58b3b1-760e-414b-990a-d93989dbc0cd\") " pod="openshift-marketplace/certified-operators-5rsn8" Oct 09 15:30:54 crc kubenswrapper[4719]: I1009 15:30:54.154346 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a58b3b1-760e-414b-990a-d93989dbc0cd-utilities\") pod \"certified-operators-5rsn8\" (UID: \"7a58b3b1-760e-414b-990a-d93989dbc0cd\") " pod="openshift-marketplace/certified-operators-5rsn8" Oct 09 15:30:54 crc kubenswrapper[4719]: I1009 15:30:54.154470 4719 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a58b3b1-760e-414b-990a-d93989dbc0cd-catalog-content\") pod \"certified-operators-5rsn8\" (UID: \"7a58b3b1-760e-414b-990a-d93989dbc0cd\") " pod="openshift-marketplace/certified-operators-5rsn8" Oct 09 15:30:54 crc kubenswrapper[4719]: I1009 15:30:54.154507 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twwpp\" (UniqueName: \"kubernetes.io/projected/7a58b3b1-760e-414b-990a-d93989dbc0cd-kube-api-access-twwpp\") pod \"certified-operators-5rsn8\" (UID: \"7a58b3b1-760e-414b-990a-d93989dbc0cd\") " pod="openshift-marketplace/certified-operators-5rsn8" Oct 09 15:30:54 crc kubenswrapper[4719]: I1009 15:30:54.154951 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a58b3b1-760e-414b-990a-d93989dbc0cd-utilities\") pod \"certified-operators-5rsn8\" (UID: \"7a58b3b1-760e-414b-990a-d93989dbc0cd\") " pod="openshift-marketplace/certified-operators-5rsn8" Oct 09 15:30:54 crc kubenswrapper[4719]: I1009 15:30:54.154966 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a58b3b1-760e-414b-990a-d93989dbc0cd-catalog-content\") pod \"certified-operators-5rsn8\" (UID: \"7a58b3b1-760e-414b-990a-d93989dbc0cd\") " pod="openshift-marketplace/certified-operators-5rsn8" Oct 09 15:30:54 crc kubenswrapper[4719]: I1009 15:30:54.176516 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twwpp\" (UniqueName: \"kubernetes.io/projected/7a58b3b1-760e-414b-990a-d93989dbc0cd-kube-api-access-twwpp\") pod \"certified-operators-5rsn8\" (UID: \"7a58b3b1-760e-414b-990a-d93989dbc0cd\") " pod="openshift-marketplace/certified-operators-5rsn8" Oct 09 15:30:54 crc kubenswrapper[4719]: I1009 15:30:54.284437 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5rsn8" Oct 09 15:30:54 crc kubenswrapper[4719]: I1009 15:30:54.720953 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5rsn8"] Oct 09 15:30:55 crc kubenswrapper[4719]: I1009 15:30:55.256242 4719 generic.go:334] "Generic (PLEG): container finished" podID="7a58b3b1-760e-414b-990a-d93989dbc0cd" containerID="617e0432f23b6420e606f8e9dec10a2306fc42295e6742b859487208a732e771" exitCode=0 Oct 09 15:30:55 crc kubenswrapper[4719]: I1009 15:30:55.256290 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5rsn8" event={"ID":"7a58b3b1-760e-414b-990a-d93989dbc0cd","Type":"ContainerDied","Data":"617e0432f23b6420e606f8e9dec10a2306fc42295e6742b859487208a732e771"} Oct 09 15:30:55 crc kubenswrapper[4719]: I1009 15:30:55.256317 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5rsn8" event={"ID":"7a58b3b1-760e-414b-990a-d93989dbc0cd","Type":"ContainerStarted","Data":"44d7a2d35d4dd1bb9f375407b0c00ad011cb9a09269f0a8addabd67d140e38fc"} Oct 09 15:30:56 crc kubenswrapper[4719]: I1009 15:30:56.264009 4719 generic.go:334] "Generic (PLEG): container finished" podID="7a58b3b1-760e-414b-990a-d93989dbc0cd" containerID="8624d9e4736827488808268da50a67b67b0ca948fc413b5197785911e463e48d" exitCode=0 Oct 09 15:30:56 crc kubenswrapper[4719]: I1009 15:30:56.264050 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5rsn8" event={"ID":"7a58b3b1-760e-414b-990a-d93989dbc0cd","Type":"ContainerDied","Data":"8624d9e4736827488808268da50a67b67b0ca948fc413b5197785911e463e48d"} Oct 09 15:30:56 crc kubenswrapper[4719]: I1009 15:30:56.768298 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sfk2s"] Oct 09 15:30:56 crc kubenswrapper[4719]: I1009 15:30:56.776699 4719 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sfk2s" Oct 09 15:30:56 crc kubenswrapper[4719]: I1009 15:30:56.790241 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sfk2s"] Oct 09 15:30:56 crc kubenswrapper[4719]: I1009 15:30:56.887492 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdmnx\" (UniqueName: \"kubernetes.io/projected/e9c81c59-ffdc-4b88-8fea-88a2109922ee-kube-api-access-kdmnx\") pod \"community-operators-sfk2s\" (UID: \"e9c81c59-ffdc-4b88-8fea-88a2109922ee\") " pod="openshift-marketplace/community-operators-sfk2s" Oct 09 15:30:56 crc kubenswrapper[4719]: I1009 15:30:56.887564 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9c81c59-ffdc-4b88-8fea-88a2109922ee-catalog-content\") pod \"community-operators-sfk2s\" (UID: \"e9c81c59-ffdc-4b88-8fea-88a2109922ee\") " pod="openshift-marketplace/community-operators-sfk2s" Oct 09 15:30:56 crc kubenswrapper[4719]: I1009 15:30:56.887710 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9c81c59-ffdc-4b88-8fea-88a2109922ee-utilities\") pod \"community-operators-sfk2s\" (UID: \"e9c81c59-ffdc-4b88-8fea-88a2109922ee\") " pod="openshift-marketplace/community-operators-sfk2s" Oct 09 15:30:56 crc kubenswrapper[4719]: I1009 15:30:56.988500 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9c81c59-ffdc-4b88-8fea-88a2109922ee-catalog-content\") pod \"community-operators-sfk2s\" (UID: \"e9c81c59-ffdc-4b88-8fea-88a2109922ee\") " pod="openshift-marketplace/community-operators-sfk2s" Oct 09 15:30:56 crc kubenswrapper[4719]: I1009 15:30:56.988554 4719 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9c81c59-ffdc-4b88-8fea-88a2109922ee-utilities\") pod \"community-operators-sfk2s\" (UID: \"e9c81c59-ffdc-4b88-8fea-88a2109922ee\") " pod="openshift-marketplace/community-operators-sfk2s" Oct 09 15:30:56 crc kubenswrapper[4719]: I1009 15:30:56.988604 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdmnx\" (UniqueName: \"kubernetes.io/projected/e9c81c59-ffdc-4b88-8fea-88a2109922ee-kube-api-access-kdmnx\") pod \"community-operators-sfk2s\" (UID: \"e9c81c59-ffdc-4b88-8fea-88a2109922ee\") " pod="openshift-marketplace/community-operators-sfk2s" Oct 09 15:30:56 crc kubenswrapper[4719]: I1009 15:30:56.989277 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9c81c59-ffdc-4b88-8fea-88a2109922ee-catalog-content\") pod \"community-operators-sfk2s\" (UID: \"e9c81c59-ffdc-4b88-8fea-88a2109922ee\") " pod="openshift-marketplace/community-operators-sfk2s" Oct 09 15:30:56 crc kubenswrapper[4719]: I1009 15:30:56.989506 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9c81c59-ffdc-4b88-8fea-88a2109922ee-utilities\") pod \"community-operators-sfk2s\" (UID: \"e9c81c59-ffdc-4b88-8fea-88a2109922ee\") " pod="openshift-marketplace/community-operators-sfk2s" Oct 09 15:30:57 crc kubenswrapper[4719]: I1009 15:30:57.013860 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdmnx\" (UniqueName: \"kubernetes.io/projected/e9c81c59-ffdc-4b88-8fea-88a2109922ee-kube-api-access-kdmnx\") pod \"community-operators-sfk2s\" (UID: \"e9c81c59-ffdc-4b88-8fea-88a2109922ee\") " pod="openshift-marketplace/community-operators-sfk2s" Oct 09 15:30:57 crc kubenswrapper[4719]: I1009 15:30:57.117554 4719 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/community-operators-sfk2s" Oct 09 15:30:57 crc kubenswrapper[4719]: I1009 15:30:57.298965 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5rsn8" event={"ID":"7a58b3b1-760e-414b-990a-d93989dbc0cd","Type":"ContainerStarted","Data":"dae7b85b7e31f3969cb06daaa9b56d35085f04fc8a87afca700641ca1f2406dc"} Oct 09 15:30:57 crc kubenswrapper[4719]: I1009 15:30:57.328149 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5rsn8" podStartSLOduration=2.927636376 podStartE2EDuration="4.328132181s" podCreationTimestamp="2025-10-09 15:30:53 +0000 UTC" firstStartedPulling="2025-10-09 15:30:55.25809856 +0000 UTC m=+760.767809845" lastFinishedPulling="2025-10-09 15:30:56.658594365 +0000 UTC m=+762.168305650" observedRunningTime="2025-10-09 15:30:57.326810978 +0000 UTC m=+762.836522273" watchObservedRunningTime="2025-10-09 15:30:57.328132181 +0000 UTC m=+762.837843466" Oct 09 15:30:57 crc kubenswrapper[4719]: I1009 15:30:57.655930 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sfk2s"] Oct 09 15:30:57 crc kubenswrapper[4719]: W1009 15:30:57.659392 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9c81c59_ffdc_4b88_8fea_88a2109922ee.slice/crio-5b980d1253fff41cad67c954137e3122665aa81e34c35e4ebfc7446082a28181 WatchSource:0}: Error finding container 5b980d1253fff41cad67c954137e3122665aa81e34c35e4ebfc7446082a28181: Status 404 returned error can't find the container with id 5b980d1253fff41cad67c954137e3122665aa81e34c35e4ebfc7446082a28181 Oct 09 15:30:58 crc kubenswrapper[4719]: I1009 15:30:58.306434 4719 generic.go:334] "Generic (PLEG): container finished" podID="e9c81c59-ffdc-4b88-8fea-88a2109922ee" containerID="756568e13dcde9c6ebd13fd4dc3f735f166d3e0d6f682a01cafe3e1597aa03b2" 
exitCode=0 Oct 09 15:30:58 crc kubenswrapper[4719]: I1009 15:30:58.306601 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sfk2s" event={"ID":"e9c81c59-ffdc-4b88-8fea-88a2109922ee","Type":"ContainerDied","Data":"756568e13dcde9c6ebd13fd4dc3f735f166d3e0d6f682a01cafe3e1597aa03b2"} Oct 09 15:30:58 crc kubenswrapper[4719]: I1009 15:30:58.307292 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sfk2s" event={"ID":"e9c81c59-ffdc-4b88-8fea-88a2109922ee","Type":"ContainerStarted","Data":"5b980d1253fff41cad67c954137e3122665aa81e34c35e4ebfc7446082a28181"} Oct 09 15:30:59 crc kubenswrapper[4719]: I1009 15:30:59.313860 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sfk2s" event={"ID":"e9c81c59-ffdc-4b88-8fea-88a2109922ee","Type":"ContainerStarted","Data":"7a6e9b60f092e56ec3b068ac87b5222e0ba61e149e6c6db4c40ecf6556bc95d7"} Oct 09 15:31:00 crc kubenswrapper[4719]: I1009 15:31:00.320484 4719 generic.go:334] "Generic (PLEG): container finished" podID="e9c81c59-ffdc-4b88-8fea-88a2109922ee" containerID="7a6e9b60f092e56ec3b068ac87b5222e0ba61e149e6c6db4c40ecf6556bc95d7" exitCode=0 Oct 09 15:31:00 crc kubenswrapper[4719]: I1009 15:31:00.320519 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sfk2s" event={"ID":"e9c81c59-ffdc-4b88-8fea-88a2109922ee","Type":"ContainerDied","Data":"7a6e9b60f092e56ec3b068ac87b5222e0ba61e149e6c6db4c40ecf6556bc95d7"} Oct 09 15:31:01 crc kubenswrapper[4719]: I1009 15:31:01.326889 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sfk2s" event={"ID":"e9c81c59-ffdc-4b88-8fea-88a2109922ee","Type":"ContainerStarted","Data":"af095ec87fcb250e7ecde86f5791fa9894a3cd021c7cd2db58231294329bef82"} Oct 09 15:31:01 crc kubenswrapper[4719]: I1009 15:31:01.344357 4719 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-marketplace/community-operators-sfk2s" podStartSLOduration=2.886058049 podStartE2EDuration="5.344334312s" podCreationTimestamp="2025-10-09 15:30:56 +0000 UTC" firstStartedPulling="2025-10-09 15:30:58.308749351 +0000 UTC m=+763.818460636" lastFinishedPulling="2025-10-09 15:31:00.767025624 +0000 UTC m=+766.276736899" observedRunningTime="2025-10-09 15:31:01.341821051 +0000 UTC m=+766.851532376" watchObservedRunningTime="2025-10-09 15:31:01.344334312 +0000 UTC m=+766.854045607" Oct 09 15:31:04 crc kubenswrapper[4719]: I1009 15:31:04.284620 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5rsn8" Oct 09 15:31:04 crc kubenswrapper[4719]: I1009 15:31:04.284993 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5rsn8" Oct 09 15:31:04 crc kubenswrapper[4719]: I1009 15:31:04.332401 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5rsn8" Oct 09 15:31:04 crc kubenswrapper[4719]: I1009 15:31:04.383906 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5rsn8" Oct 09 15:31:06 crc kubenswrapper[4719]: I1009 15:31:06.976738 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 15:31:06 crc kubenswrapper[4719]: I1009 15:31:06.977008 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Oct 09 15:31:07 crc kubenswrapper[4719]: I1009 15:31:07.118542 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sfk2s" Oct 09 15:31:07 crc kubenswrapper[4719]: I1009 15:31:07.118601 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sfk2s" Oct 09 15:31:07 crc kubenswrapper[4719]: I1009 15:31:07.152898 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5rsn8"] Oct 09 15:31:07 crc kubenswrapper[4719]: I1009 15:31:07.153125 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5rsn8" podUID="7a58b3b1-760e-414b-990a-d93989dbc0cd" containerName="registry-server" containerID="cri-o://dae7b85b7e31f3969cb06daaa9b56d35085f04fc8a87afca700641ca1f2406dc" gracePeriod=2 Oct 09 15:31:07 crc kubenswrapper[4719]: I1009 15:31:07.174413 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sfk2s" Oct 09 15:31:07 crc kubenswrapper[4719]: I1009 15:31:07.389263 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sfk2s" Oct 09 15:31:08 crc kubenswrapper[4719]: I1009 15:31:08.380793 4719 generic.go:334] "Generic (PLEG): container finished" podID="7a58b3b1-760e-414b-990a-d93989dbc0cd" containerID="dae7b85b7e31f3969cb06daaa9b56d35085f04fc8a87afca700641ca1f2406dc" exitCode=0 Oct 09 15:31:08 crc kubenswrapper[4719]: I1009 15:31:08.381672 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5rsn8" event={"ID":"7a58b3b1-760e-414b-990a-d93989dbc0cd","Type":"ContainerDied","Data":"dae7b85b7e31f3969cb06daaa9b56d35085f04fc8a87afca700641ca1f2406dc"} Oct 09 15:31:08 crc kubenswrapper[4719]: I1009 15:31:08.738652 4719 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5rsn8" Oct 09 15:31:08 crc kubenswrapper[4719]: I1009 15:31:08.835545 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twwpp\" (UniqueName: \"kubernetes.io/projected/7a58b3b1-760e-414b-990a-d93989dbc0cd-kube-api-access-twwpp\") pod \"7a58b3b1-760e-414b-990a-d93989dbc0cd\" (UID: \"7a58b3b1-760e-414b-990a-d93989dbc0cd\") " Oct 09 15:31:08 crc kubenswrapper[4719]: I1009 15:31:08.835629 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a58b3b1-760e-414b-990a-d93989dbc0cd-catalog-content\") pod \"7a58b3b1-760e-414b-990a-d93989dbc0cd\" (UID: \"7a58b3b1-760e-414b-990a-d93989dbc0cd\") " Oct 09 15:31:08 crc kubenswrapper[4719]: I1009 15:31:08.835660 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a58b3b1-760e-414b-990a-d93989dbc0cd-utilities\") pod \"7a58b3b1-760e-414b-990a-d93989dbc0cd\" (UID: \"7a58b3b1-760e-414b-990a-d93989dbc0cd\") " Oct 09 15:31:08 crc kubenswrapper[4719]: I1009 15:31:08.836508 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a58b3b1-760e-414b-990a-d93989dbc0cd-utilities" (OuterVolumeSpecName: "utilities") pod "7a58b3b1-760e-414b-990a-d93989dbc0cd" (UID: "7a58b3b1-760e-414b-990a-d93989dbc0cd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:31:08 crc kubenswrapper[4719]: I1009 15:31:08.846525 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a58b3b1-760e-414b-990a-d93989dbc0cd-kube-api-access-twwpp" (OuterVolumeSpecName: "kube-api-access-twwpp") pod "7a58b3b1-760e-414b-990a-d93989dbc0cd" (UID: "7a58b3b1-760e-414b-990a-d93989dbc0cd"). 
InnerVolumeSpecName "kube-api-access-twwpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:31:08 crc kubenswrapper[4719]: I1009 15:31:08.875907 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a58b3b1-760e-414b-990a-d93989dbc0cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a58b3b1-760e-414b-990a-d93989dbc0cd" (UID: "7a58b3b1-760e-414b-990a-d93989dbc0cd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:31:08 crc kubenswrapper[4719]: I1009 15:31:08.937444 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twwpp\" (UniqueName: \"kubernetes.io/projected/7a58b3b1-760e-414b-990a-d93989dbc0cd-kube-api-access-twwpp\") on node \"crc\" DevicePath \"\"" Oct 09 15:31:08 crc kubenswrapper[4719]: I1009 15:31:08.937481 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a58b3b1-760e-414b-990a-d93989dbc0cd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 15:31:08 crc kubenswrapper[4719]: I1009 15:31:08.937491 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a58b3b1-760e-414b-990a-d93989dbc0cd-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 15:31:09 crc kubenswrapper[4719]: I1009 15:31:09.388679 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5rsn8" event={"ID":"7a58b3b1-760e-414b-990a-d93989dbc0cd","Type":"ContainerDied","Data":"44d7a2d35d4dd1bb9f375407b0c00ad011cb9a09269f0a8addabd67d140e38fc"} Oct 09 15:31:09 crc kubenswrapper[4719]: I1009 15:31:09.388744 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5rsn8" Oct 09 15:31:09 crc kubenswrapper[4719]: I1009 15:31:09.389471 4719 scope.go:117] "RemoveContainer" containerID="dae7b85b7e31f3969cb06daaa9b56d35085f04fc8a87afca700641ca1f2406dc" Oct 09 15:31:09 crc kubenswrapper[4719]: I1009 15:31:09.410302 4719 scope.go:117] "RemoveContainer" containerID="8624d9e4736827488808268da50a67b67b0ca948fc413b5197785911e463e48d" Oct 09 15:31:09 crc kubenswrapper[4719]: I1009 15:31:09.410917 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5rsn8"] Oct 09 15:31:09 crc kubenswrapper[4719]: I1009 15:31:09.415577 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5rsn8"] Oct 09 15:31:09 crc kubenswrapper[4719]: I1009 15:31:09.432598 4719 scope.go:117] "RemoveContainer" containerID="617e0432f23b6420e606f8e9dec10a2306fc42295e6742b859487208a732e771" Oct 09 15:31:10 crc kubenswrapper[4719]: I1009 15:31:10.756283 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sfk2s"] Oct 09 15:31:10 crc kubenswrapper[4719]: I1009 15:31:10.757014 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sfk2s" podUID="e9c81c59-ffdc-4b88-8fea-88a2109922ee" containerName="registry-server" containerID="cri-o://af095ec87fcb250e7ecde86f5791fa9894a3cd021c7cd2db58231294329bef82" gracePeriod=2 Oct 09 15:31:11 crc kubenswrapper[4719]: I1009 15:31:11.169212 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a58b3b1-760e-414b-990a-d93989dbc0cd" path="/var/lib/kubelet/pods/7a58b3b1-760e-414b-990a-d93989dbc0cd/volumes" Oct 09 15:31:12 crc kubenswrapper[4719]: I1009 15:31:12.292868 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sfk2s" Oct 09 15:31:12 crc kubenswrapper[4719]: I1009 15:31:12.385366 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9c81c59-ffdc-4b88-8fea-88a2109922ee-utilities\") pod \"e9c81c59-ffdc-4b88-8fea-88a2109922ee\" (UID: \"e9c81c59-ffdc-4b88-8fea-88a2109922ee\") " Oct 09 15:31:12 crc kubenswrapper[4719]: I1009 15:31:12.385424 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9c81c59-ffdc-4b88-8fea-88a2109922ee-catalog-content\") pod \"e9c81c59-ffdc-4b88-8fea-88a2109922ee\" (UID: \"e9c81c59-ffdc-4b88-8fea-88a2109922ee\") " Oct 09 15:31:12 crc kubenswrapper[4719]: I1009 15:31:12.385502 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdmnx\" (UniqueName: \"kubernetes.io/projected/e9c81c59-ffdc-4b88-8fea-88a2109922ee-kube-api-access-kdmnx\") pod \"e9c81c59-ffdc-4b88-8fea-88a2109922ee\" (UID: \"e9c81c59-ffdc-4b88-8fea-88a2109922ee\") " Oct 09 15:31:12 crc kubenswrapper[4719]: I1009 15:31:12.386127 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9c81c59-ffdc-4b88-8fea-88a2109922ee-utilities" (OuterVolumeSpecName: "utilities") pod "e9c81c59-ffdc-4b88-8fea-88a2109922ee" (UID: "e9c81c59-ffdc-4b88-8fea-88a2109922ee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:31:12 crc kubenswrapper[4719]: I1009 15:31:12.390954 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9c81c59-ffdc-4b88-8fea-88a2109922ee-kube-api-access-kdmnx" (OuterVolumeSpecName: "kube-api-access-kdmnx") pod "e9c81c59-ffdc-4b88-8fea-88a2109922ee" (UID: "e9c81c59-ffdc-4b88-8fea-88a2109922ee"). InnerVolumeSpecName "kube-api-access-kdmnx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:31:12 crc kubenswrapper[4719]: I1009 15:31:12.413187 4719 generic.go:334] "Generic (PLEG): container finished" podID="e9c81c59-ffdc-4b88-8fea-88a2109922ee" containerID="af095ec87fcb250e7ecde86f5791fa9894a3cd021c7cd2db58231294329bef82" exitCode=0 Oct 09 15:31:12 crc kubenswrapper[4719]: I1009 15:31:12.413231 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sfk2s" Oct 09 15:31:12 crc kubenswrapper[4719]: I1009 15:31:12.413256 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sfk2s" event={"ID":"e9c81c59-ffdc-4b88-8fea-88a2109922ee","Type":"ContainerDied","Data":"af095ec87fcb250e7ecde86f5791fa9894a3cd021c7cd2db58231294329bef82"} Oct 09 15:31:12 crc kubenswrapper[4719]: I1009 15:31:12.413304 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sfk2s" event={"ID":"e9c81c59-ffdc-4b88-8fea-88a2109922ee","Type":"ContainerDied","Data":"5b980d1253fff41cad67c954137e3122665aa81e34c35e4ebfc7446082a28181"} Oct 09 15:31:12 crc kubenswrapper[4719]: I1009 15:31:12.413321 4719 scope.go:117] "RemoveContainer" containerID="af095ec87fcb250e7ecde86f5791fa9894a3cd021c7cd2db58231294329bef82" Oct 09 15:31:12 crc kubenswrapper[4719]: I1009 15:31:12.430312 4719 scope.go:117] "RemoveContainer" containerID="7a6e9b60f092e56ec3b068ac87b5222e0ba61e149e6c6db4c40ecf6556bc95d7" Oct 09 15:31:12 crc kubenswrapper[4719]: I1009 15:31:12.433119 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9c81c59-ffdc-4b88-8fea-88a2109922ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e9c81c59-ffdc-4b88-8fea-88a2109922ee" (UID: "e9c81c59-ffdc-4b88-8fea-88a2109922ee"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:31:12 crc kubenswrapper[4719]: I1009 15:31:12.442709 4719 scope.go:117] "RemoveContainer" containerID="756568e13dcde9c6ebd13fd4dc3f735f166d3e0d6f682a01cafe3e1597aa03b2" Oct 09 15:31:12 crc kubenswrapper[4719]: I1009 15:31:12.464247 4719 scope.go:117] "RemoveContainer" containerID="af095ec87fcb250e7ecde86f5791fa9894a3cd021c7cd2db58231294329bef82" Oct 09 15:31:12 crc kubenswrapper[4719]: E1009 15:31:12.464985 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af095ec87fcb250e7ecde86f5791fa9894a3cd021c7cd2db58231294329bef82\": container with ID starting with af095ec87fcb250e7ecde86f5791fa9894a3cd021c7cd2db58231294329bef82 not found: ID does not exist" containerID="af095ec87fcb250e7ecde86f5791fa9894a3cd021c7cd2db58231294329bef82" Oct 09 15:31:12 crc kubenswrapper[4719]: I1009 15:31:12.465107 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af095ec87fcb250e7ecde86f5791fa9894a3cd021c7cd2db58231294329bef82"} err="failed to get container status \"af095ec87fcb250e7ecde86f5791fa9894a3cd021c7cd2db58231294329bef82\": rpc error: code = NotFound desc = could not find container \"af095ec87fcb250e7ecde86f5791fa9894a3cd021c7cd2db58231294329bef82\": container with ID starting with af095ec87fcb250e7ecde86f5791fa9894a3cd021c7cd2db58231294329bef82 not found: ID does not exist" Oct 09 15:31:12 crc kubenswrapper[4719]: I1009 15:31:12.465161 4719 scope.go:117] "RemoveContainer" containerID="7a6e9b60f092e56ec3b068ac87b5222e0ba61e149e6c6db4c40ecf6556bc95d7" Oct 09 15:31:12 crc kubenswrapper[4719]: E1009 15:31:12.465389 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a6e9b60f092e56ec3b068ac87b5222e0ba61e149e6c6db4c40ecf6556bc95d7\": container with ID starting with 
7a6e9b60f092e56ec3b068ac87b5222e0ba61e149e6c6db4c40ecf6556bc95d7 not found: ID does not exist" containerID="7a6e9b60f092e56ec3b068ac87b5222e0ba61e149e6c6db4c40ecf6556bc95d7" Oct 09 15:31:12 crc kubenswrapper[4719]: I1009 15:31:12.465416 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a6e9b60f092e56ec3b068ac87b5222e0ba61e149e6c6db4c40ecf6556bc95d7"} err="failed to get container status \"7a6e9b60f092e56ec3b068ac87b5222e0ba61e149e6c6db4c40ecf6556bc95d7\": rpc error: code = NotFound desc = could not find container \"7a6e9b60f092e56ec3b068ac87b5222e0ba61e149e6c6db4c40ecf6556bc95d7\": container with ID starting with 7a6e9b60f092e56ec3b068ac87b5222e0ba61e149e6c6db4c40ecf6556bc95d7 not found: ID does not exist" Oct 09 15:31:12 crc kubenswrapper[4719]: I1009 15:31:12.465428 4719 scope.go:117] "RemoveContainer" containerID="756568e13dcde9c6ebd13fd4dc3f735f166d3e0d6f682a01cafe3e1597aa03b2" Oct 09 15:31:12 crc kubenswrapper[4719]: E1009 15:31:12.465705 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"756568e13dcde9c6ebd13fd4dc3f735f166d3e0d6f682a01cafe3e1597aa03b2\": container with ID starting with 756568e13dcde9c6ebd13fd4dc3f735f166d3e0d6f682a01cafe3e1597aa03b2 not found: ID does not exist" containerID="756568e13dcde9c6ebd13fd4dc3f735f166d3e0d6f682a01cafe3e1597aa03b2" Oct 09 15:31:12 crc kubenswrapper[4719]: I1009 15:31:12.465733 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"756568e13dcde9c6ebd13fd4dc3f735f166d3e0d6f682a01cafe3e1597aa03b2"} err="failed to get container status \"756568e13dcde9c6ebd13fd4dc3f735f166d3e0d6f682a01cafe3e1597aa03b2\": rpc error: code = NotFound desc = could not find container \"756568e13dcde9c6ebd13fd4dc3f735f166d3e0d6f682a01cafe3e1597aa03b2\": container with ID starting with 756568e13dcde9c6ebd13fd4dc3f735f166d3e0d6f682a01cafe3e1597aa03b2 not found: ID does not 
exist" Oct 09 15:31:12 crc kubenswrapper[4719]: I1009 15:31:12.486712 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9c81c59-ffdc-4b88-8fea-88a2109922ee-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 15:31:12 crc kubenswrapper[4719]: I1009 15:31:12.486759 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9c81c59-ffdc-4b88-8fea-88a2109922ee-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 15:31:12 crc kubenswrapper[4719]: I1009 15:31:12.486772 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdmnx\" (UniqueName: \"kubernetes.io/projected/e9c81c59-ffdc-4b88-8fea-88a2109922ee-kube-api-access-kdmnx\") on node \"crc\" DevicePath \"\"" Oct 09 15:31:12 crc kubenswrapper[4719]: I1009 15:31:12.742919 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sfk2s"] Oct 09 15:31:12 crc kubenswrapper[4719]: I1009 15:31:12.747096 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sfk2s"] Oct 09 15:31:13 crc kubenswrapper[4719]: I1009 15:31:13.062919 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-54fb4b8cc7-c4j9j" Oct 09 15:31:13 crc kubenswrapper[4719]: I1009 15:31:13.168289 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9c81c59-ffdc-4b88-8fea-88a2109922ee" path="/var/lib/kubelet/pods/e9c81c59-ffdc-4b88-8fea-88a2109922ee/volumes" Oct 09 15:31:13 crc kubenswrapper[4719]: I1009 15:31:13.878321 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-r8hz9"] Oct 09 15:31:13 crc kubenswrapper[4719]: E1009 15:31:13.878936 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a58b3b1-760e-414b-990a-d93989dbc0cd" containerName="registry-server" Oct 09 15:31:13 crc 
kubenswrapper[4719]: I1009 15:31:13.878953 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a58b3b1-760e-414b-990a-d93989dbc0cd" containerName="registry-server" Oct 09 15:31:13 crc kubenswrapper[4719]: E1009 15:31:13.878966 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9c81c59-ffdc-4b88-8fea-88a2109922ee" containerName="registry-server" Oct 09 15:31:13 crc kubenswrapper[4719]: I1009 15:31:13.878974 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9c81c59-ffdc-4b88-8fea-88a2109922ee" containerName="registry-server" Oct 09 15:31:13 crc kubenswrapper[4719]: E1009 15:31:13.878988 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a58b3b1-760e-414b-990a-d93989dbc0cd" containerName="extract-content" Oct 09 15:31:13 crc kubenswrapper[4719]: I1009 15:31:13.878996 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a58b3b1-760e-414b-990a-d93989dbc0cd" containerName="extract-content" Oct 09 15:31:13 crc kubenswrapper[4719]: E1009 15:31:13.879005 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9c81c59-ffdc-4b88-8fea-88a2109922ee" containerName="extract-content" Oct 09 15:31:13 crc kubenswrapper[4719]: I1009 15:31:13.879015 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9c81c59-ffdc-4b88-8fea-88a2109922ee" containerName="extract-content" Oct 09 15:31:13 crc kubenswrapper[4719]: E1009 15:31:13.879028 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9c81c59-ffdc-4b88-8fea-88a2109922ee" containerName="extract-utilities" Oct 09 15:31:13 crc kubenswrapper[4719]: I1009 15:31:13.879037 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9c81c59-ffdc-4b88-8fea-88a2109922ee" containerName="extract-utilities" Oct 09 15:31:13 crc kubenswrapper[4719]: E1009 15:31:13.879052 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a58b3b1-760e-414b-990a-d93989dbc0cd" containerName="extract-utilities" Oct 09 15:31:13 crc 
kubenswrapper[4719]: I1009 15:31:13.879060 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a58b3b1-760e-414b-990a-d93989dbc0cd" containerName="extract-utilities" Oct 09 15:31:13 crc kubenswrapper[4719]: I1009 15:31:13.879181 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9c81c59-ffdc-4b88-8fea-88a2109922ee" containerName="registry-server" Oct 09 15:31:13 crc kubenswrapper[4719]: I1009 15:31:13.879192 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a58b3b1-760e-414b-990a-d93989dbc0cd" containerName="registry-server" Oct 09 15:31:13 crc kubenswrapper[4719]: I1009 15:31:13.881497 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-r8hz9" Oct 09 15:31:13 crc kubenswrapper[4719]: I1009 15:31:13.885678 4719 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-fsbr4" Oct 09 15:31:13 crc kubenswrapper[4719]: I1009 15:31:13.885678 4719 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 09 15:31:13 crc kubenswrapper[4719]: I1009 15:31:13.885721 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 09 15:31:13 crc kubenswrapper[4719]: I1009 15:31:13.888135 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-wpvvd"] Oct 09 15:31:13 crc kubenswrapper[4719]: I1009 15:31:13.888999 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-wpvvd" Oct 09 15:31:13 crc kubenswrapper[4719]: I1009 15:31:13.890449 4719 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 09 15:31:13 crc kubenswrapper[4719]: I1009 15:31:13.900843 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-wpvvd"] Oct 09 15:31:13 crc kubenswrapper[4719]: I1009 15:31:13.971967 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-9qhkk"] Oct 09 15:31:13 crc kubenswrapper[4719]: I1009 15:31:13.973241 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-9qhkk" Oct 09 15:31:13 crc kubenswrapper[4719]: I1009 15:31:13.975967 4719 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 09 15:31:13 crc kubenswrapper[4719]: I1009 15:31:13.976882 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 09 15:31:13 crc kubenswrapper[4719]: I1009 15:31:13.977056 4719 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-lfpt7" Oct 09 15:31:13 crc kubenswrapper[4719]: I1009 15:31:13.982476 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-42kr4"] Oct 09 15:31:13 crc kubenswrapper[4719]: I1009 15:31:13.983606 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-42kr4" Oct 09 15:31:13 crc kubenswrapper[4719]: I1009 15:31:13.989217 4719 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 09 15:31:13 crc kubenswrapper[4719]: I1009 15:31:13.989501 4719 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 09 15:31:13 crc kubenswrapper[4719]: I1009 15:31:13.995632 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-42kr4"] Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.010016 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9bqc\" (UniqueName: \"kubernetes.io/projected/f1c10c76-5d7a-4dbc-8688-2017821c1872-kube-api-access-l9bqc\") pod \"frr-k8s-r8hz9\" (UID: \"f1c10c76-5d7a-4dbc-8688-2017821c1872\") " pod="metallb-system/frr-k8s-r8hz9" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.010263 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/f1c10c76-5d7a-4dbc-8688-2017821c1872-frr-conf\") pod \"frr-k8s-r8hz9\" (UID: \"f1c10c76-5d7a-4dbc-8688-2017821c1872\") " pod="metallb-system/frr-k8s-r8hz9" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.010406 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/f1c10c76-5d7a-4dbc-8688-2017821c1872-frr-startup\") pod \"frr-k8s-r8hz9\" (UID: \"f1c10c76-5d7a-4dbc-8688-2017821c1872\") " pod="metallb-system/frr-k8s-r8hz9" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.010500 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvnd6\" (UniqueName: 
\"kubernetes.io/projected/1f525824-038f-49b3-9410-10b49819ee01-kube-api-access-qvnd6\") pod \"frr-k8s-webhook-server-64bf5d555-wpvvd\" (UID: \"1f525824-038f-49b3-9410-10b49819ee01\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-wpvvd" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.010600 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/f1c10c76-5d7a-4dbc-8688-2017821c1872-metrics\") pod \"frr-k8s-r8hz9\" (UID: \"f1c10c76-5d7a-4dbc-8688-2017821c1872\") " pod="metallb-system/frr-k8s-r8hz9" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.010701 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1f525824-038f-49b3-9410-10b49819ee01-cert\") pod \"frr-k8s-webhook-server-64bf5d555-wpvvd\" (UID: \"1f525824-038f-49b3-9410-10b49819ee01\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-wpvvd" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.010788 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f1c10c76-5d7a-4dbc-8688-2017821c1872-metrics-certs\") pod \"frr-k8s-r8hz9\" (UID: \"f1c10c76-5d7a-4dbc-8688-2017821c1872\") " pod="metallb-system/frr-k8s-r8hz9" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.010901 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/f1c10c76-5d7a-4dbc-8688-2017821c1872-frr-sockets\") pod \"frr-k8s-r8hz9\" (UID: \"f1c10c76-5d7a-4dbc-8688-2017821c1872\") " pod="metallb-system/frr-k8s-r8hz9" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.010995 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: 
\"kubernetes.io/empty-dir/f1c10c76-5d7a-4dbc-8688-2017821c1872-reloader\") pod \"frr-k8s-r8hz9\" (UID: \"f1c10c76-5d7a-4dbc-8688-2017821c1872\") " pod="metallb-system/frr-k8s-r8hz9" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.112189 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/f1c10c76-5d7a-4dbc-8688-2017821c1872-frr-sockets\") pod \"frr-k8s-r8hz9\" (UID: \"f1c10c76-5d7a-4dbc-8688-2017821c1872\") " pod="metallb-system/frr-k8s-r8hz9" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.112233 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/f1c10c76-5d7a-4dbc-8688-2017821c1872-reloader\") pod \"frr-k8s-r8hz9\" (UID: \"f1c10c76-5d7a-4dbc-8688-2017821c1872\") " pod="metallb-system/frr-k8s-r8hz9" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.112253 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bee3449b-a86a-4db6-9e57-233f95dfbad0-metrics-certs\") pod \"controller-68d546b9d8-42kr4\" (UID: \"bee3449b-a86a-4db6-9e57-233f95dfbad0\") " pod="metallb-system/controller-68d546b9d8-42kr4" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.112272 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6298dd42-080d-4d5e-bf61-c798382943a7-metrics-certs\") pod \"speaker-9qhkk\" (UID: \"6298dd42-080d-4d5e-bf61-c798382943a7\") " pod="metallb-system/speaker-9qhkk" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.112293 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6298dd42-080d-4d5e-bf61-c798382943a7-metallb-excludel2\") pod \"speaker-9qhkk\" (UID: 
\"6298dd42-080d-4d5e-bf61-c798382943a7\") " pod="metallb-system/speaker-9qhkk" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.112311 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6298dd42-080d-4d5e-bf61-c798382943a7-memberlist\") pod \"speaker-9qhkk\" (UID: \"6298dd42-080d-4d5e-bf61-c798382943a7\") " pod="metallb-system/speaker-9qhkk" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.112337 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9bqc\" (UniqueName: \"kubernetes.io/projected/f1c10c76-5d7a-4dbc-8688-2017821c1872-kube-api-access-l9bqc\") pod \"frr-k8s-r8hz9\" (UID: \"f1c10c76-5d7a-4dbc-8688-2017821c1872\") " pod="metallb-system/frr-k8s-r8hz9" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.112373 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49d95\" (UniqueName: \"kubernetes.io/projected/6298dd42-080d-4d5e-bf61-c798382943a7-kube-api-access-49d95\") pod \"speaker-9qhkk\" (UID: \"6298dd42-080d-4d5e-bf61-c798382943a7\") " pod="metallb-system/speaker-9qhkk" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.112389 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/f1c10c76-5d7a-4dbc-8688-2017821c1872-frr-conf\") pod \"frr-k8s-r8hz9\" (UID: \"f1c10c76-5d7a-4dbc-8688-2017821c1872\") " pod="metallb-system/frr-k8s-r8hz9" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.112403 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bee3449b-a86a-4db6-9e57-233f95dfbad0-cert\") pod \"controller-68d546b9d8-42kr4\" (UID: \"bee3449b-a86a-4db6-9e57-233f95dfbad0\") " pod="metallb-system/controller-68d546b9d8-42kr4" Oct 09 15:31:14 crc 
kubenswrapper[4719]: I1009 15:31:14.112422 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/f1c10c76-5d7a-4dbc-8688-2017821c1872-frr-startup\") pod \"frr-k8s-r8hz9\" (UID: \"f1c10c76-5d7a-4dbc-8688-2017821c1872\") " pod="metallb-system/frr-k8s-r8hz9" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.112459 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvnd6\" (UniqueName: \"kubernetes.io/projected/1f525824-038f-49b3-9410-10b49819ee01-kube-api-access-qvnd6\") pod \"frr-k8s-webhook-server-64bf5d555-wpvvd\" (UID: \"1f525824-038f-49b3-9410-10b49819ee01\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-wpvvd" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.112480 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr78d\" (UniqueName: \"kubernetes.io/projected/bee3449b-a86a-4db6-9e57-233f95dfbad0-kube-api-access-lr78d\") pod \"controller-68d546b9d8-42kr4\" (UID: \"bee3449b-a86a-4db6-9e57-233f95dfbad0\") " pod="metallb-system/controller-68d546b9d8-42kr4" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.112503 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/f1c10c76-5d7a-4dbc-8688-2017821c1872-metrics\") pod \"frr-k8s-r8hz9\" (UID: \"f1c10c76-5d7a-4dbc-8688-2017821c1872\") " pod="metallb-system/frr-k8s-r8hz9" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.112522 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1f525824-038f-49b3-9410-10b49819ee01-cert\") pod \"frr-k8s-webhook-server-64bf5d555-wpvvd\" (UID: \"1f525824-038f-49b3-9410-10b49819ee01\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-wpvvd" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.112546 4719 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f1c10c76-5d7a-4dbc-8688-2017821c1872-metrics-certs\") pod \"frr-k8s-r8hz9\" (UID: \"f1c10c76-5d7a-4dbc-8688-2017821c1872\") " pod="metallb-system/frr-k8s-r8hz9" Oct 09 15:31:14 crc kubenswrapper[4719]: E1009 15:31:14.112665 4719 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Oct 09 15:31:14 crc kubenswrapper[4719]: E1009 15:31:14.112713 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1c10c76-5d7a-4dbc-8688-2017821c1872-metrics-certs podName:f1c10c76-5d7a-4dbc-8688-2017821c1872 nodeName:}" failed. No retries permitted until 2025-10-09 15:31:14.612698507 +0000 UTC m=+780.122409792 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f1c10c76-5d7a-4dbc-8688-2017821c1872-metrics-certs") pod "frr-k8s-r8hz9" (UID: "f1c10c76-5d7a-4dbc-8688-2017821c1872") : secret "frr-k8s-certs-secret" not found Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.113548 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/f1c10c76-5d7a-4dbc-8688-2017821c1872-frr-sockets\") pod \"frr-k8s-r8hz9\" (UID: \"f1c10c76-5d7a-4dbc-8688-2017821c1872\") " pod="metallb-system/frr-k8s-r8hz9" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.113652 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/f1c10c76-5d7a-4dbc-8688-2017821c1872-metrics\") pod \"frr-k8s-r8hz9\" (UID: \"f1c10c76-5d7a-4dbc-8688-2017821c1872\") " pod="metallb-system/frr-k8s-r8hz9" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.113948 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: 
\"kubernetes.io/empty-dir/f1c10c76-5d7a-4dbc-8688-2017821c1872-frr-conf\") pod \"frr-k8s-r8hz9\" (UID: \"f1c10c76-5d7a-4dbc-8688-2017821c1872\") " pod="metallb-system/frr-k8s-r8hz9" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.114024 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/f1c10c76-5d7a-4dbc-8688-2017821c1872-reloader\") pod \"frr-k8s-r8hz9\" (UID: \"f1c10c76-5d7a-4dbc-8688-2017821c1872\") " pod="metallb-system/frr-k8s-r8hz9" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.114205 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/f1c10c76-5d7a-4dbc-8688-2017821c1872-frr-startup\") pod \"frr-k8s-r8hz9\" (UID: \"f1c10c76-5d7a-4dbc-8688-2017821c1872\") " pod="metallb-system/frr-k8s-r8hz9" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.116418 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1f525824-038f-49b3-9410-10b49819ee01-cert\") pod \"frr-k8s-webhook-server-64bf5d555-wpvvd\" (UID: \"1f525824-038f-49b3-9410-10b49819ee01\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-wpvvd" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.142972 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9bqc\" (UniqueName: \"kubernetes.io/projected/f1c10c76-5d7a-4dbc-8688-2017821c1872-kube-api-access-l9bqc\") pod \"frr-k8s-r8hz9\" (UID: \"f1c10c76-5d7a-4dbc-8688-2017821c1872\") " pod="metallb-system/frr-k8s-r8hz9" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.145661 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvnd6\" (UniqueName: \"kubernetes.io/projected/1f525824-038f-49b3-9410-10b49819ee01-kube-api-access-qvnd6\") pod \"frr-k8s-webhook-server-64bf5d555-wpvvd\" (UID: \"1f525824-038f-49b3-9410-10b49819ee01\") " 
pod="metallb-system/frr-k8s-webhook-server-64bf5d555-wpvvd" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.213765 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr78d\" (UniqueName: \"kubernetes.io/projected/bee3449b-a86a-4db6-9e57-233f95dfbad0-kube-api-access-lr78d\") pod \"controller-68d546b9d8-42kr4\" (UID: \"bee3449b-a86a-4db6-9e57-233f95dfbad0\") " pod="metallb-system/controller-68d546b9d8-42kr4" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.213885 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bee3449b-a86a-4db6-9e57-233f95dfbad0-metrics-certs\") pod \"controller-68d546b9d8-42kr4\" (UID: \"bee3449b-a86a-4db6-9e57-233f95dfbad0\") " pod="metallb-system/controller-68d546b9d8-42kr4" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.213911 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6298dd42-080d-4d5e-bf61-c798382943a7-metrics-certs\") pod \"speaker-9qhkk\" (UID: \"6298dd42-080d-4d5e-bf61-c798382943a7\") " pod="metallb-system/speaker-9qhkk" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.213941 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6298dd42-080d-4d5e-bf61-c798382943a7-metallb-excludel2\") pod \"speaker-9qhkk\" (UID: \"6298dd42-080d-4d5e-bf61-c798382943a7\") " pod="metallb-system/speaker-9qhkk" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.213964 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6298dd42-080d-4d5e-bf61-c798382943a7-memberlist\") pod \"speaker-9qhkk\" (UID: \"6298dd42-080d-4d5e-bf61-c798382943a7\") " pod="metallb-system/speaker-9qhkk" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.214007 
4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49d95\" (UniqueName: \"kubernetes.io/projected/6298dd42-080d-4d5e-bf61-c798382943a7-kube-api-access-49d95\") pod \"speaker-9qhkk\" (UID: \"6298dd42-080d-4d5e-bf61-c798382943a7\") " pod="metallb-system/speaker-9qhkk" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.214028 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bee3449b-a86a-4db6-9e57-233f95dfbad0-cert\") pod \"controller-68d546b9d8-42kr4\" (UID: \"bee3449b-a86a-4db6-9e57-233f95dfbad0\") " pod="metallb-system/controller-68d546b9d8-42kr4" Oct 09 15:31:14 crc kubenswrapper[4719]: E1009 15:31:14.214336 4719 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 09 15:31:14 crc kubenswrapper[4719]: E1009 15:31:14.214567 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6298dd42-080d-4d5e-bf61-c798382943a7-memberlist podName:6298dd42-080d-4d5e-bf61-c798382943a7 nodeName:}" failed. No retries permitted until 2025-10-09 15:31:14.714549543 +0000 UTC m=+780.224260828 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/6298dd42-080d-4d5e-bf61-c798382943a7-memberlist") pod "speaker-9qhkk" (UID: "6298dd42-080d-4d5e-bf61-c798382943a7") : secret "metallb-memberlist" not found Oct 09 15:31:14 crc kubenswrapper[4719]: E1009 15:31:14.214499 4719 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Oct 09 15:31:14 crc kubenswrapper[4719]: E1009 15:31:14.214744 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bee3449b-a86a-4db6-9e57-233f95dfbad0-metrics-certs podName:bee3449b-a86a-4db6-9e57-233f95dfbad0 nodeName:}" failed. 
No retries permitted until 2025-10-09 15:31:14.714735929 +0000 UTC m=+780.224447214 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bee3449b-a86a-4db6-9e57-233f95dfbad0-metrics-certs") pod "controller-68d546b9d8-42kr4" (UID: "bee3449b-a86a-4db6-9e57-233f95dfbad0") : secret "controller-certs-secret" not found Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.214762 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6298dd42-080d-4d5e-bf61-c798382943a7-metallb-excludel2\") pod \"speaker-9qhkk\" (UID: \"6298dd42-080d-4d5e-bf61-c798382943a7\") " pod="metallb-system/speaker-9qhkk" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.215481 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-wpvvd" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.217122 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bee3449b-a86a-4db6-9e57-233f95dfbad0-cert\") pod \"controller-68d546b9d8-42kr4\" (UID: \"bee3449b-a86a-4db6-9e57-233f95dfbad0\") " pod="metallb-system/controller-68d546b9d8-42kr4" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.217444 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6298dd42-080d-4d5e-bf61-c798382943a7-metrics-certs\") pod \"speaker-9qhkk\" (UID: \"6298dd42-080d-4d5e-bf61-c798382943a7\") " pod="metallb-system/speaker-9qhkk" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.230264 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr78d\" (UniqueName: \"kubernetes.io/projected/bee3449b-a86a-4db6-9e57-233f95dfbad0-kube-api-access-lr78d\") pod \"controller-68d546b9d8-42kr4\" (UID: 
\"bee3449b-a86a-4db6-9e57-233f95dfbad0\") " pod="metallb-system/controller-68d546b9d8-42kr4" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.237970 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49d95\" (UniqueName: \"kubernetes.io/projected/6298dd42-080d-4d5e-bf61-c798382943a7-kube-api-access-49d95\") pod \"speaker-9qhkk\" (UID: \"6298dd42-080d-4d5e-bf61-c798382943a7\") " pod="metallb-system/speaker-9qhkk" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.620905 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f1c10c76-5d7a-4dbc-8688-2017821c1872-metrics-certs\") pod \"frr-k8s-r8hz9\" (UID: \"f1c10c76-5d7a-4dbc-8688-2017821c1872\") " pod="metallb-system/frr-k8s-r8hz9" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.627695 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f1c10c76-5d7a-4dbc-8688-2017821c1872-metrics-certs\") pod \"frr-k8s-r8hz9\" (UID: \"f1c10c76-5d7a-4dbc-8688-2017821c1872\") " pod="metallb-system/frr-k8s-r8hz9" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.640869 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-wpvvd"] Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.721828 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bee3449b-a86a-4db6-9e57-233f95dfbad0-metrics-certs\") pod \"controller-68d546b9d8-42kr4\" (UID: \"bee3449b-a86a-4db6-9e57-233f95dfbad0\") " pod="metallb-system/controller-68d546b9d8-42kr4" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.721898 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6298dd42-080d-4d5e-bf61-c798382943a7-memberlist\") pod 
\"speaker-9qhkk\" (UID: \"6298dd42-080d-4d5e-bf61-c798382943a7\") " pod="metallb-system/speaker-9qhkk" Oct 09 15:31:14 crc kubenswrapper[4719]: E1009 15:31:14.722066 4719 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 09 15:31:14 crc kubenswrapper[4719]: E1009 15:31:14.722125 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6298dd42-080d-4d5e-bf61-c798382943a7-memberlist podName:6298dd42-080d-4d5e-bf61-c798382943a7 nodeName:}" failed. No retries permitted until 2025-10-09 15:31:15.72210846 +0000 UTC m=+781.231819745 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/6298dd42-080d-4d5e-bf61-c798382943a7-memberlist") pod "speaker-9qhkk" (UID: "6298dd42-080d-4d5e-bf61-c798382943a7") : secret "metallb-memberlist" not found Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.725030 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bee3449b-a86a-4db6-9e57-233f95dfbad0-metrics-certs\") pod \"controller-68d546b9d8-42kr4\" (UID: \"bee3449b-a86a-4db6-9e57-233f95dfbad0\") " pod="metallb-system/controller-68d546b9d8-42kr4" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.796854 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-r8hz9" Oct 09 15:31:14 crc kubenswrapper[4719]: I1009 15:31:14.903934 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-42kr4" Oct 09 15:31:15 crc kubenswrapper[4719]: I1009 15:31:15.288921 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-42kr4"] Oct 09 15:31:15 crc kubenswrapper[4719]: W1009 15:31:15.294457 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbee3449b_a86a_4db6_9e57_233f95dfbad0.slice/crio-60df26c41b371758c591ad76bd0416ddb9fb3562291ea5375d7c0a85dceb9e5a WatchSource:0}: Error finding container 60df26c41b371758c591ad76bd0416ddb9fb3562291ea5375d7c0a85dceb9e5a: Status 404 returned error can't find the container with id 60df26c41b371758c591ad76bd0416ddb9fb3562291ea5375d7c0a85dceb9e5a Oct 09 15:31:15 crc kubenswrapper[4719]: I1009 15:31:15.441902 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-42kr4" event={"ID":"bee3449b-a86a-4db6-9e57-233f95dfbad0","Type":"ContainerStarted","Data":"60df26c41b371758c591ad76bd0416ddb9fb3562291ea5375d7c0a85dceb9e5a"} Oct 09 15:31:15 crc kubenswrapper[4719]: I1009 15:31:15.442801 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-wpvvd" event={"ID":"1f525824-038f-49b3-9410-10b49819ee01","Type":"ContainerStarted","Data":"a21c811a2fb7358059017112f8b78cb91a0036991413afa33c8d6b374c1b05e7"} Oct 09 15:31:15 crc kubenswrapper[4719]: I1009 15:31:15.443957 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r8hz9" event={"ID":"f1c10c76-5d7a-4dbc-8688-2017821c1872","Type":"ContainerStarted","Data":"1e5a35c2970f0c0de456ed66b89b737e9a096bab7d1f021a859585719cf93091"} Oct 09 15:31:15 crc kubenswrapper[4719]: I1009 15:31:15.738489 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6298dd42-080d-4d5e-bf61-c798382943a7-memberlist\") pod 
\"speaker-9qhkk\" (UID: \"6298dd42-080d-4d5e-bf61-c798382943a7\") " pod="metallb-system/speaker-9qhkk" Oct 09 15:31:15 crc kubenswrapper[4719]: I1009 15:31:15.749599 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6298dd42-080d-4d5e-bf61-c798382943a7-memberlist\") pod \"speaker-9qhkk\" (UID: \"6298dd42-080d-4d5e-bf61-c798382943a7\") " pod="metallb-system/speaker-9qhkk" Oct 09 15:31:15 crc kubenswrapper[4719]: I1009 15:31:15.801568 4719 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-lfpt7" Oct 09 15:31:15 crc kubenswrapper[4719]: I1009 15:31:15.810108 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-9qhkk" Oct 09 15:31:16 crc kubenswrapper[4719]: I1009 15:31:16.471444 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-42kr4" event={"ID":"bee3449b-a86a-4db6-9e57-233f95dfbad0","Type":"ContainerStarted","Data":"eef2bc68d09ab00f7f3db0402b6f06749458e270e9cb04d942ce1436996f14a8"} Oct 09 15:31:16 crc kubenswrapper[4719]: I1009 15:31:16.474300 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-42kr4" event={"ID":"bee3449b-a86a-4db6-9e57-233f95dfbad0","Type":"ContainerStarted","Data":"56483592dd88aab8b4b24eee3d4ceedd11e634746f0c6111656af75d48dbaf75"} Oct 09 15:31:16 crc kubenswrapper[4719]: I1009 15:31:16.474333 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-42kr4" Oct 09 15:31:16 crc kubenswrapper[4719]: I1009 15:31:16.477879 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-9qhkk" event={"ID":"6298dd42-080d-4d5e-bf61-c798382943a7","Type":"ContainerStarted","Data":"679251ad50ddf21907de5f13994ede3d9190a65ccfab6226e0ee8756f96eff7d"} Oct 09 15:31:16 crc kubenswrapper[4719]: I1009 15:31:16.477938 4719 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-9qhkk" event={"ID":"6298dd42-080d-4d5e-bf61-c798382943a7","Type":"ContainerStarted","Data":"7e88d2581df65faf53dfe1dc7cceb5c7c3bcd435a0fef2c4cebdb3b3db7b5d5f"} Oct 09 15:31:16 crc kubenswrapper[4719]: I1009 15:31:16.477955 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-9qhkk" event={"ID":"6298dd42-080d-4d5e-bf61-c798382943a7","Type":"ContainerStarted","Data":"f9b710d4172f9fab589d4ea2f0a338f4db788046c8745b510c73d4147c9b8fe4"} Oct 09 15:31:16 crc kubenswrapper[4719]: I1009 15:31:16.478361 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-9qhkk" Oct 09 15:31:16 crc kubenswrapper[4719]: I1009 15:31:16.504044 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-42kr4" podStartSLOduration=3.5040258189999998 podStartE2EDuration="3.504025819s" podCreationTimestamp="2025-10-09 15:31:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:31:16.49653033 +0000 UTC m=+782.006241635" watchObservedRunningTime="2025-10-09 15:31:16.504025819 +0000 UTC m=+782.013737104" Oct 09 15:31:22 crc kubenswrapper[4719]: I1009 15:31:22.575067 4719 generic.go:334] "Generic (PLEG): container finished" podID="f1c10c76-5d7a-4dbc-8688-2017821c1872" containerID="23cb00224585f5a9c57915180373a4f9053121fa7920c3310529bbcad788a0a2" exitCode=0 Oct 09 15:31:22 crc kubenswrapper[4719]: I1009 15:31:22.575733 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r8hz9" event={"ID":"f1c10c76-5d7a-4dbc-8688-2017821c1872","Type":"ContainerDied","Data":"23cb00224585f5a9c57915180373a4f9053121fa7920c3310529bbcad788a0a2"} Oct 09 15:31:22 crc kubenswrapper[4719]: I1009 15:31:22.578335 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-webhook-server-64bf5d555-wpvvd" event={"ID":"1f525824-038f-49b3-9410-10b49819ee01","Type":"ContainerStarted","Data":"6cc693836bfcb9554bc94e041c59bce0153c5fe468b37d12688b448288b44059"} Oct 09 15:31:22 crc kubenswrapper[4719]: I1009 15:31:22.578924 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-wpvvd" Oct 09 15:31:22 crc kubenswrapper[4719]: I1009 15:31:22.608546 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-9qhkk" podStartSLOduration=9.608525334 podStartE2EDuration="9.608525334s" podCreationTimestamp="2025-10-09 15:31:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:31:16.529519704 +0000 UTC m=+782.039230999" watchObservedRunningTime="2025-10-09 15:31:22.608525334 +0000 UTC m=+788.118236629" Oct 09 15:31:22 crc kubenswrapper[4719]: I1009 15:31:22.621002 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-wpvvd" podStartSLOduration=2.8366842549999998 podStartE2EDuration="9.620983803s" podCreationTimestamp="2025-10-09 15:31:13 +0000 UTC" firstStartedPulling="2025-10-09 15:31:14.649141068 +0000 UTC m=+780.158852353" lastFinishedPulling="2025-10-09 15:31:21.433440616 +0000 UTC m=+786.943151901" observedRunningTime="2025-10-09 15:31:22.618450782 +0000 UTC m=+788.128162077" watchObservedRunningTime="2025-10-09 15:31:22.620983803 +0000 UTC m=+788.130695108" Oct 09 15:31:23 crc kubenswrapper[4719]: I1009 15:31:23.587060 4719 generic.go:334] "Generic (PLEG): container finished" podID="f1c10c76-5d7a-4dbc-8688-2017821c1872" containerID="8c22076658a8143bf245016da65cd6a1837fedbd84948d3b67c9405ad82fa6a8" exitCode=0 Oct 09 15:31:23 crc kubenswrapper[4719]: I1009 15:31:23.587160 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-r8hz9" event={"ID":"f1c10c76-5d7a-4dbc-8688-2017821c1872","Type":"ContainerDied","Data":"8c22076658a8143bf245016da65cd6a1837fedbd84948d3b67c9405ad82fa6a8"} Oct 09 15:31:24 crc kubenswrapper[4719]: I1009 15:31:24.596515 4719 generic.go:334] "Generic (PLEG): container finished" podID="f1c10c76-5d7a-4dbc-8688-2017821c1872" containerID="ec5fa03eb2464bb20eea0f0eceb1f20de8ce19f47e9fc2510dfbdf5be9191544" exitCode=0 Oct 09 15:31:24 crc kubenswrapper[4719]: I1009 15:31:24.596599 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r8hz9" event={"ID":"f1c10c76-5d7a-4dbc-8688-2017821c1872","Type":"ContainerDied","Data":"ec5fa03eb2464bb20eea0f0eceb1f20de8ce19f47e9fc2510dfbdf5be9191544"} Oct 09 15:31:25 crc kubenswrapper[4719]: I1009 15:31:25.605410 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r8hz9" event={"ID":"f1c10c76-5d7a-4dbc-8688-2017821c1872","Type":"ContainerStarted","Data":"d678457f23abe5d12041a186dd09dd00475882c7e0e72942eed1cbd48488ddc5"} Oct 09 15:31:25 crc kubenswrapper[4719]: I1009 15:31:25.605457 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r8hz9" event={"ID":"f1c10c76-5d7a-4dbc-8688-2017821c1872","Type":"ContainerStarted","Data":"35e259832bb1e83aafc9ffb6e76783b47b33a5759448ce790c9ad0e438bb8a13"} Oct 09 15:31:25 crc kubenswrapper[4719]: I1009 15:31:25.605471 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r8hz9" event={"ID":"f1c10c76-5d7a-4dbc-8688-2017821c1872","Type":"ContainerStarted","Data":"3d421d88095f6a3d49fa23c5242310ebe9a9d7259b285abce8419f81280fb72e"} Oct 09 15:31:25 crc kubenswrapper[4719]: I1009 15:31:25.605481 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r8hz9" event={"ID":"f1c10c76-5d7a-4dbc-8688-2017821c1872","Type":"ContainerStarted","Data":"44aa4fb23c5fbe1fcfe022e4c3e42d6f1053628a8cec1ad7d061f7a225c8b801"} Oct 09 15:31:26 crc 
kubenswrapper[4719]: I1009 15:31:26.615210 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r8hz9" event={"ID":"f1c10c76-5d7a-4dbc-8688-2017821c1872","Type":"ContainerStarted","Data":"3a2d2fe751272ab81e9246de799d3587106b36182276f5364a9bf666f7675996"} Oct 09 15:31:26 crc kubenswrapper[4719]: I1009 15:31:26.615500 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r8hz9" event={"ID":"f1c10c76-5d7a-4dbc-8688-2017821c1872","Type":"ContainerStarted","Data":"41a5a395c2dcfe426fdaf86a0a59250922ab706d8a37be84c09ad34afd859363"} Oct 09 15:31:26 crc kubenswrapper[4719]: I1009 15:31:26.615544 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-r8hz9" Oct 09 15:31:26 crc kubenswrapper[4719]: I1009 15:31:26.646316 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-r8hz9" podStartSLOduration=7.082811479 podStartE2EDuration="13.646299578s" podCreationTimestamp="2025-10-09 15:31:13 +0000 UTC" firstStartedPulling="2025-10-09 15:31:14.897821848 +0000 UTC m=+780.407533133" lastFinishedPulling="2025-10-09 15:31:21.461309947 +0000 UTC m=+786.971021232" observedRunningTime="2025-10-09 15:31:26.64229641 +0000 UTC m=+792.152007715" watchObservedRunningTime="2025-10-09 15:31:26.646299578 +0000 UTC m=+792.156010863" Oct 09 15:31:28 crc kubenswrapper[4719]: I1009 15:31:28.528224 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mx9bg"] Oct 09 15:31:28 crc kubenswrapper[4719]: I1009 15:31:28.529565 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mx9bg" Oct 09 15:31:28 crc kubenswrapper[4719]: I1009 15:31:28.540883 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06db29ff-605f-4e2d-894c-b8380f5ae7b7-utilities\") pod \"redhat-marketplace-mx9bg\" (UID: \"06db29ff-605f-4e2d-894c-b8380f5ae7b7\") " pod="openshift-marketplace/redhat-marketplace-mx9bg" Oct 09 15:31:28 crc kubenswrapper[4719]: I1009 15:31:28.540935 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06db29ff-605f-4e2d-894c-b8380f5ae7b7-catalog-content\") pod \"redhat-marketplace-mx9bg\" (UID: \"06db29ff-605f-4e2d-894c-b8380f5ae7b7\") " pod="openshift-marketplace/redhat-marketplace-mx9bg" Oct 09 15:31:28 crc kubenswrapper[4719]: I1009 15:31:28.541010 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7ln2\" (UniqueName: \"kubernetes.io/projected/06db29ff-605f-4e2d-894c-b8380f5ae7b7-kube-api-access-k7ln2\") pod \"redhat-marketplace-mx9bg\" (UID: \"06db29ff-605f-4e2d-894c-b8380f5ae7b7\") " pod="openshift-marketplace/redhat-marketplace-mx9bg" Oct 09 15:31:28 crc kubenswrapper[4719]: I1009 15:31:28.546416 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mx9bg"] Oct 09 15:31:28 crc kubenswrapper[4719]: I1009 15:31:28.642173 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06db29ff-605f-4e2d-894c-b8380f5ae7b7-catalog-content\") pod \"redhat-marketplace-mx9bg\" (UID: \"06db29ff-605f-4e2d-894c-b8380f5ae7b7\") " pod="openshift-marketplace/redhat-marketplace-mx9bg" Oct 09 15:31:28 crc kubenswrapper[4719]: I1009 15:31:28.642406 4719 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-k7ln2\" (UniqueName: \"kubernetes.io/projected/06db29ff-605f-4e2d-894c-b8380f5ae7b7-kube-api-access-k7ln2\") pod \"redhat-marketplace-mx9bg\" (UID: \"06db29ff-605f-4e2d-894c-b8380f5ae7b7\") " pod="openshift-marketplace/redhat-marketplace-mx9bg" Oct 09 15:31:28 crc kubenswrapper[4719]: I1009 15:31:28.642464 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06db29ff-605f-4e2d-894c-b8380f5ae7b7-utilities\") pod \"redhat-marketplace-mx9bg\" (UID: \"06db29ff-605f-4e2d-894c-b8380f5ae7b7\") " pod="openshift-marketplace/redhat-marketplace-mx9bg" Oct 09 15:31:28 crc kubenswrapper[4719]: I1009 15:31:28.643000 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06db29ff-605f-4e2d-894c-b8380f5ae7b7-utilities\") pod \"redhat-marketplace-mx9bg\" (UID: \"06db29ff-605f-4e2d-894c-b8380f5ae7b7\") " pod="openshift-marketplace/redhat-marketplace-mx9bg" Oct 09 15:31:28 crc kubenswrapper[4719]: I1009 15:31:28.643003 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06db29ff-605f-4e2d-894c-b8380f5ae7b7-catalog-content\") pod \"redhat-marketplace-mx9bg\" (UID: \"06db29ff-605f-4e2d-894c-b8380f5ae7b7\") " pod="openshift-marketplace/redhat-marketplace-mx9bg" Oct 09 15:31:28 crc kubenswrapper[4719]: I1009 15:31:28.661599 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7ln2\" (UniqueName: \"kubernetes.io/projected/06db29ff-605f-4e2d-894c-b8380f5ae7b7-kube-api-access-k7ln2\") pod \"redhat-marketplace-mx9bg\" (UID: \"06db29ff-605f-4e2d-894c-b8380f5ae7b7\") " pod="openshift-marketplace/redhat-marketplace-mx9bg" Oct 09 15:31:28 crc kubenswrapper[4719]: I1009 15:31:28.845718 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mx9bg" Oct 09 15:31:29 crc kubenswrapper[4719]: I1009 15:31:29.233874 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mx9bg"] Oct 09 15:31:29 crc kubenswrapper[4719]: W1009 15:31:29.246528 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06db29ff_605f_4e2d_894c_b8380f5ae7b7.slice/crio-3db432fccb4b23c546ca6ed5458270732208ba875c8033fbd9824b928a378da6 WatchSource:0}: Error finding container 3db432fccb4b23c546ca6ed5458270732208ba875c8033fbd9824b928a378da6: Status 404 returned error can't find the container with id 3db432fccb4b23c546ca6ed5458270732208ba875c8033fbd9824b928a378da6 Oct 09 15:31:29 crc kubenswrapper[4719]: I1009 15:31:29.632120 4719 generic.go:334] "Generic (PLEG): container finished" podID="06db29ff-605f-4e2d-894c-b8380f5ae7b7" containerID="70f8ce8ef932dd68ac75529c220eded601980cf7b4cd79e76b8694cd3789a1e0" exitCode=0 Oct 09 15:31:29 crc kubenswrapper[4719]: I1009 15:31:29.632164 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mx9bg" event={"ID":"06db29ff-605f-4e2d-894c-b8380f5ae7b7","Type":"ContainerDied","Data":"70f8ce8ef932dd68ac75529c220eded601980cf7b4cd79e76b8694cd3789a1e0"} Oct 09 15:31:29 crc kubenswrapper[4719]: I1009 15:31:29.632187 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mx9bg" event={"ID":"06db29ff-605f-4e2d-894c-b8380f5ae7b7","Type":"ContainerStarted","Data":"3db432fccb4b23c546ca6ed5458270732208ba875c8033fbd9824b928a378da6"} Oct 09 15:31:29 crc kubenswrapper[4719]: I1009 15:31:29.797530 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-r8hz9" Oct 09 15:31:29 crc kubenswrapper[4719]: I1009 15:31:29.832448 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="metallb-system/frr-k8s-r8hz9" Oct 09 15:31:30 crc kubenswrapper[4719]: I1009 15:31:30.641559 4719 generic.go:334] "Generic (PLEG): container finished" podID="06db29ff-605f-4e2d-894c-b8380f5ae7b7" containerID="d816aa1498fef08288485e2371871047fb319da091d0b887ccdc3b385be3a2ec" exitCode=0 Oct 09 15:31:30 crc kubenswrapper[4719]: I1009 15:31:30.642991 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mx9bg" event={"ID":"06db29ff-605f-4e2d-894c-b8380f5ae7b7","Type":"ContainerDied","Data":"d816aa1498fef08288485e2371871047fb319da091d0b887ccdc3b385be3a2ec"} Oct 09 15:31:31 crc kubenswrapper[4719]: I1009 15:31:31.648488 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mx9bg" event={"ID":"06db29ff-605f-4e2d-894c-b8380f5ae7b7","Type":"ContainerStarted","Data":"2ff33d9d2ad90c1199e33464ee37a6ecb76cfd9d13fb0de0fe4bd0b393335f02"} Oct 09 15:31:31 crc kubenswrapper[4719]: I1009 15:31:31.664530 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mx9bg" podStartSLOduration=2.119143328 podStartE2EDuration="3.664510725s" podCreationTimestamp="2025-10-09 15:31:28 +0000 UTC" firstStartedPulling="2025-10-09 15:31:29.633943305 +0000 UTC m=+795.143654590" lastFinishedPulling="2025-10-09 15:31:31.179310692 +0000 UTC m=+796.689021987" observedRunningTime="2025-10-09 15:31:31.663597515 +0000 UTC m=+797.173308830" watchObservedRunningTime="2025-10-09 15:31:31.664510725 +0000 UTC m=+797.174222010" Oct 09 15:31:34 crc kubenswrapper[4719]: I1009 15:31:34.220633 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-wpvvd" Oct 09 15:31:34 crc kubenswrapper[4719]: I1009 15:31:34.800122 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-r8hz9" Oct 09 15:31:34 crc kubenswrapper[4719]: I1009 15:31:34.908807 
4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-42kr4" Oct 09 15:31:35 crc kubenswrapper[4719]: I1009 15:31:35.814078 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-9qhkk" Oct 09 15:31:36 crc kubenswrapper[4719]: I1009 15:31:36.976893 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 15:31:36 crc kubenswrapper[4719]: I1009 15:31:36.977253 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 15:31:38 crc kubenswrapper[4719]: I1009 15:31:38.760952 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-zmr4p"] Oct 09 15:31:38 crc kubenswrapper[4719]: I1009 15:31:38.761761 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-zmr4p" Oct 09 15:31:38 crc kubenswrapper[4719]: I1009 15:31:38.764231 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-9slxn" Oct 09 15:31:38 crc kubenswrapper[4719]: I1009 15:31:38.765762 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 09 15:31:38 crc kubenswrapper[4719]: I1009 15:31:38.767141 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 09 15:31:38 crc kubenswrapper[4719]: I1009 15:31:38.784612 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-zmr4p"] Oct 09 15:31:38 crc kubenswrapper[4719]: I1009 15:31:38.846514 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mx9bg" Oct 09 15:31:38 crc kubenswrapper[4719]: I1009 15:31:38.847540 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mx9bg" Oct 09 15:31:38 crc kubenswrapper[4719]: I1009 15:31:38.865994 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb2ll\" (UniqueName: \"kubernetes.io/projected/395419d4-0335-43f2-ba3c-a98568f97b69-kube-api-access-qb2ll\") pod \"openstack-operator-index-zmr4p\" (UID: \"395419d4-0335-43f2-ba3c-a98568f97b69\") " pod="openstack-operators/openstack-operator-index-zmr4p" Oct 09 15:31:38 crc kubenswrapper[4719]: I1009 15:31:38.891007 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mx9bg" Oct 09 15:31:38 crc kubenswrapper[4719]: I1009 15:31:38.967395 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb2ll\" (UniqueName: 
\"kubernetes.io/projected/395419d4-0335-43f2-ba3c-a98568f97b69-kube-api-access-qb2ll\") pod \"openstack-operator-index-zmr4p\" (UID: \"395419d4-0335-43f2-ba3c-a98568f97b69\") " pod="openstack-operators/openstack-operator-index-zmr4p" Oct 09 15:31:38 crc kubenswrapper[4719]: I1009 15:31:38.985753 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb2ll\" (UniqueName: \"kubernetes.io/projected/395419d4-0335-43f2-ba3c-a98568f97b69-kube-api-access-qb2ll\") pod \"openstack-operator-index-zmr4p\" (UID: \"395419d4-0335-43f2-ba3c-a98568f97b69\") " pod="openstack-operators/openstack-operator-index-zmr4p" Oct 09 15:31:39 crc kubenswrapper[4719]: I1009 15:31:39.080955 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-zmr4p" Oct 09 15:31:39 crc kubenswrapper[4719]: I1009 15:31:39.468651 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-zmr4p"] Oct 09 15:31:39 crc kubenswrapper[4719]: W1009 15:31:39.475181 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod395419d4_0335_43f2_ba3c_a98568f97b69.slice/crio-a697e8ce01cd823b0d3b8053b298e065d9064c65c8fbb1e6b25817674e08a49a WatchSource:0}: Error finding container a697e8ce01cd823b0d3b8053b298e065d9064c65c8fbb1e6b25817674e08a49a: Status 404 returned error can't find the container with id a697e8ce01cd823b0d3b8053b298e065d9064c65c8fbb1e6b25817674e08a49a Oct 09 15:31:39 crc kubenswrapper[4719]: I1009 15:31:39.694888 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zmr4p" event={"ID":"395419d4-0335-43f2-ba3c-a98568f97b69","Type":"ContainerStarted","Data":"a697e8ce01cd823b0d3b8053b298e065d9064c65c8fbb1e6b25817674e08a49a"} Oct 09 15:31:39 crc kubenswrapper[4719]: I1009 15:31:39.737151 4719 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mx9bg" Oct 09 15:31:41 crc kubenswrapper[4719]: I1009 15:31:41.711432 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zmr4p" event={"ID":"395419d4-0335-43f2-ba3c-a98568f97b69","Type":"ContainerStarted","Data":"eb6710427f519d276d174a01b7a131365e8ea15c0b94a5508830d7977bc56217"} Oct 09 15:31:41 crc kubenswrapper[4719]: I1009 15:31:41.738860 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-zmr4p" podStartSLOduration=1.985935307 podStartE2EDuration="3.738838129s" podCreationTimestamp="2025-10-09 15:31:38 +0000 UTC" firstStartedPulling="2025-10-09 15:31:39.477638637 +0000 UTC m=+804.987349922" lastFinishedPulling="2025-10-09 15:31:41.230541459 +0000 UTC m=+806.740252744" observedRunningTime="2025-10-09 15:31:41.724706817 +0000 UTC m=+807.234418112" watchObservedRunningTime="2025-10-09 15:31:41.738838129 +0000 UTC m=+807.248549414" Oct 09 15:31:42 crc kubenswrapper[4719]: I1009 15:31:42.336650 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-zmr4p"] Oct 09 15:31:43 crc kubenswrapper[4719]: I1009 15:31:43.146445 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-cmjrg"] Oct 09 15:31:43 crc kubenswrapper[4719]: I1009 15:31:43.147415 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-cmjrg" Oct 09 15:31:43 crc kubenswrapper[4719]: I1009 15:31:43.152908 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-cmjrg"] Oct 09 15:31:43 crc kubenswrapper[4719]: I1009 15:31:43.322140 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdrr9\" (UniqueName: \"kubernetes.io/projected/21fdeec4-a518-4f1f-a27d-50d49e078d3d-kube-api-access-qdrr9\") pod \"openstack-operator-index-cmjrg\" (UID: \"21fdeec4-a518-4f1f-a27d-50d49e078d3d\") " pod="openstack-operators/openstack-operator-index-cmjrg" Oct 09 15:31:43 crc kubenswrapper[4719]: I1009 15:31:43.338507 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mx9bg"] Oct 09 15:31:43 crc kubenswrapper[4719]: I1009 15:31:43.338718 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mx9bg" podUID="06db29ff-605f-4e2d-894c-b8380f5ae7b7" containerName="registry-server" containerID="cri-o://2ff33d9d2ad90c1199e33464ee37a6ecb76cfd9d13fb0de0fe4bd0b393335f02" gracePeriod=2 Oct 09 15:31:43 crc kubenswrapper[4719]: I1009 15:31:43.423032 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdrr9\" (UniqueName: \"kubernetes.io/projected/21fdeec4-a518-4f1f-a27d-50d49e078d3d-kube-api-access-qdrr9\") pod \"openstack-operator-index-cmjrg\" (UID: \"21fdeec4-a518-4f1f-a27d-50d49e078d3d\") " pod="openstack-operators/openstack-operator-index-cmjrg" Oct 09 15:31:43 crc kubenswrapper[4719]: I1009 15:31:43.440505 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdrr9\" (UniqueName: \"kubernetes.io/projected/21fdeec4-a518-4f1f-a27d-50d49e078d3d-kube-api-access-qdrr9\") pod \"openstack-operator-index-cmjrg\" (UID: 
\"21fdeec4-a518-4f1f-a27d-50d49e078d3d\") " pod="openstack-operators/openstack-operator-index-cmjrg" Oct 09 15:31:43 crc kubenswrapper[4719]: I1009 15:31:43.467088 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-cmjrg" Oct 09 15:31:43 crc kubenswrapper[4719]: I1009 15:31:43.705203 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mx9bg" Oct 09 15:31:43 crc kubenswrapper[4719]: I1009 15:31:43.734090 4719 generic.go:334] "Generic (PLEG): container finished" podID="06db29ff-605f-4e2d-894c-b8380f5ae7b7" containerID="2ff33d9d2ad90c1199e33464ee37a6ecb76cfd9d13fb0de0fe4bd0b393335f02" exitCode=0 Oct 09 15:31:43 crc kubenswrapper[4719]: I1009 15:31:43.734229 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-zmr4p" podUID="395419d4-0335-43f2-ba3c-a98568f97b69" containerName="registry-server" containerID="cri-o://eb6710427f519d276d174a01b7a131365e8ea15c0b94a5508830d7977bc56217" gracePeriod=2 Oct 09 15:31:43 crc kubenswrapper[4719]: I1009 15:31:43.734588 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mx9bg" Oct 09 15:31:43 crc kubenswrapper[4719]: I1009 15:31:43.734820 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mx9bg" event={"ID":"06db29ff-605f-4e2d-894c-b8380f5ae7b7","Type":"ContainerDied","Data":"2ff33d9d2ad90c1199e33464ee37a6ecb76cfd9d13fb0de0fe4bd0b393335f02"} Oct 09 15:31:43 crc kubenswrapper[4719]: I1009 15:31:43.734842 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mx9bg" event={"ID":"06db29ff-605f-4e2d-894c-b8380f5ae7b7","Type":"ContainerDied","Data":"3db432fccb4b23c546ca6ed5458270732208ba875c8033fbd9824b928a378da6"} Oct 09 15:31:43 crc kubenswrapper[4719]: I1009 15:31:43.734859 4719 scope.go:117] "RemoveContainer" containerID="2ff33d9d2ad90c1199e33464ee37a6ecb76cfd9d13fb0de0fe4bd0b393335f02" Oct 09 15:31:43 crc kubenswrapper[4719]: I1009 15:31:43.753150 4719 scope.go:117] "RemoveContainer" containerID="d816aa1498fef08288485e2371871047fb319da091d0b887ccdc3b385be3a2ec" Oct 09 15:31:43 crc kubenswrapper[4719]: I1009 15:31:43.828317 4719 scope.go:117] "RemoveContainer" containerID="70f8ce8ef932dd68ac75529c220eded601980cf7b4cd79e76b8694cd3789a1e0" Oct 09 15:31:43 crc kubenswrapper[4719]: I1009 15:31:43.828537 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06db29ff-605f-4e2d-894c-b8380f5ae7b7-utilities\") pod \"06db29ff-605f-4e2d-894c-b8380f5ae7b7\" (UID: \"06db29ff-605f-4e2d-894c-b8380f5ae7b7\") " Oct 09 15:31:43 crc kubenswrapper[4719]: I1009 15:31:43.828615 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06db29ff-605f-4e2d-894c-b8380f5ae7b7-catalog-content\") pod \"06db29ff-605f-4e2d-894c-b8380f5ae7b7\" (UID: \"06db29ff-605f-4e2d-894c-b8380f5ae7b7\") " Oct 09 15:31:43 crc kubenswrapper[4719]: 
I1009 15:31:43.828652 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7ln2\" (UniqueName: \"kubernetes.io/projected/06db29ff-605f-4e2d-894c-b8380f5ae7b7-kube-api-access-k7ln2\") pod \"06db29ff-605f-4e2d-894c-b8380f5ae7b7\" (UID: \"06db29ff-605f-4e2d-894c-b8380f5ae7b7\") " Oct 09 15:31:43 crc kubenswrapper[4719]: I1009 15:31:43.829890 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06db29ff-605f-4e2d-894c-b8380f5ae7b7-utilities" (OuterVolumeSpecName: "utilities") pod "06db29ff-605f-4e2d-894c-b8380f5ae7b7" (UID: "06db29ff-605f-4e2d-894c-b8380f5ae7b7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:31:43 crc kubenswrapper[4719]: I1009 15:31:43.832498 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06db29ff-605f-4e2d-894c-b8380f5ae7b7-kube-api-access-k7ln2" (OuterVolumeSpecName: "kube-api-access-k7ln2") pod "06db29ff-605f-4e2d-894c-b8380f5ae7b7" (UID: "06db29ff-605f-4e2d-894c-b8380f5ae7b7"). InnerVolumeSpecName "kube-api-access-k7ln2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:31:43 crc kubenswrapper[4719]: I1009 15:31:43.841761 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06db29ff-605f-4e2d-894c-b8380f5ae7b7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "06db29ff-605f-4e2d-894c-b8380f5ae7b7" (UID: "06db29ff-605f-4e2d-894c-b8380f5ae7b7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:31:43 crc kubenswrapper[4719]: I1009 15:31:43.842137 4719 scope.go:117] "RemoveContainer" containerID="2ff33d9d2ad90c1199e33464ee37a6ecb76cfd9d13fb0de0fe4bd0b393335f02" Oct 09 15:31:43 crc kubenswrapper[4719]: E1009 15:31:43.843574 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ff33d9d2ad90c1199e33464ee37a6ecb76cfd9d13fb0de0fe4bd0b393335f02\": container with ID starting with 2ff33d9d2ad90c1199e33464ee37a6ecb76cfd9d13fb0de0fe4bd0b393335f02 not found: ID does not exist" containerID="2ff33d9d2ad90c1199e33464ee37a6ecb76cfd9d13fb0de0fe4bd0b393335f02" Oct 09 15:31:43 crc kubenswrapper[4719]: I1009 15:31:43.843619 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ff33d9d2ad90c1199e33464ee37a6ecb76cfd9d13fb0de0fe4bd0b393335f02"} err="failed to get container status \"2ff33d9d2ad90c1199e33464ee37a6ecb76cfd9d13fb0de0fe4bd0b393335f02\": rpc error: code = NotFound desc = could not find container \"2ff33d9d2ad90c1199e33464ee37a6ecb76cfd9d13fb0de0fe4bd0b393335f02\": container with ID starting with 2ff33d9d2ad90c1199e33464ee37a6ecb76cfd9d13fb0de0fe4bd0b393335f02 not found: ID does not exist" Oct 09 15:31:43 crc kubenswrapper[4719]: I1009 15:31:43.843644 4719 scope.go:117] "RemoveContainer" containerID="d816aa1498fef08288485e2371871047fb319da091d0b887ccdc3b385be3a2ec" Oct 09 15:31:43 crc kubenswrapper[4719]: E1009 15:31:43.844052 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d816aa1498fef08288485e2371871047fb319da091d0b887ccdc3b385be3a2ec\": container with ID starting with d816aa1498fef08288485e2371871047fb319da091d0b887ccdc3b385be3a2ec not found: ID does not exist" containerID="d816aa1498fef08288485e2371871047fb319da091d0b887ccdc3b385be3a2ec" Oct 09 15:31:43 crc kubenswrapper[4719]: I1009 15:31:43.844080 
4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d816aa1498fef08288485e2371871047fb319da091d0b887ccdc3b385be3a2ec"} err="failed to get container status \"d816aa1498fef08288485e2371871047fb319da091d0b887ccdc3b385be3a2ec\": rpc error: code = NotFound desc = could not find container \"d816aa1498fef08288485e2371871047fb319da091d0b887ccdc3b385be3a2ec\": container with ID starting with d816aa1498fef08288485e2371871047fb319da091d0b887ccdc3b385be3a2ec not found: ID does not exist" Oct 09 15:31:43 crc kubenswrapper[4719]: I1009 15:31:43.844098 4719 scope.go:117] "RemoveContainer" containerID="70f8ce8ef932dd68ac75529c220eded601980cf7b4cd79e76b8694cd3789a1e0" Oct 09 15:31:43 crc kubenswrapper[4719]: E1009 15:31:43.844372 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70f8ce8ef932dd68ac75529c220eded601980cf7b4cd79e76b8694cd3789a1e0\": container with ID starting with 70f8ce8ef932dd68ac75529c220eded601980cf7b4cd79e76b8694cd3789a1e0 not found: ID does not exist" containerID="70f8ce8ef932dd68ac75529c220eded601980cf7b4cd79e76b8694cd3789a1e0" Oct 09 15:31:43 crc kubenswrapper[4719]: I1009 15:31:43.844399 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70f8ce8ef932dd68ac75529c220eded601980cf7b4cd79e76b8694cd3789a1e0"} err="failed to get container status \"70f8ce8ef932dd68ac75529c220eded601980cf7b4cd79e76b8694cd3789a1e0\": rpc error: code = NotFound desc = could not find container \"70f8ce8ef932dd68ac75529c220eded601980cf7b4cd79e76b8694cd3789a1e0\": container with ID starting with 70f8ce8ef932dd68ac75529c220eded601980cf7b4cd79e76b8694cd3789a1e0 not found: ID does not exist" Oct 09 15:31:43 crc kubenswrapper[4719]: I1009 15:31:43.879887 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-cmjrg"] Oct 09 15:31:43 crc kubenswrapper[4719]: W1009 
15:31:43.886823 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21fdeec4_a518_4f1f_a27d_50d49e078d3d.slice/crio-0308f8ca13c22cf976daa074ace9d864c16dc1e31a67967c9bb9b93a9efbdddf WatchSource:0}: Error finding container 0308f8ca13c22cf976daa074ace9d864c16dc1e31a67967c9bb9b93a9efbdddf: Status 404 returned error can't find the container with id 0308f8ca13c22cf976daa074ace9d864c16dc1e31a67967c9bb9b93a9efbdddf Oct 09 15:31:43 crc kubenswrapper[4719]: I1009 15:31:43.933666 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06db29ff-605f-4e2d-894c-b8380f5ae7b7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 15:31:43 crc kubenswrapper[4719]: I1009 15:31:43.933707 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7ln2\" (UniqueName: \"kubernetes.io/projected/06db29ff-605f-4e2d-894c-b8380f5ae7b7-kube-api-access-k7ln2\") on node \"crc\" DevicePath \"\"" Oct 09 15:31:43 crc kubenswrapper[4719]: I1009 15:31:43.933721 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06db29ff-605f-4e2d-894c-b8380f5ae7b7-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 15:31:44 crc kubenswrapper[4719]: I1009 15:31:44.073580 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mx9bg"] Oct 09 15:31:44 crc kubenswrapper[4719]: I1009 15:31:44.074922 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-zmr4p" Oct 09 15:31:44 crc kubenswrapper[4719]: I1009 15:31:44.083242 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mx9bg"] Oct 09 15:31:44 crc kubenswrapper[4719]: I1009 15:31:44.237484 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qb2ll\" (UniqueName: \"kubernetes.io/projected/395419d4-0335-43f2-ba3c-a98568f97b69-kube-api-access-qb2ll\") pod \"395419d4-0335-43f2-ba3c-a98568f97b69\" (UID: \"395419d4-0335-43f2-ba3c-a98568f97b69\") " Oct 09 15:31:44 crc kubenswrapper[4719]: I1009 15:31:44.241335 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/395419d4-0335-43f2-ba3c-a98568f97b69-kube-api-access-qb2ll" (OuterVolumeSpecName: "kube-api-access-qb2ll") pod "395419d4-0335-43f2-ba3c-a98568f97b69" (UID: "395419d4-0335-43f2-ba3c-a98568f97b69"). InnerVolumeSpecName "kube-api-access-qb2ll". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:31:44 crc kubenswrapper[4719]: I1009 15:31:44.340605 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qb2ll\" (UniqueName: \"kubernetes.io/projected/395419d4-0335-43f2-ba3c-a98568f97b69-kube-api-access-qb2ll\") on node \"crc\" DevicePath \"\"" Oct 09 15:31:44 crc kubenswrapper[4719]: I1009 15:31:44.743176 4719 generic.go:334] "Generic (PLEG): container finished" podID="395419d4-0335-43f2-ba3c-a98568f97b69" containerID="eb6710427f519d276d174a01b7a131365e8ea15c0b94a5508830d7977bc56217" exitCode=0 Oct 09 15:31:44 crc kubenswrapper[4719]: I1009 15:31:44.743244 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zmr4p" event={"ID":"395419d4-0335-43f2-ba3c-a98568f97b69","Type":"ContainerDied","Data":"eb6710427f519d276d174a01b7a131365e8ea15c0b94a5508830d7977bc56217"} Oct 09 15:31:44 crc kubenswrapper[4719]: I1009 15:31:44.743275 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zmr4p" event={"ID":"395419d4-0335-43f2-ba3c-a98568f97b69","Type":"ContainerDied","Data":"a697e8ce01cd823b0d3b8053b298e065d9064c65c8fbb1e6b25817674e08a49a"} Oct 09 15:31:44 crc kubenswrapper[4719]: I1009 15:31:44.743297 4719 scope.go:117] "RemoveContainer" containerID="eb6710427f519d276d174a01b7a131365e8ea15c0b94a5508830d7977bc56217" Oct 09 15:31:44 crc kubenswrapper[4719]: I1009 15:31:44.743401 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-zmr4p" Oct 09 15:31:44 crc kubenswrapper[4719]: I1009 15:31:44.745296 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-cmjrg" event={"ID":"21fdeec4-a518-4f1f-a27d-50d49e078d3d","Type":"ContainerStarted","Data":"45d73ccc41975ad1a6d37337b4ca5baffde399af88264e30010ad1b5d1c7836d"} Oct 09 15:31:44 crc kubenswrapper[4719]: I1009 15:31:44.745379 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-cmjrg" event={"ID":"21fdeec4-a518-4f1f-a27d-50d49e078d3d","Type":"ContainerStarted","Data":"0308f8ca13c22cf976daa074ace9d864c16dc1e31a67967c9bb9b93a9efbdddf"} Oct 09 15:31:44 crc kubenswrapper[4719]: I1009 15:31:44.762633 4719 scope.go:117] "RemoveContainer" containerID="eb6710427f519d276d174a01b7a131365e8ea15c0b94a5508830d7977bc56217" Oct 09 15:31:44 crc kubenswrapper[4719]: E1009 15:31:44.764209 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb6710427f519d276d174a01b7a131365e8ea15c0b94a5508830d7977bc56217\": container with ID starting with eb6710427f519d276d174a01b7a131365e8ea15c0b94a5508830d7977bc56217 not found: ID does not exist" containerID="eb6710427f519d276d174a01b7a131365e8ea15c0b94a5508830d7977bc56217" Oct 09 15:31:44 crc kubenswrapper[4719]: I1009 15:31:44.764251 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb6710427f519d276d174a01b7a131365e8ea15c0b94a5508830d7977bc56217"} err="failed to get container status \"eb6710427f519d276d174a01b7a131365e8ea15c0b94a5508830d7977bc56217\": rpc error: code = NotFound desc = could not find container \"eb6710427f519d276d174a01b7a131365e8ea15c0b94a5508830d7977bc56217\": container with ID starting with eb6710427f519d276d174a01b7a131365e8ea15c0b94a5508830d7977bc56217 not found: ID does not exist" Oct 09 15:31:44 crc 
kubenswrapper[4719]: I1009 15:31:44.825681 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-cmjrg" podStartSLOduration=1.765510684 podStartE2EDuration="1.825657987s" podCreationTimestamp="2025-10-09 15:31:43 +0000 UTC" firstStartedPulling="2025-10-09 15:31:43.891314155 +0000 UTC m=+809.401025440" lastFinishedPulling="2025-10-09 15:31:43.951461458 +0000 UTC m=+809.461172743" observedRunningTime="2025-10-09 15:31:44.762708394 +0000 UTC m=+810.272419679" watchObservedRunningTime="2025-10-09 15:31:44.825657987 +0000 UTC m=+810.335369262" Oct 09 15:31:44 crc kubenswrapper[4719]: I1009 15:31:44.826919 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-zmr4p"] Oct 09 15:31:44 crc kubenswrapper[4719]: I1009 15:31:44.831697 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-zmr4p"] Oct 09 15:31:45 crc kubenswrapper[4719]: I1009 15:31:45.170685 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06db29ff-605f-4e2d-894c-b8380f5ae7b7" path="/var/lib/kubelet/pods/06db29ff-605f-4e2d-894c-b8380f5ae7b7/volumes" Oct 09 15:31:45 crc kubenswrapper[4719]: I1009 15:31:45.171719 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="395419d4-0335-43f2-ba3c-a98568f97b69" path="/var/lib/kubelet/pods/395419d4-0335-43f2-ba3c-a98568f97b69/volumes" Oct 09 15:31:53 crc kubenswrapper[4719]: I1009 15:31:53.467198 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-cmjrg" Oct 09 15:31:53 crc kubenswrapper[4719]: I1009 15:31:53.467710 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-cmjrg" Oct 09 15:31:53 crc kubenswrapper[4719]: I1009 15:31:53.492619 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack-operators/openstack-operator-index-cmjrg" Oct 09 15:31:53 crc kubenswrapper[4719]: I1009 15:31:53.837836 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-cmjrg" Oct 09 15:31:54 crc kubenswrapper[4719]: I1009 15:31:54.774002 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt"] Oct 09 15:31:54 crc kubenswrapper[4719]: E1009 15:31:54.774228 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06db29ff-605f-4e2d-894c-b8380f5ae7b7" containerName="extract-content" Oct 09 15:31:54 crc kubenswrapper[4719]: I1009 15:31:54.774239 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="06db29ff-605f-4e2d-894c-b8380f5ae7b7" containerName="extract-content" Oct 09 15:31:54 crc kubenswrapper[4719]: E1009 15:31:54.774256 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06db29ff-605f-4e2d-894c-b8380f5ae7b7" containerName="extract-utilities" Oct 09 15:31:54 crc kubenswrapper[4719]: I1009 15:31:54.774261 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="06db29ff-605f-4e2d-894c-b8380f5ae7b7" containerName="extract-utilities" Oct 09 15:31:54 crc kubenswrapper[4719]: E1009 15:31:54.774278 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="395419d4-0335-43f2-ba3c-a98568f97b69" containerName="registry-server" Oct 09 15:31:54 crc kubenswrapper[4719]: I1009 15:31:54.774283 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="395419d4-0335-43f2-ba3c-a98568f97b69" containerName="registry-server" Oct 09 15:31:54 crc kubenswrapper[4719]: E1009 15:31:54.774292 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06db29ff-605f-4e2d-894c-b8380f5ae7b7" containerName="registry-server" Oct 09 15:31:54 crc kubenswrapper[4719]: I1009 15:31:54.774298 4719 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="06db29ff-605f-4e2d-894c-b8380f5ae7b7" containerName="registry-server" Oct 09 15:31:54 crc kubenswrapper[4719]: I1009 15:31:54.774436 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="395419d4-0335-43f2-ba3c-a98568f97b69" containerName="registry-server" Oct 09 15:31:54 crc kubenswrapper[4719]: I1009 15:31:54.774454 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="06db29ff-605f-4e2d-894c-b8380f5ae7b7" containerName="registry-server" Oct 09 15:31:54 crc kubenswrapper[4719]: I1009 15:31:54.787542 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt" Oct 09 15:31:54 crc kubenswrapper[4719]: I1009 15:31:54.790066 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-ncbgm" Oct 09 15:31:54 crc kubenswrapper[4719]: I1009 15:31:54.799181 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt"] Oct 09 15:31:54 crc kubenswrapper[4719]: I1009 15:31:54.888689 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57fba553-2ed7-4d57-94af-f8322ebd87d3-util\") pod \"748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt\" (UID: \"57fba553-2ed7-4d57-94af-f8322ebd87d3\") " pod="openstack-operators/748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt" Oct 09 15:31:54 crc kubenswrapper[4719]: I1009 15:31:54.889343 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57fba553-2ed7-4d57-94af-f8322ebd87d3-bundle\") pod \"748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt\" (UID: \"57fba553-2ed7-4d57-94af-f8322ebd87d3\") " 
pod="openstack-operators/748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt" Oct 09 15:31:54 crc kubenswrapper[4719]: I1009 15:31:54.889394 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwfj4\" (UniqueName: \"kubernetes.io/projected/57fba553-2ed7-4d57-94af-f8322ebd87d3-kube-api-access-dwfj4\") pod \"748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt\" (UID: \"57fba553-2ed7-4d57-94af-f8322ebd87d3\") " pod="openstack-operators/748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt" Oct 09 15:31:54 crc kubenswrapper[4719]: I1009 15:31:54.991785 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwfj4\" (UniqueName: \"kubernetes.io/projected/57fba553-2ed7-4d57-94af-f8322ebd87d3-kube-api-access-dwfj4\") pod \"748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt\" (UID: \"57fba553-2ed7-4d57-94af-f8322ebd87d3\") " pod="openstack-operators/748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt" Oct 09 15:31:54 crc kubenswrapper[4719]: I1009 15:31:54.992015 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57fba553-2ed7-4d57-94af-f8322ebd87d3-util\") pod \"748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt\" (UID: \"57fba553-2ed7-4d57-94af-f8322ebd87d3\") " pod="openstack-operators/748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt" Oct 09 15:31:54 crc kubenswrapper[4719]: I1009 15:31:54.992150 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57fba553-2ed7-4d57-94af-f8322ebd87d3-bundle\") pod \"748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt\" (UID: \"57fba553-2ed7-4d57-94af-f8322ebd87d3\") " pod="openstack-operators/748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt" Oct 09 15:31:54 crc 
kubenswrapper[4719]: I1009 15:31:54.992511 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57fba553-2ed7-4d57-94af-f8322ebd87d3-util\") pod \"748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt\" (UID: \"57fba553-2ed7-4d57-94af-f8322ebd87d3\") " pod="openstack-operators/748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt" Oct 09 15:31:54 crc kubenswrapper[4719]: I1009 15:31:54.992588 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57fba553-2ed7-4d57-94af-f8322ebd87d3-bundle\") pod \"748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt\" (UID: \"57fba553-2ed7-4d57-94af-f8322ebd87d3\") " pod="openstack-operators/748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt" Oct 09 15:31:55 crc kubenswrapper[4719]: I1009 15:31:55.015886 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwfj4\" (UniqueName: \"kubernetes.io/projected/57fba553-2ed7-4d57-94af-f8322ebd87d3-kube-api-access-dwfj4\") pod \"748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt\" (UID: \"57fba553-2ed7-4d57-94af-f8322ebd87d3\") " pod="openstack-operators/748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt" Oct 09 15:31:55 crc kubenswrapper[4719]: I1009 15:31:55.118467 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt" Oct 09 15:31:55 crc kubenswrapper[4719]: I1009 15:31:55.491098 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt"] Oct 09 15:31:55 crc kubenswrapper[4719]: I1009 15:31:55.816402 4719 generic.go:334] "Generic (PLEG): container finished" podID="57fba553-2ed7-4d57-94af-f8322ebd87d3" containerID="8aec06cf521ec909313846ac4f19e53fd33bf4fa99105470a5520f6a576cddfa" exitCode=0 Oct 09 15:31:55 crc kubenswrapper[4719]: I1009 15:31:55.816442 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt" event={"ID":"57fba553-2ed7-4d57-94af-f8322ebd87d3","Type":"ContainerDied","Data":"8aec06cf521ec909313846ac4f19e53fd33bf4fa99105470a5520f6a576cddfa"} Oct 09 15:31:55 crc kubenswrapper[4719]: I1009 15:31:55.816466 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt" event={"ID":"57fba553-2ed7-4d57-94af-f8322ebd87d3","Type":"ContainerStarted","Data":"46b0d0f0a53ae7fe17fcf6f1dee0d8efe105ef032c13bf43f8313777952aa64a"} Oct 09 15:31:56 crc kubenswrapper[4719]: I1009 15:31:56.146724 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nf2m6"] Oct 09 15:31:56 crc kubenswrapper[4719]: I1009 15:31:56.148922 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nf2m6" Oct 09 15:31:56 crc kubenswrapper[4719]: I1009 15:31:56.161953 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nf2m6"] Oct 09 15:31:56 crc kubenswrapper[4719]: I1009 15:31:56.308611 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d395aa34-bb99-47c7-b1ce-a94d37ea4ad8-utilities\") pod \"redhat-operators-nf2m6\" (UID: \"d395aa34-bb99-47c7-b1ce-a94d37ea4ad8\") " pod="openshift-marketplace/redhat-operators-nf2m6" Oct 09 15:31:56 crc kubenswrapper[4719]: I1009 15:31:56.308659 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjc7d\" (UniqueName: \"kubernetes.io/projected/d395aa34-bb99-47c7-b1ce-a94d37ea4ad8-kube-api-access-xjc7d\") pod \"redhat-operators-nf2m6\" (UID: \"d395aa34-bb99-47c7-b1ce-a94d37ea4ad8\") " pod="openshift-marketplace/redhat-operators-nf2m6" Oct 09 15:31:56 crc kubenswrapper[4719]: I1009 15:31:56.308794 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d395aa34-bb99-47c7-b1ce-a94d37ea4ad8-catalog-content\") pod \"redhat-operators-nf2m6\" (UID: \"d395aa34-bb99-47c7-b1ce-a94d37ea4ad8\") " pod="openshift-marketplace/redhat-operators-nf2m6" Oct 09 15:31:56 crc kubenswrapper[4719]: I1009 15:31:56.409919 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d395aa34-bb99-47c7-b1ce-a94d37ea4ad8-utilities\") pod \"redhat-operators-nf2m6\" (UID: \"d395aa34-bb99-47c7-b1ce-a94d37ea4ad8\") " pod="openshift-marketplace/redhat-operators-nf2m6" Oct 09 15:31:56 crc kubenswrapper[4719]: I1009 15:31:56.410096 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xjc7d\" (UniqueName: \"kubernetes.io/projected/d395aa34-bb99-47c7-b1ce-a94d37ea4ad8-kube-api-access-xjc7d\") pod \"redhat-operators-nf2m6\" (UID: \"d395aa34-bb99-47c7-b1ce-a94d37ea4ad8\") " pod="openshift-marketplace/redhat-operators-nf2m6" Oct 09 15:31:56 crc kubenswrapper[4719]: I1009 15:31:56.410265 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d395aa34-bb99-47c7-b1ce-a94d37ea4ad8-catalog-content\") pod \"redhat-operators-nf2m6\" (UID: \"d395aa34-bb99-47c7-b1ce-a94d37ea4ad8\") " pod="openshift-marketplace/redhat-operators-nf2m6" Oct 09 15:31:56 crc kubenswrapper[4719]: I1009 15:31:56.410625 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d395aa34-bb99-47c7-b1ce-a94d37ea4ad8-utilities\") pod \"redhat-operators-nf2m6\" (UID: \"d395aa34-bb99-47c7-b1ce-a94d37ea4ad8\") " pod="openshift-marketplace/redhat-operators-nf2m6" Oct 09 15:31:56 crc kubenswrapper[4719]: I1009 15:31:56.410750 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d395aa34-bb99-47c7-b1ce-a94d37ea4ad8-catalog-content\") pod \"redhat-operators-nf2m6\" (UID: \"d395aa34-bb99-47c7-b1ce-a94d37ea4ad8\") " pod="openshift-marketplace/redhat-operators-nf2m6" Oct 09 15:31:56 crc kubenswrapper[4719]: I1009 15:31:56.432773 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjc7d\" (UniqueName: \"kubernetes.io/projected/d395aa34-bb99-47c7-b1ce-a94d37ea4ad8-kube-api-access-xjc7d\") pod \"redhat-operators-nf2m6\" (UID: \"d395aa34-bb99-47c7-b1ce-a94d37ea4ad8\") " pod="openshift-marketplace/redhat-operators-nf2m6" Oct 09 15:31:56 crc kubenswrapper[4719]: I1009 15:31:56.472606 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nf2m6" Oct 09 15:31:56 crc kubenswrapper[4719]: I1009 15:31:56.823578 4719 generic.go:334] "Generic (PLEG): container finished" podID="57fba553-2ed7-4d57-94af-f8322ebd87d3" containerID="1dfb7e5334b0b21babd5179aca8b5dd7285588134726faf37117b3cb757efc08" exitCode=0 Oct 09 15:31:56 crc kubenswrapper[4719]: I1009 15:31:56.823940 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt" event={"ID":"57fba553-2ed7-4d57-94af-f8322ebd87d3","Type":"ContainerDied","Data":"1dfb7e5334b0b21babd5179aca8b5dd7285588134726faf37117b3cb757efc08"} Oct 09 15:31:56 crc kubenswrapper[4719]: I1009 15:31:56.949223 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nf2m6"] Oct 09 15:31:56 crc kubenswrapper[4719]: W1009 15:31:56.975214 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd395aa34_bb99_47c7_b1ce_a94d37ea4ad8.slice/crio-3b70176e622c69920a8b6a89c56e6f576e72e956ed056a1349862859666de2f1 WatchSource:0}: Error finding container 3b70176e622c69920a8b6a89c56e6f576e72e956ed056a1349862859666de2f1: Status 404 returned error can't find the container with id 3b70176e622c69920a8b6a89c56e6f576e72e956ed056a1349862859666de2f1 Oct 09 15:31:57 crc kubenswrapper[4719]: I1009 15:31:57.831164 4719 generic.go:334] "Generic (PLEG): container finished" podID="57fba553-2ed7-4d57-94af-f8322ebd87d3" containerID="803e3ce923a26c8e7148a7d40e4478e15ff97a476b6bd3a5abe4a2e2b597c85e" exitCode=0 Oct 09 15:31:57 crc kubenswrapper[4719]: I1009 15:31:57.831241 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt" 
event={"ID":"57fba553-2ed7-4d57-94af-f8322ebd87d3","Type":"ContainerDied","Data":"803e3ce923a26c8e7148a7d40e4478e15ff97a476b6bd3a5abe4a2e2b597c85e"} Oct 09 15:31:57 crc kubenswrapper[4719]: I1009 15:31:57.832790 4719 generic.go:334] "Generic (PLEG): container finished" podID="d395aa34-bb99-47c7-b1ce-a94d37ea4ad8" containerID="3af30404b48d098343713d1cae402648545fb307b22dfedada728deadd8a5653" exitCode=0 Oct 09 15:31:57 crc kubenswrapper[4719]: I1009 15:31:57.832822 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nf2m6" event={"ID":"d395aa34-bb99-47c7-b1ce-a94d37ea4ad8","Type":"ContainerDied","Data":"3af30404b48d098343713d1cae402648545fb307b22dfedada728deadd8a5653"} Oct 09 15:31:57 crc kubenswrapper[4719]: I1009 15:31:57.832839 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nf2m6" event={"ID":"d395aa34-bb99-47c7-b1ce-a94d37ea4ad8","Type":"ContainerStarted","Data":"3b70176e622c69920a8b6a89c56e6f576e72e956ed056a1349862859666de2f1"} Oct 09 15:31:58 crc kubenswrapper[4719]: I1009 15:31:58.843020 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nf2m6" event={"ID":"d395aa34-bb99-47c7-b1ce-a94d37ea4ad8","Type":"ContainerStarted","Data":"a4f75dc933c7392d66e79300a25f1b7b8dda6d7d1c9d6d493b3d0a7a87c47d80"} Oct 09 15:31:59 crc kubenswrapper[4719]: I1009 15:31:59.140339 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt" Oct 09 15:31:59 crc kubenswrapper[4719]: I1009 15:31:59.244701 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57fba553-2ed7-4d57-94af-f8322ebd87d3-bundle\") pod \"57fba553-2ed7-4d57-94af-f8322ebd87d3\" (UID: \"57fba553-2ed7-4d57-94af-f8322ebd87d3\") " Oct 09 15:31:59 crc kubenswrapper[4719]: I1009 15:31:59.245036 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwfj4\" (UniqueName: \"kubernetes.io/projected/57fba553-2ed7-4d57-94af-f8322ebd87d3-kube-api-access-dwfj4\") pod \"57fba553-2ed7-4d57-94af-f8322ebd87d3\" (UID: \"57fba553-2ed7-4d57-94af-f8322ebd87d3\") " Oct 09 15:31:59 crc kubenswrapper[4719]: I1009 15:31:59.245145 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57fba553-2ed7-4d57-94af-f8322ebd87d3-util\") pod \"57fba553-2ed7-4d57-94af-f8322ebd87d3\" (UID: \"57fba553-2ed7-4d57-94af-f8322ebd87d3\") " Oct 09 15:31:59 crc kubenswrapper[4719]: I1009 15:31:59.245947 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57fba553-2ed7-4d57-94af-f8322ebd87d3-bundle" (OuterVolumeSpecName: "bundle") pod "57fba553-2ed7-4d57-94af-f8322ebd87d3" (UID: "57fba553-2ed7-4d57-94af-f8322ebd87d3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:31:59 crc kubenswrapper[4719]: I1009 15:31:59.254560 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57fba553-2ed7-4d57-94af-f8322ebd87d3-kube-api-access-dwfj4" (OuterVolumeSpecName: "kube-api-access-dwfj4") pod "57fba553-2ed7-4d57-94af-f8322ebd87d3" (UID: "57fba553-2ed7-4d57-94af-f8322ebd87d3"). InnerVolumeSpecName "kube-api-access-dwfj4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:31:59 crc kubenswrapper[4719]: I1009 15:31:59.264050 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57fba553-2ed7-4d57-94af-f8322ebd87d3-util" (OuterVolumeSpecName: "util") pod "57fba553-2ed7-4d57-94af-f8322ebd87d3" (UID: "57fba553-2ed7-4d57-94af-f8322ebd87d3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:31:59 crc kubenswrapper[4719]: I1009 15:31:59.346560 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwfj4\" (UniqueName: \"kubernetes.io/projected/57fba553-2ed7-4d57-94af-f8322ebd87d3-kube-api-access-dwfj4\") on node \"crc\" DevicePath \"\"" Oct 09 15:31:59 crc kubenswrapper[4719]: I1009 15:31:59.346591 4719 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57fba553-2ed7-4d57-94af-f8322ebd87d3-util\") on node \"crc\" DevicePath \"\"" Oct 09 15:31:59 crc kubenswrapper[4719]: I1009 15:31:59.346603 4719 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57fba553-2ed7-4d57-94af-f8322ebd87d3-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:31:59 crc kubenswrapper[4719]: I1009 15:31:59.853414 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt" Oct 09 15:31:59 crc kubenswrapper[4719]: I1009 15:31:59.853436 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt" event={"ID":"57fba553-2ed7-4d57-94af-f8322ebd87d3","Type":"ContainerDied","Data":"46b0d0f0a53ae7fe17fcf6f1dee0d8efe105ef032c13bf43f8313777952aa64a"} Oct 09 15:31:59 crc kubenswrapper[4719]: I1009 15:31:59.855315 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46b0d0f0a53ae7fe17fcf6f1dee0d8efe105ef032c13bf43f8313777952aa64a" Oct 09 15:31:59 crc kubenswrapper[4719]: I1009 15:31:59.857178 4719 generic.go:334] "Generic (PLEG): container finished" podID="d395aa34-bb99-47c7-b1ce-a94d37ea4ad8" containerID="a4f75dc933c7392d66e79300a25f1b7b8dda6d7d1c9d6d493b3d0a7a87c47d80" exitCode=0 Oct 09 15:31:59 crc kubenswrapper[4719]: I1009 15:31:59.857225 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nf2m6" event={"ID":"d395aa34-bb99-47c7-b1ce-a94d37ea4ad8","Type":"ContainerDied","Data":"a4f75dc933c7392d66e79300a25f1b7b8dda6d7d1c9d6d493b3d0a7a87c47d80"} Oct 09 15:32:00 crc kubenswrapper[4719]: I1009 15:32:00.864754 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nf2m6" event={"ID":"d395aa34-bb99-47c7-b1ce-a94d37ea4ad8","Type":"ContainerStarted","Data":"1af4f0cf25cbb792f69daa5c17db6fb061970733aaa1a8d702e1fbfcb680b4b9"} Oct 09 15:32:00 crc kubenswrapper[4719]: I1009 15:32:00.885208 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nf2m6" podStartSLOduration=2.413808231 podStartE2EDuration="4.885189504s" podCreationTimestamp="2025-10-09 15:31:56 +0000 UTC" firstStartedPulling="2025-10-09 15:31:57.83446829 +0000 UTC m=+823.344179575" 
lastFinishedPulling="2025-10-09 15:32:00.305849562 +0000 UTC m=+825.815560848" observedRunningTime="2025-10-09 15:32:00.881043152 +0000 UTC m=+826.390754447" watchObservedRunningTime="2025-10-09 15:32:00.885189504 +0000 UTC m=+826.394900799" Oct 09 15:32:05 crc kubenswrapper[4719]: I1009 15:32:05.939296 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6cf9d6bfd4-rw5j8"] Oct 09 15:32:05 crc kubenswrapper[4719]: E1009 15:32:05.939966 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57fba553-2ed7-4d57-94af-f8322ebd87d3" containerName="pull" Oct 09 15:32:05 crc kubenswrapper[4719]: I1009 15:32:05.939982 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="57fba553-2ed7-4d57-94af-f8322ebd87d3" containerName="pull" Oct 09 15:32:05 crc kubenswrapper[4719]: E1009 15:32:05.939993 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57fba553-2ed7-4d57-94af-f8322ebd87d3" containerName="util" Oct 09 15:32:05 crc kubenswrapper[4719]: I1009 15:32:05.940001 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="57fba553-2ed7-4d57-94af-f8322ebd87d3" containerName="util" Oct 09 15:32:05 crc kubenswrapper[4719]: E1009 15:32:05.940022 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57fba553-2ed7-4d57-94af-f8322ebd87d3" containerName="extract" Oct 09 15:32:05 crc kubenswrapper[4719]: I1009 15:32:05.940031 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="57fba553-2ed7-4d57-94af-f8322ebd87d3" containerName="extract" Oct 09 15:32:05 crc kubenswrapper[4719]: I1009 15:32:05.940202 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="57fba553-2ed7-4d57-94af-f8322ebd87d3" containerName="extract" Oct 09 15:32:05 crc kubenswrapper[4719]: I1009 15:32:05.941035 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6cf9d6bfd4-rw5j8" Oct 09 15:32:05 crc kubenswrapper[4719]: I1009 15:32:05.953774 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-5kf4x" Oct 09 15:32:05 crc kubenswrapper[4719]: I1009 15:32:05.993174 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6cf9d6bfd4-rw5j8"] Oct 09 15:32:06 crc kubenswrapper[4719]: I1009 15:32:06.138916 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xdms\" (UniqueName: \"kubernetes.io/projected/a64c087b-46a2-4c1b-abf9-ce21ce6f9688-kube-api-access-2xdms\") pod \"openstack-operator-controller-operator-6cf9d6bfd4-rw5j8\" (UID: \"a64c087b-46a2-4c1b-abf9-ce21ce6f9688\") " pod="openstack-operators/openstack-operator-controller-operator-6cf9d6bfd4-rw5j8" Oct 09 15:32:06 crc kubenswrapper[4719]: I1009 15:32:06.240596 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xdms\" (UniqueName: \"kubernetes.io/projected/a64c087b-46a2-4c1b-abf9-ce21ce6f9688-kube-api-access-2xdms\") pod \"openstack-operator-controller-operator-6cf9d6bfd4-rw5j8\" (UID: \"a64c087b-46a2-4c1b-abf9-ce21ce6f9688\") " pod="openstack-operators/openstack-operator-controller-operator-6cf9d6bfd4-rw5j8" Oct 09 15:32:06 crc kubenswrapper[4719]: I1009 15:32:06.261107 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xdms\" (UniqueName: \"kubernetes.io/projected/a64c087b-46a2-4c1b-abf9-ce21ce6f9688-kube-api-access-2xdms\") pod \"openstack-operator-controller-operator-6cf9d6bfd4-rw5j8\" (UID: \"a64c087b-46a2-4c1b-abf9-ce21ce6f9688\") " pod="openstack-operators/openstack-operator-controller-operator-6cf9d6bfd4-rw5j8" Oct 09 15:32:06 crc kubenswrapper[4719]: I1009 15:32:06.274449 4719 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6cf9d6bfd4-rw5j8" Oct 09 15:32:06 crc kubenswrapper[4719]: I1009 15:32:06.475627 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nf2m6" Oct 09 15:32:06 crc kubenswrapper[4719]: I1009 15:32:06.475978 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nf2m6" Oct 09 15:32:06 crc kubenswrapper[4719]: I1009 15:32:06.538986 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nf2m6" Oct 09 15:32:06 crc kubenswrapper[4719]: I1009 15:32:06.750839 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6cf9d6bfd4-rw5j8"] Oct 09 15:32:06 crc kubenswrapper[4719]: W1009 15:32:06.760493 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda64c087b_46a2_4c1b_abf9_ce21ce6f9688.slice/crio-3f34d192f19976b8b408c041c8c807dbb1f49f79e3b8f50ce11bd7bbbeef60b0 WatchSource:0}: Error finding container 3f34d192f19976b8b408c041c8c807dbb1f49f79e3b8f50ce11bd7bbbeef60b0: Status 404 returned error can't find the container with id 3f34d192f19976b8b408c041c8c807dbb1f49f79e3b8f50ce11bd7bbbeef60b0 Oct 09 15:32:06 crc kubenswrapper[4719]: I1009 15:32:06.901575 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6cf9d6bfd4-rw5j8" event={"ID":"a64c087b-46a2-4c1b-abf9-ce21ce6f9688","Type":"ContainerStarted","Data":"3f34d192f19976b8b408c041c8c807dbb1f49f79e3b8f50ce11bd7bbbeef60b0"} Oct 09 15:32:06 crc kubenswrapper[4719]: I1009 15:32:06.944241 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nf2m6" Oct 
09 15:32:06 crc kubenswrapper[4719]: I1009 15:32:06.977209 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 15:32:06 crc kubenswrapper[4719]: I1009 15:32:06.977284 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 15:32:06 crc kubenswrapper[4719]: I1009 15:32:06.977343 4719 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" Oct 09 15:32:06 crc kubenswrapper[4719]: I1009 15:32:06.978173 4719 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"68d8ab72b367a09fd501bf52a95e52e96b2dd8454309c2056f29b2264d60dcdd"} pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 15:32:06 crc kubenswrapper[4719]: I1009 15:32:06.978224 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" containerID="cri-o://68d8ab72b367a09fd501bf52a95e52e96b2dd8454309c2056f29b2264d60dcdd" gracePeriod=600 Oct 09 15:32:07 crc kubenswrapper[4719]: I1009 15:32:07.537987 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nf2m6"] Oct 09 15:32:07 crc kubenswrapper[4719]: I1009 
15:32:07.908996 4719 generic.go:334] "Generic (PLEG): container finished" podID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerID="68d8ab72b367a09fd501bf52a95e52e96b2dd8454309c2056f29b2264d60dcdd" exitCode=0 Oct 09 15:32:07 crc kubenswrapper[4719]: I1009 15:32:07.909062 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerDied","Data":"68d8ab72b367a09fd501bf52a95e52e96b2dd8454309c2056f29b2264d60dcdd"} Oct 09 15:32:07 crc kubenswrapper[4719]: I1009 15:32:07.909097 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerStarted","Data":"b3d6b70762c7cbd23b776b68a38ad2cc0ec2c06605dcc7efb9d87a0020d07dde"} Oct 09 15:32:07 crc kubenswrapper[4719]: I1009 15:32:07.909114 4719 scope.go:117] "RemoveContainer" containerID="8f170c1640fe33d3a488ada64feda2ff2ffd3c8d5fb1e790430375c1ffcc2527" Oct 09 15:32:08 crc kubenswrapper[4719]: I1009 15:32:08.915228 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nf2m6" podUID="d395aa34-bb99-47c7-b1ce-a94d37ea4ad8" containerName="registry-server" containerID="cri-o://1af4f0cf25cbb792f69daa5c17db6fb061970733aaa1a8d702e1fbfcb680b4b9" gracePeriod=2 Oct 09 15:32:09 crc kubenswrapper[4719]: I1009 15:32:09.923285 4719 generic.go:334] "Generic (PLEG): container finished" podID="d395aa34-bb99-47c7-b1ce-a94d37ea4ad8" containerID="1af4f0cf25cbb792f69daa5c17db6fb061970733aaa1a8d702e1fbfcb680b4b9" exitCode=0 Oct 09 15:32:09 crc kubenswrapper[4719]: I1009 15:32:09.923373 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nf2m6" 
event={"ID":"d395aa34-bb99-47c7-b1ce-a94d37ea4ad8","Type":"ContainerDied","Data":"1af4f0cf25cbb792f69daa5c17db6fb061970733aaa1a8d702e1fbfcb680b4b9"} Oct 09 15:32:10 crc kubenswrapper[4719]: I1009 15:32:10.212168 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nf2m6" Oct 09 15:32:10 crc kubenswrapper[4719]: I1009 15:32:10.310719 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjc7d\" (UniqueName: \"kubernetes.io/projected/d395aa34-bb99-47c7-b1ce-a94d37ea4ad8-kube-api-access-xjc7d\") pod \"d395aa34-bb99-47c7-b1ce-a94d37ea4ad8\" (UID: \"d395aa34-bb99-47c7-b1ce-a94d37ea4ad8\") " Oct 09 15:32:10 crc kubenswrapper[4719]: I1009 15:32:10.310802 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d395aa34-bb99-47c7-b1ce-a94d37ea4ad8-utilities\") pod \"d395aa34-bb99-47c7-b1ce-a94d37ea4ad8\" (UID: \"d395aa34-bb99-47c7-b1ce-a94d37ea4ad8\") " Oct 09 15:32:10 crc kubenswrapper[4719]: I1009 15:32:10.310832 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d395aa34-bb99-47c7-b1ce-a94d37ea4ad8-catalog-content\") pod \"d395aa34-bb99-47c7-b1ce-a94d37ea4ad8\" (UID: \"d395aa34-bb99-47c7-b1ce-a94d37ea4ad8\") " Oct 09 15:32:10 crc kubenswrapper[4719]: I1009 15:32:10.311771 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d395aa34-bb99-47c7-b1ce-a94d37ea4ad8-utilities" (OuterVolumeSpecName: "utilities") pod "d395aa34-bb99-47c7-b1ce-a94d37ea4ad8" (UID: "d395aa34-bb99-47c7-b1ce-a94d37ea4ad8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:32:10 crc kubenswrapper[4719]: I1009 15:32:10.317814 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d395aa34-bb99-47c7-b1ce-a94d37ea4ad8-kube-api-access-xjc7d" (OuterVolumeSpecName: "kube-api-access-xjc7d") pod "d395aa34-bb99-47c7-b1ce-a94d37ea4ad8" (UID: "d395aa34-bb99-47c7-b1ce-a94d37ea4ad8"). InnerVolumeSpecName "kube-api-access-xjc7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:32:10 crc kubenswrapper[4719]: I1009 15:32:10.393114 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d395aa34-bb99-47c7-b1ce-a94d37ea4ad8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d395aa34-bb99-47c7-b1ce-a94d37ea4ad8" (UID: "d395aa34-bb99-47c7-b1ce-a94d37ea4ad8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:32:10 crc kubenswrapper[4719]: I1009 15:32:10.412773 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d395aa34-bb99-47c7-b1ce-a94d37ea4ad8-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 15:32:10 crc kubenswrapper[4719]: I1009 15:32:10.412804 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d395aa34-bb99-47c7-b1ce-a94d37ea4ad8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 15:32:10 crc kubenswrapper[4719]: I1009 15:32:10.412816 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjc7d\" (UniqueName: \"kubernetes.io/projected/d395aa34-bb99-47c7-b1ce-a94d37ea4ad8-kube-api-access-xjc7d\") on node \"crc\" DevicePath \"\"" Oct 09 15:32:10 crc kubenswrapper[4719]: I1009 15:32:10.932992 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nf2m6" Oct 09 15:32:10 crc kubenswrapper[4719]: I1009 15:32:10.932981 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nf2m6" event={"ID":"d395aa34-bb99-47c7-b1ce-a94d37ea4ad8","Type":"ContainerDied","Data":"3b70176e622c69920a8b6a89c56e6f576e72e956ed056a1349862859666de2f1"} Oct 09 15:32:10 crc kubenswrapper[4719]: I1009 15:32:10.933409 4719 scope.go:117] "RemoveContainer" containerID="1af4f0cf25cbb792f69daa5c17db6fb061970733aaa1a8d702e1fbfcb680b4b9" Oct 09 15:32:10 crc kubenswrapper[4719]: I1009 15:32:10.934628 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6cf9d6bfd4-rw5j8" event={"ID":"a64c087b-46a2-4c1b-abf9-ce21ce6f9688","Type":"ContainerStarted","Data":"aed21a394dd0474cac9bb7b2f8e87cc21de52f0777a2c46d45045e3d8fda1002"} Oct 09 15:32:10 crc kubenswrapper[4719]: I1009 15:32:10.948040 4719 scope.go:117] "RemoveContainer" containerID="a4f75dc933c7392d66e79300a25f1b7b8dda6d7d1c9d6d493b3d0a7a87c47d80" Oct 09 15:32:10 crc kubenswrapper[4719]: I1009 15:32:10.960303 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nf2m6"] Oct 09 15:32:10 crc kubenswrapper[4719]: I1009 15:32:10.970672 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nf2m6"] Oct 09 15:32:10 crc kubenswrapper[4719]: I1009 15:32:10.977860 4719 scope.go:117] "RemoveContainer" containerID="3af30404b48d098343713d1cae402648545fb307b22dfedada728deadd8a5653" Oct 09 15:32:11 crc kubenswrapper[4719]: I1009 15:32:11.168314 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d395aa34-bb99-47c7-b1ce-a94d37ea4ad8" path="/var/lib/kubelet/pods/d395aa34-bb99-47c7-b1ce-a94d37ea4ad8/volumes" Oct 09 15:32:13 crc kubenswrapper[4719]: I1009 15:32:13.959671 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-controller-operator-6cf9d6bfd4-rw5j8" event={"ID":"a64c087b-46a2-4c1b-abf9-ce21ce6f9688","Type":"ContainerStarted","Data":"04ba278c4a93a3dcdfa56d651822d1563f4a1e414526c78d1be82df399ff60ae"} Oct 09 15:32:13 crc kubenswrapper[4719]: I1009 15:32:13.959984 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-6cf9d6bfd4-rw5j8" Oct 09 15:32:13 crc kubenswrapper[4719]: I1009 15:32:13.999585 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-6cf9d6bfd4-rw5j8" podStartSLOduration=2.214925112 podStartE2EDuration="8.999563521s" podCreationTimestamp="2025-10-09 15:32:05 +0000 UTC" firstStartedPulling="2025-10-09 15:32:06.762318161 +0000 UTC m=+832.272029446" lastFinishedPulling="2025-10-09 15:32:13.54695657 +0000 UTC m=+839.056667855" observedRunningTime="2025-10-09 15:32:13.992607808 +0000 UTC m=+839.502319093" watchObservedRunningTime="2025-10-09 15:32:13.999563521 +0000 UTC m=+839.509274816" Oct 09 15:32:16 crc kubenswrapper[4719]: I1009 15:32:16.277870 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-6cf9d6bfd4-rw5j8" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.057371 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-59cdc64769-pnb2g"] Oct 09 15:32:32 crc kubenswrapper[4719]: E1009 15:32:32.058263 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d395aa34-bb99-47c7-b1ce-a94d37ea4ad8" containerName="extract-content" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.058282 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="d395aa34-bb99-47c7-b1ce-a94d37ea4ad8" containerName="extract-content" Oct 09 15:32:32 crc kubenswrapper[4719]: E1009 15:32:32.058304 4719 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d395aa34-bb99-47c7-b1ce-a94d37ea4ad8" containerName="registry-server" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.058314 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="d395aa34-bb99-47c7-b1ce-a94d37ea4ad8" containerName="registry-server" Oct 09 15:32:32 crc kubenswrapper[4719]: E1009 15:32:32.058330 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d395aa34-bb99-47c7-b1ce-a94d37ea4ad8" containerName="extract-utilities" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.058339 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="d395aa34-bb99-47c7-b1ce-a94d37ea4ad8" containerName="extract-utilities" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.058514 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="d395aa34-bb99-47c7-b1ce-a94d37ea4ad8" containerName="registry-server" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.059277 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-pnb2g" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.062778 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64f84fcdbb-kxkjp"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.063995 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-kxkjp" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.065226 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-5fxjw" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.065331 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-zcgl7" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.073405 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-687df44cdb-wzn6r"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.074709 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-wzn6r" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.081023 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-z25pz" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.085377 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-59cdc64769-pnb2g"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.088867 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64f84fcdbb-kxkjp"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.092544 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-687df44cdb-wzn6r"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.125396 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-7bb46cd7d-pvjzc"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.145381 4719 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-pvjzc" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.150447 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-fjccs" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.185428 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7bb46cd7d-pvjzc"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.189866 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-6d9967f8dd-z4mpg"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.191040 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-z4mpg" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.204167 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-p7c7t" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.204269 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d74794d9b-jg6r2"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.205529 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-jg6r2" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.208059 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-kjfxr" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.215729 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-6d9967f8dd-z4mpg"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.239538 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxzn4\" (UniqueName: \"kubernetes.io/projected/ab93ff28-c8ec-4514-bd82-dbab0fe25cee-kube-api-access-nxzn4\") pod \"barbican-operator-controller-manager-64f84fcdbb-kxkjp\" (UID: \"ab93ff28-c8ec-4514-bd82-dbab0fe25cee\") " pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-kxkjp" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.239629 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb6fm\" (UniqueName: \"kubernetes.io/projected/f013ff43-3cb6-47f5-bc35-a4bf02143db0-kube-api-access-cb6fm\") pod \"designate-operator-controller-manager-687df44cdb-wzn6r\" (UID: \"f013ff43-3cb6-47f5-bc35-a4bf02143db0\") " pod="openstack-operators/designate-operator-controller-manager-687df44cdb-wzn6r" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.239676 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2k9c\" (UniqueName: \"kubernetes.io/projected/f4b78ea6-51d8-4a7a-b5d3-cc4bdc3b5ba4-kube-api-access-q2k9c\") pod \"cinder-operator-controller-manager-59cdc64769-pnb2g\" (UID: \"f4b78ea6-51d8-4a7a-b5d3-cc4bdc3b5ba4\") " pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-pnb2g" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 
15:32:32.243850 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d74794d9b-jg6r2"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.274413 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-585fc5b659-jzwqf"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.275718 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-jzwqf" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.277525 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-hdsvr" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.280588 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.296328 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-74cb5cbc49-j945k"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.297415 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-j945k" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.304028 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-dnts4" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.309488 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-585fc5b659-jzwqf"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.316109 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-74cb5cbc49-j945k"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.337462 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-ddb98f99b-6zjhd"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.338507 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-6zjhd" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.344724 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-59578bc799-tfk7f"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.345798 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-59578bc799-tfk7f" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.346596 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hlgg\" (UniqueName: \"kubernetes.io/projected/3bc5e8dd-bc95-4b65-afda-a821512a89dd-kube-api-access-2hlgg\") pod \"infra-operator-controller-manager-585fc5b659-jzwqf\" (UID: \"3bc5e8dd-bc95-4b65-afda-a821512a89dd\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-jzwqf" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.346745 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcvt7\" (UniqueName: \"kubernetes.io/projected/8b59c5dc-f309-48cc-9c66-7a5c42050f8e-kube-api-access-jcvt7\") pod \"ironic-operator-controller-manager-74cb5cbc49-j945k\" (UID: \"8b59c5dc-f309-48cc-9c66-7a5c42050f8e\") " pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-j945k" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.346860 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrwcx\" (UniqueName: \"kubernetes.io/projected/582c5c2a-a5b2-43bf-bbdb-4c3fb1b21c09-kube-api-access-lrwcx\") pod \"heat-operator-controller-manager-6d9967f8dd-z4mpg\" (UID: \"582c5c2a-a5b2-43bf-bbdb-4c3fb1b21c09\") " pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-z4mpg" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.346958 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb6fm\" (UniqueName: \"kubernetes.io/projected/f013ff43-3cb6-47f5-bc35-a4bf02143db0-kube-api-access-cb6fm\") pod \"designate-operator-controller-manager-687df44cdb-wzn6r\" (UID: \"f013ff43-3cb6-47f5-bc35-a4bf02143db0\") " pod="openstack-operators/designate-operator-controller-manager-687df44cdb-wzn6r" Oct 
09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.347077 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmbdz\" (UniqueName: \"kubernetes.io/projected/651b9dd5-bce9-4ca0-b6f7-cca0c3fb30eb-kube-api-access-kmbdz\") pod \"keystone-operator-controller-manager-ddb98f99b-6zjhd\" (UID: \"651b9dd5-bce9-4ca0-b6f7-cca0c3fb30eb\") " pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-6zjhd" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.347217 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-c4bdm" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.347225 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2k9c\" (UniqueName: \"kubernetes.io/projected/f4b78ea6-51d8-4a7a-b5d3-cc4bdc3b5ba4-kube-api-access-q2k9c\") pod \"cinder-operator-controller-manager-59cdc64769-pnb2g\" (UID: \"f4b78ea6-51d8-4a7a-b5d3-cc4bdc3b5ba4\") " pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-pnb2g" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.347477 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxzn4\" (UniqueName: \"kubernetes.io/projected/ab93ff28-c8ec-4514-bd82-dbab0fe25cee-kube-api-access-nxzn4\") pod \"barbican-operator-controller-manager-64f84fcdbb-kxkjp\" (UID: \"ab93ff28-c8ec-4514-bd82-dbab0fe25cee\") " pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-kxkjp" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.347531 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt8m6\" (UniqueName: \"kubernetes.io/projected/64ce70f3-641d-4dfd-811e-c786365c9859-kube-api-access-qt8m6\") pod \"horizon-operator-controller-manager-6d74794d9b-jg6r2\" (UID: 
\"64ce70f3-641d-4dfd-811e-c786365c9859\") " pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-jg6r2" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.347571 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bc5e8dd-bc95-4b65-afda-a821512a89dd-cert\") pod \"infra-operator-controller-manager-585fc5b659-jzwqf\" (UID: \"3bc5e8dd-bc95-4b65-afda-a821512a89dd\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-jzwqf" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.347640 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz9sx\" (UniqueName: \"kubernetes.io/projected/d5e1695b-e7fb-4c23-9848-c6abacde588c-kube-api-access-rz9sx\") pod \"glance-operator-controller-manager-7bb46cd7d-pvjzc\" (UID: \"d5e1695b-e7fb-4c23-9848-c6abacde588c\") " pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-pvjzc" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.348016 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-679fj" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.348266 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-ddb98f99b-6zjhd"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.390557 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2k9c\" (UniqueName: \"kubernetes.io/projected/f4b78ea6-51d8-4a7a-b5d3-cc4bdc3b5ba4-kube-api-access-q2k9c\") pod \"cinder-operator-controller-manager-59cdc64769-pnb2g\" (UID: \"f4b78ea6-51d8-4a7a-b5d3-cc4bdc3b5ba4\") " pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-pnb2g" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.394640 4719 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-cb6fm\" (UniqueName: \"kubernetes.io/projected/f013ff43-3cb6-47f5-bc35-a4bf02143db0-kube-api-access-cb6fm\") pod \"designate-operator-controller-manager-687df44cdb-wzn6r\" (UID: \"f013ff43-3cb6-47f5-bc35-a4bf02143db0\") " pod="openstack-operators/designate-operator-controller-manager-687df44cdb-wzn6r" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.396432 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-59578bc799-tfk7f"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.396702 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-pnb2g" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.397969 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxzn4\" (UniqueName: \"kubernetes.io/projected/ab93ff28-c8ec-4514-bd82-dbab0fe25cee-kube-api-access-nxzn4\") pod \"barbican-operator-controller-manager-64f84fcdbb-kxkjp\" (UID: \"ab93ff28-c8ec-4514-bd82-dbab0fe25cee\") " pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-kxkjp" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.414382 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5777b4f897-zj22f"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.415277 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-kxkjp" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.415444 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-zj22f" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.418556 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-jmxxn" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.421884 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-wzn6r" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.426822 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-57bb74c7bf-tmxgv"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.428401 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-tmxgv" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.433531 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-797d478b46-qbtsl"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.435037 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-qbtsl" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.441772 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-4qc7c" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.445714 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-kdrfq" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.451130 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ncnx\" (UniqueName: \"kubernetes.io/projected/6b82d858-736f-487f-ba35-c1478301b229-kube-api-access-9ncnx\") pod \"nova-operator-controller-manager-57bb74c7bf-tmxgv\" (UID: \"6b82d858-736f-487f-ba35-c1478301b229\") " pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-tmxgv" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.451454 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbr4d\" (UniqueName: \"kubernetes.io/projected/14a3f87a-25c5-476e-8379-0b15d3511315-kube-api-access-jbr4d\") pod \"neutron-operator-controller-manager-797d478b46-qbtsl\" (UID: \"14a3f87a-25c5-476e-8379-0b15d3511315\") " pod="openstack-operators/neutron-operator-controller-manager-797d478b46-qbtsl" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.451633 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmbdz\" (UniqueName: \"kubernetes.io/projected/651b9dd5-bce9-4ca0-b6f7-cca0c3fb30eb-kube-api-access-kmbdz\") pod \"keystone-operator-controller-manager-ddb98f99b-6zjhd\" (UID: \"651b9dd5-bce9-4ca0-b6f7-cca0c3fb30eb\") " pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-6zjhd" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.451721 4719 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l92qs\" (UniqueName: \"kubernetes.io/projected/2292f494-d606-40b2-bb8b-7dcc6e9dfeb4-kube-api-access-l92qs\") pod \"manila-operator-controller-manager-59578bc799-tfk7f\" (UID: \"2292f494-d606-40b2-bb8b-7dcc6e9dfeb4\") " pod="openstack-operators/manila-operator-controller-manager-59578bc799-tfk7f" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.451772 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt8m6\" (UniqueName: \"kubernetes.io/projected/64ce70f3-641d-4dfd-811e-c786365c9859-kube-api-access-qt8m6\") pod \"horizon-operator-controller-manager-6d74794d9b-jg6r2\" (UID: \"64ce70f3-641d-4dfd-811e-c786365c9859\") " pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-jg6r2" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.451800 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bc5e8dd-bc95-4b65-afda-a821512a89dd-cert\") pod \"infra-operator-controller-manager-585fc5b659-jzwqf\" (UID: \"3bc5e8dd-bc95-4b65-afda-a821512a89dd\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-jzwqf" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.451845 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q2p9\" (UniqueName: \"kubernetes.io/projected/bc8d9b2a-7a74-40f1-8a70-e8f0013fad38-kube-api-access-6q2p9\") pod \"mariadb-operator-controller-manager-5777b4f897-zj22f\" (UID: \"bc8d9b2a-7a74-40f1-8a70-e8f0013fad38\") " pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-zj22f" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.451868 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz9sx\" (UniqueName: 
\"kubernetes.io/projected/d5e1695b-e7fb-4c23-9848-c6abacde588c-kube-api-access-rz9sx\") pod \"glance-operator-controller-manager-7bb46cd7d-pvjzc\" (UID: \"d5e1695b-e7fb-4c23-9848-c6abacde588c\") " pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-pvjzc" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.451930 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hlgg\" (UniqueName: \"kubernetes.io/projected/3bc5e8dd-bc95-4b65-afda-a821512a89dd-kube-api-access-2hlgg\") pod \"infra-operator-controller-manager-585fc5b659-jzwqf\" (UID: \"3bc5e8dd-bc95-4b65-afda-a821512a89dd\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-jzwqf" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.451960 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcvt7\" (UniqueName: \"kubernetes.io/projected/8b59c5dc-f309-48cc-9c66-7a5c42050f8e-kube-api-access-jcvt7\") pod \"ironic-operator-controller-manager-74cb5cbc49-j945k\" (UID: \"8b59c5dc-f309-48cc-9c66-7a5c42050f8e\") " pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-j945k" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.451991 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrwcx\" (UniqueName: \"kubernetes.io/projected/582c5c2a-a5b2-43bf-bbdb-4c3fb1b21c09-kube-api-access-lrwcx\") pod \"heat-operator-controller-manager-6d9967f8dd-z4mpg\" (UID: \"582c5c2a-a5b2-43bf-bbdb-4c3fb1b21c09\") " pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-z4mpg" Oct 09 15:32:32 crc kubenswrapper[4719]: E1009 15:32:32.452623 4719 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 09 15:32:32 crc kubenswrapper[4719]: E1009 15:32:32.452675 4719 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/3bc5e8dd-bc95-4b65-afda-a821512a89dd-cert podName:3bc5e8dd-bc95-4b65-afda-a821512a89dd nodeName:}" failed. No retries permitted until 2025-10-09 15:32:32.952660546 +0000 UTC m=+858.462371831 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3bc5e8dd-bc95-4b65-afda-a821512a89dd-cert") pod "infra-operator-controller-manager-585fc5b659-jzwqf" (UID: "3bc5e8dd-bc95-4b65-afda-a821512a89dd") : secret "infra-operator-webhook-server-cert" not found Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.481078 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5777b4f897-zj22f"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.483723 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz9sx\" (UniqueName: \"kubernetes.io/projected/d5e1695b-e7fb-4c23-9848-c6abacde588c-kube-api-access-rz9sx\") pod \"glance-operator-controller-manager-7bb46cd7d-pvjzc\" (UID: \"d5e1695b-e7fb-4c23-9848-c6abacde588c\") " pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-pvjzc" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.492972 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrwcx\" (UniqueName: \"kubernetes.io/projected/582c5c2a-a5b2-43bf-bbdb-4c3fb1b21c09-kube-api-access-lrwcx\") pod \"heat-operator-controller-manager-6d9967f8dd-z4mpg\" (UID: \"582c5c2a-a5b2-43bf-bbdb-4c3fb1b21c09\") " pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-z4mpg" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.493096 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hlgg\" (UniqueName: \"kubernetes.io/projected/3bc5e8dd-bc95-4b65-afda-a821512a89dd-kube-api-access-2hlgg\") pod \"infra-operator-controller-manager-585fc5b659-jzwqf\" (UID: 
\"3bc5e8dd-bc95-4b65-afda-a821512a89dd\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-jzwqf" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.494692 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmbdz\" (UniqueName: \"kubernetes.io/projected/651b9dd5-bce9-4ca0-b6f7-cca0c3fb30eb-kube-api-access-kmbdz\") pod \"keystone-operator-controller-manager-ddb98f99b-6zjhd\" (UID: \"651b9dd5-bce9-4ca0-b6f7-cca0c3fb30eb\") " pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-6zjhd" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.503489 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-57bb74c7bf-tmxgv"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.509326 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcvt7\" (UniqueName: \"kubernetes.io/projected/8b59c5dc-f309-48cc-9c66-7a5c42050f8e-kube-api-access-jcvt7\") pod \"ironic-operator-controller-manager-74cb5cbc49-j945k\" (UID: \"8b59c5dc-f309-48cc-9c66-7a5c42050f8e\") " pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-j945k" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.513797 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-z4mpg" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.515284 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt8m6\" (UniqueName: \"kubernetes.io/projected/64ce70f3-641d-4dfd-811e-c786365c9859-kube-api-access-qt8m6\") pod \"horizon-operator-controller-manager-6d74794d9b-jg6r2\" (UID: \"64ce70f3-641d-4dfd-811e-c786365c9859\") " pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-jg6r2" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.519678 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-d284h"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.520872 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-d284h" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.526029 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-544st" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.534511 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-jg6r2" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.535637 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-797d478b46-qbtsl"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.541771 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-d284h"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.559040 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6q2p9\" (UniqueName: \"kubernetes.io/projected/bc8d9b2a-7a74-40f1-8a70-e8f0013fad38-kube-api-access-6q2p9\") pod \"mariadb-operator-controller-manager-5777b4f897-zj22f\" (UID: \"bc8d9b2a-7a74-40f1-8a70-e8f0013fad38\") " pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-zj22f" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.559145 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ncnx\" (UniqueName: \"kubernetes.io/projected/6b82d858-736f-487f-ba35-c1478301b229-kube-api-access-9ncnx\") pod \"nova-operator-controller-manager-57bb74c7bf-tmxgv\" (UID: \"6b82d858-736f-487f-ba35-c1478301b229\") " pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-tmxgv" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.559181 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccwvb\" (UniqueName: \"kubernetes.io/projected/308fe096-8aff-4a3b-a83a-bb2b1ef8c5df-kube-api-access-ccwvb\") pod \"octavia-operator-controller-manager-6d7c7ddf95-d284h\" (UID: \"308fe096-8aff-4a3b-a83a-bb2b1ef8c5df\") " pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-d284h" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.559231 4719 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jbr4d\" (UniqueName: \"kubernetes.io/projected/14a3f87a-25c5-476e-8379-0b15d3511315-kube-api-access-jbr4d\") pod \"neutron-operator-controller-manager-797d478b46-qbtsl\" (UID: \"14a3f87a-25c5-476e-8379-0b15d3511315\") " pod="openstack-operators/neutron-operator-controller-manager-797d478b46-qbtsl" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.559278 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l92qs\" (UniqueName: \"kubernetes.io/projected/2292f494-d606-40b2-bb8b-7dcc6e9dfeb4-kube-api-access-l92qs\") pod \"manila-operator-controller-manager-59578bc799-tfk7f\" (UID: \"2292f494-d606-40b2-bb8b-7dcc6e9dfeb4\") " pod="openstack-operators/manila-operator-controller-manager-59578bc799-tfk7f" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.578116 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-869cc7797f-k8dck"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.580534 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l92qs\" (UniqueName: \"kubernetes.io/projected/2292f494-d606-40b2-bb8b-7dcc6e9dfeb4-kube-api-access-l92qs\") pod \"manila-operator-controller-manager-59578bc799-tfk7f\" (UID: \"2292f494-d606-40b2-bb8b-7dcc6e9dfeb4\") " pod="openstack-operators/manila-operator-controller-manager-59578bc799-tfk7f" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.581109 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q2p9\" (UniqueName: \"kubernetes.io/projected/bc8d9b2a-7a74-40f1-8a70-e8f0013fad38-kube-api-access-6q2p9\") pod \"mariadb-operator-controller-manager-5777b4f897-zj22f\" (UID: \"bc8d9b2a-7a74-40f1-8a70-e8f0013fad38\") " pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-zj22f" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.581606 4719 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dx5hrk"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.581640 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbr4d\" (UniqueName: \"kubernetes.io/projected/14a3f87a-25c5-476e-8379-0b15d3511315-kube-api-access-jbr4d\") pod \"neutron-operator-controller-manager-797d478b46-qbtsl\" (UID: \"14a3f87a-25c5-476e-8379-0b15d3511315\") " pod="openstack-operators/neutron-operator-controller-manager-797d478b46-qbtsl" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.581836 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-k8dck" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.582013 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ncnx\" (UniqueName: \"kubernetes.io/projected/6b82d858-736f-487f-ba35-c1478301b229-kube-api-access-9ncnx\") pod \"nova-operator-controller-manager-57bb74c7bf-tmxgv\" (UID: \"6b82d858-736f-487f-ba35-c1478301b229\") " pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-tmxgv" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.583625 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-89zct" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.586243 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-869cc7797f-k8dck"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.586393 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dx5hrk" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.588177 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-qbtsl" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.588577 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.588707 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-bk5jf" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.602904 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-664664cb68-2td8x"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.625297 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-j945k" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.633573 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-664664cb68-2td8x" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.676022 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-vtz7v" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.681744 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-6zjhd" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.703294 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-59578bc799-tfk7f" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.703807 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5584dd28-d59b-41bf-b24a-ec18d01029e1-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dx5hrk\" (UID: \"5584dd28-d59b-41bf-b24a-ec18d01029e1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dx5hrk" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.709282 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvpws\" (UniqueName: \"kubernetes.io/projected/5584dd28-d59b-41bf-b24a-ec18d01029e1-kube-api-access-cvpws\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dx5hrk\" (UID: \"5584dd28-d59b-41bf-b24a-ec18d01029e1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dx5hrk" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.709435 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccwvb\" (UniqueName: \"kubernetes.io/projected/308fe096-8aff-4a3b-a83a-bb2b1ef8c5df-kube-api-access-ccwvb\") pod \"octavia-operator-controller-manager-6d7c7ddf95-d284h\" (UID: \"308fe096-8aff-4a3b-a83a-bb2b1ef8c5df\") " pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-d284h" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.752341 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccwvb\" (UniqueName: \"kubernetes.io/projected/308fe096-8aff-4a3b-a83a-bb2b1ef8c5df-kube-api-access-ccwvb\") pod \"octavia-operator-controller-manager-6d7c7ddf95-d284h\" (UID: \"308fe096-8aff-4a3b-a83a-bb2b1ef8c5df\") " pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-d284h" Oct 09 15:32:32 
crc kubenswrapper[4719]: I1009 15:32:32.776612 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-664664cb68-2td8x"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.780116 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-pvjzc" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.807402 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-lh9st"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.808619 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-lh9st" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.810986 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgk6c\" (UniqueName: \"kubernetes.io/projected/08380711-65b1-4957-80ba-36c2f064e618-kube-api-access-dgk6c\") pod \"placement-operator-controller-manager-664664cb68-2td8x\" (UID: \"08380711-65b1-4957-80ba-36c2f064e618\") " pod="openstack-operators/placement-operator-controller-manager-664664cb68-2td8x" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.811057 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvpws\" (UniqueName: \"kubernetes.io/projected/5584dd28-d59b-41bf-b24a-ec18d01029e1-kube-api-access-cvpws\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dx5hrk\" (UID: \"5584dd28-d59b-41bf-b24a-ec18d01029e1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dx5hrk" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.811098 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/5584dd28-d59b-41bf-b24a-ec18d01029e1-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dx5hrk\" (UID: \"5584dd28-d59b-41bf-b24a-ec18d01029e1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dx5hrk" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.811118 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh47j\" (UniqueName: \"kubernetes.io/projected/607972ec-63ef-43a7-a1ed-0aab9fffc680-kube-api-access-jh47j\") pod \"ovn-operator-controller-manager-869cc7797f-k8dck\" (UID: \"607972ec-63ef-43a7-a1ed-0aab9fffc680\") " pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-k8dck" Oct 09 15:32:32 crc kubenswrapper[4719]: E1009 15:32:32.811478 4719 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 09 15:32:32 crc kubenswrapper[4719]: E1009 15:32:32.811516 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5584dd28-d59b-41bf-b24a-ec18d01029e1-cert podName:5584dd28-d59b-41bf-b24a-ec18d01029e1 nodeName:}" failed. No retries permitted until 2025-10-09 15:32:33.31150251 +0000 UTC m=+858.821213795 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5584dd28-d59b-41bf-b24a-ec18d01029e1-cert") pod "openstack-baremetal-operator-controller-manager-6cc7fb757dx5hrk" (UID: "5584dd28-d59b-41bf-b24a-ec18d01029e1") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.815569 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-8x9dn" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.834394 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dx5hrk"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.859448 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-zj22f" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.860486 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-578874c84d-w6fj7"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.864581 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-w6fj7" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.870544 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-95fxt" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.876628 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvpws\" (UniqueName: \"kubernetes.io/projected/5584dd28-d59b-41bf-b24a-ec18d01029e1-kube-api-access-cvpws\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dx5hrk\" (UID: \"5584dd28-d59b-41bf-b24a-ec18d01029e1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dx5hrk" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.878089 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-tmxgv" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.892641 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-lh9st"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.901398 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-578874c84d-w6fj7"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.901993 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-d284h" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.910645 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-ffcdd6c94-q9cm2"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.912131 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-q9cm2" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.916739 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-ffcdd6c94-q9cm2"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.918751 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-9wnwm" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.922635 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jh47j\" (UniqueName: \"kubernetes.io/projected/607972ec-63ef-43a7-a1ed-0aab9fffc680-kube-api-access-jh47j\") pod \"ovn-operator-controller-manager-869cc7797f-k8dck\" (UID: \"607972ec-63ef-43a7-a1ed-0aab9fffc680\") " pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-k8dck" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.922734 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgk6c\" (UniqueName: \"kubernetes.io/projected/08380711-65b1-4957-80ba-36c2f064e618-kube-api-access-dgk6c\") pod \"placement-operator-controller-manager-664664cb68-2td8x\" (UID: \"08380711-65b1-4957-80ba-36c2f064e618\") " pod="openstack-operators/placement-operator-controller-manager-664664cb68-2td8x" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.922777 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjpgv\" (UniqueName: \"kubernetes.io/projected/6d60ce50-53c4-47c1-b222-88b92c43fd4d-kube-api-access-bjpgv\") pod \"swift-operator-controller-manager-5f4d5dfdc6-lh9st\" (UID: \"6d60ce50-53c4-47c1-b222-88b92c43fd4d\") " pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-lh9st" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.938481 4719 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack-operators/watcher-operator-controller-manager-cc79478c-885gj"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.939620 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-cc79478c-885gj" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.945549 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-gwbgp" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.950967 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh47j\" (UniqueName: \"kubernetes.io/projected/607972ec-63ef-43a7-a1ed-0aab9fffc680-kube-api-access-jh47j\") pod \"ovn-operator-controller-manager-869cc7797f-k8dck\" (UID: \"607972ec-63ef-43a7-a1ed-0aab9fffc680\") " pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-k8dck" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.958058 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-cc79478c-885gj"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.959401 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgk6c\" (UniqueName: \"kubernetes.io/projected/08380711-65b1-4957-80ba-36c2f064e618-kube-api-access-dgk6c\") pod \"placement-operator-controller-manager-664664cb68-2td8x\" (UID: \"08380711-65b1-4957-80ba-36c2f064e618\") " pod="openstack-operators/placement-operator-controller-manager-664664cb68-2td8x" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.968741 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-664664cb68-2td8x" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.995846 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-67f69c4d95-5p5fq"] Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.997335 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-67f69c4d95-5p5fq" Oct 09 15:32:32 crc kubenswrapper[4719]: I1009 15:32:32.999191 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.003694 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-67f69c4d95-5p5fq"] Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.030179 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-46v88" Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.032639 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6674n\" (UniqueName: \"kubernetes.io/projected/288a232e-38ff-44b7-9fda-738becefc8d7-kube-api-access-6674n\") pod \"test-operator-controller-manager-ffcdd6c94-q9cm2\" (UID: \"288a232e-38ff-44b7-9fda-738becefc8d7\") " pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-q9cm2" Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.032683 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/406c7514-3092-45dc-abde-352acbfa0108-cert\") pod \"openstack-operator-controller-manager-67f69c4d95-5p5fq\" (UID: \"406c7514-3092-45dc-abde-352acbfa0108\") " 
pod="openstack-operators/openstack-operator-controller-manager-67f69c4d95-5p5fq" Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.032709 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w49nx\" (UniqueName: \"kubernetes.io/projected/c56b1641-8023-4761-a55f-763dfe5f7c4f-kube-api-access-w49nx\") pod \"telemetry-operator-controller-manager-578874c84d-w6fj7\" (UID: \"c56b1641-8023-4761-a55f-763dfe5f7c4f\") " pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-w6fj7" Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.032746 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bc5e8dd-bc95-4b65-afda-a821512a89dd-cert\") pod \"infra-operator-controller-manager-585fc5b659-jzwqf\" (UID: \"3bc5e8dd-bc95-4b65-afda-a821512a89dd\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-jzwqf" Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.032811 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjpgv\" (UniqueName: \"kubernetes.io/projected/6d60ce50-53c4-47c1-b222-88b92c43fd4d-kube-api-access-bjpgv\") pod \"swift-operator-controller-manager-5f4d5dfdc6-lh9st\" (UID: \"6d60ce50-53c4-47c1-b222-88b92c43fd4d\") " pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-lh9st" Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.032838 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt2wv\" (UniqueName: \"kubernetes.io/projected/44ee4b27-7bdd-4a5e-98ed-1b8b5f01b54f-kube-api-access-pt2wv\") pod \"watcher-operator-controller-manager-cc79478c-885gj\" (UID: \"44ee4b27-7bdd-4a5e-98ed-1b8b5f01b54f\") " pod="openstack-operators/watcher-operator-controller-manager-cc79478c-885gj" Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.032862 4719 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cndr\" (UniqueName: \"kubernetes.io/projected/406c7514-3092-45dc-abde-352acbfa0108-kube-api-access-8cndr\") pod \"openstack-operator-controller-manager-67f69c4d95-5p5fq\" (UID: \"406c7514-3092-45dc-abde-352acbfa0108\") " pod="openstack-operators/openstack-operator-controller-manager-67f69c4d95-5p5fq" Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.054728 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-r4m97"] Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.061696 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bc5e8dd-bc95-4b65-afda-a821512a89dd-cert\") pod \"infra-operator-controller-manager-585fc5b659-jzwqf\" (UID: \"3bc5e8dd-bc95-4b65-afda-a821512a89dd\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-jzwqf" Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.076156 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-r4m97" Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.103081 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-rf9pn" Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.116337 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjpgv\" (UniqueName: \"kubernetes.io/projected/6d60ce50-53c4-47c1-b222-88b92c43fd4d-kube-api-access-bjpgv\") pod \"swift-operator-controller-manager-5f4d5dfdc6-lh9st\" (UID: \"6d60ce50-53c4-47c1-b222-88b92c43fd4d\") " pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-lh9st" Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.129691 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-r4m97"] Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.137646 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6674n\" (UniqueName: \"kubernetes.io/projected/288a232e-38ff-44b7-9fda-738becefc8d7-kube-api-access-6674n\") pod \"test-operator-controller-manager-ffcdd6c94-q9cm2\" (UID: \"288a232e-38ff-44b7-9fda-738becefc8d7\") " pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-q9cm2" Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.137698 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/406c7514-3092-45dc-abde-352acbfa0108-cert\") pod \"openstack-operator-controller-manager-67f69c4d95-5p5fq\" (UID: \"406c7514-3092-45dc-abde-352acbfa0108\") " pod="openstack-operators/openstack-operator-controller-manager-67f69c4d95-5p5fq" Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.137719 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-w49nx\" (UniqueName: \"kubernetes.io/projected/c56b1641-8023-4761-a55f-763dfe5f7c4f-kube-api-access-w49nx\") pod \"telemetry-operator-controller-manager-578874c84d-w6fj7\" (UID: \"c56b1641-8023-4761-a55f-763dfe5f7c4f\") " pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-w6fj7" Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.137786 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt2wv\" (UniqueName: \"kubernetes.io/projected/44ee4b27-7bdd-4a5e-98ed-1b8b5f01b54f-kube-api-access-pt2wv\") pod \"watcher-operator-controller-manager-cc79478c-885gj\" (UID: \"44ee4b27-7bdd-4a5e-98ed-1b8b5f01b54f\") " pod="openstack-operators/watcher-operator-controller-manager-cc79478c-885gj" Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.137802 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cndr\" (UniqueName: \"kubernetes.io/projected/406c7514-3092-45dc-abde-352acbfa0108-kube-api-access-8cndr\") pod \"openstack-operator-controller-manager-67f69c4d95-5p5fq\" (UID: \"406c7514-3092-45dc-abde-352acbfa0108\") " pod="openstack-operators/openstack-operator-controller-manager-67f69c4d95-5p5fq" Oct 09 15:32:33 crc kubenswrapper[4719]: E1009 15:32:33.147740 4719 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 09 15:32:33 crc kubenswrapper[4719]: E1009 15:32:33.147825 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/406c7514-3092-45dc-abde-352acbfa0108-cert podName:406c7514-3092-45dc-abde-352acbfa0108 nodeName:}" failed. No retries permitted until 2025-10-09 15:32:33.647806828 +0000 UTC m=+859.157518113 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/406c7514-3092-45dc-abde-352acbfa0108-cert") pod "openstack-operator-controller-manager-67f69c4d95-5p5fq" (UID: "406c7514-3092-45dc-abde-352acbfa0108") : secret "webhook-server-cert" not found Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.173235 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt2wv\" (UniqueName: \"kubernetes.io/projected/44ee4b27-7bdd-4a5e-98ed-1b8b5f01b54f-kube-api-access-pt2wv\") pod \"watcher-operator-controller-manager-cc79478c-885gj\" (UID: \"44ee4b27-7bdd-4a5e-98ed-1b8b5f01b54f\") " pod="openstack-operators/watcher-operator-controller-manager-cc79478c-885gj" Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.196674 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-lh9st" Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.197236 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-jzwqf" Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.202866 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w49nx\" (UniqueName: \"kubernetes.io/projected/c56b1641-8023-4761-a55f-763dfe5f7c4f-kube-api-access-w49nx\") pod \"telemetry-operator-controller-manager-578874c84d-w6fj7\" (UID: \"c56b1641-8023-4761-a55f-763dfe5f7c4f\") " pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-w6fj7" Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.213934 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cndr\" (UniqueName: \"kubernetes.io/projected/406c7514-3092-45dc-abde-352acbfa0108-kube-api-access-8cndr\") pod \"openstack-operator-controller-manager-67f69c4d95-5p5fq\" (UID: \"406c7514-3092-45dc-abde-352acbfa0108\") " pod="openstack-operators/openstack-operator-controller-manager-67f69c4d95-5p5fq" Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.227905 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6674n\" (UniqueName: \"kubernetes.io/projected/288a232e-38ff-44b7-9fda-738becefc8d7-kube-api-access-6674n\") pod \"test-operator-controller-manager-ffcdd6c94-q9cm2\" (UID: \"288a232e-38ff-44b7-9fda-738becefc8d7\") " pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-q9cm2" Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.240552 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-k8dck" Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.252983 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jwb6\" (UniqueName: \"kubernetes.io/projected/6776ccc8-9114-46e5-a2a2-699f8917bfac-kube-api-access-2jwb6\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-r4m97\" (UID: \"6776ccc8-9114-46e5-a2a2-699f8917bfac\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-r4m97" Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.280756 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-w6fj7" Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.298611 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-q9cm2" Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.354705 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5584dd28-d59b-41bf-b24a-ec18d01029e1-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dx5hrk\" (UID: \"5584dd28-d59b-41bf-b24a-ec18d01029e1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dx5hrk" Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.354785 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jwb6\" (UniqueName: \"kubernetes.io/projected/6776ccc8-9114-46e5-a2a2-699f8917bfac-kube-api-access-2jwb6\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-r4m97\" (UID: \"6776ccc8-9114-46e5-a2a2-699f8917bfac\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-r4m97" Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.360113 4719 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5584dd28-d59b-41bf-b24a-ec18d01029e1-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dx5hrk\" (UID: \"5584dd28-d59b-41bf-b24a-ec18d01029e1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dx5hrk" Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.386083 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jwb6\" (UniqueName: \"kubernetes.io/projected/6776ccc8-9114-46e5-a2a2-699f8917bfac-kube-api-access-2jwb6\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-r4m97\" (UID: \"6776ccc8-9114-46e5-a2a2-699f8917bfac\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-r4m97" Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.426511 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-687df44cdb-wzn6r"] Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.443845 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-cc79478c-885gj" Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.499670 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-r4m97" Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.559823 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-59cdc64769-pnb2g"] Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.569667 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dx5hrk" Oct 09 15:32:33 crc kubenswrapper[4719]: W1009 15:32:33.634570 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b78ea6_51d8_4a7a_b5d3_cc4bdc3b5ba4.slice/crio-aac3a21e0035828a3440fe96939b7df5d39c3ee817272f492ca4e6714b385d34 WatchSource:0}: Error finding container aac3a21e0035828a3440fe96939b7df5d39c3ee817272f492ca4e6714b385d34: Status 404 returned error can't find the container with id aac3a21e0035828a3440fe96939b7df5d39c3ee817272f492ca4e6714b385d34 Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.663008 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/406c7514-3092-45dc-abde-352acbfa0108-cert\") pod \"openstack-operator-controller-manager-67f69c4d95-5p5fq\" (UID: \"406c7514-3092-45dc-abde-352acbfa0108\") " pod="openstack-operators/openstack-operator-controller-manager-67f69c4d95-5p5fq" Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.669295 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/406c7514-3092-45dc-abde-352acbfa0108-cert\") pod \"openstack-operator-controller-manager-67f69c4d95-5p5fq\" (UID: \"406c7514-3092-45dc-abde-352acbfa0108\") " pod="openstack-operators/openstack-operator-controller-manager-67f69c4d95-5p5fq" Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.748914 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-67f69c4d95-5p5fq" Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.968896 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d74794d9b-jg6r2"] Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.975962 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-74cb5cbc49-j945k"] Oct 09 15:32:33 crc kubenswrapper[4719]: W1009 15:32:33.980673 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b59c5dc_f309_48cc_9c66_7a5c42050f8e.slice/crio-53e9329c549711d7ba0456370ac2d951b45fa6565f948c398b9633e9d66b9c37 WatchSource:0}: Error finding container 53e9329c549711d7ba0456370ac2d951b45fa6565f948c398b9633e9d66b9c37: Status 404 returned error can't find the container with id 53e9329c549711d7ba0456370ac2d951b45fa6565f948c398b9633e9d66b9c37 Oct 09 15:32:33 crc kubenswrapper[4719]: I1009 15:32:33.992637 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-59578bc799-tfk7f"] Oct 09 15:32:34 crc kubenswrapper[4719]: W1009 15:32:34.004542 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2292f494_d606_40b2_bb8b_7dcc6e9dfeb4.slice/crio-51d18406426a9301a943a0a648ae46bacb7d7b4bb084d387567aa9dc4273d7ae WatchSource:0}: Error finding container 51d18406426a9301a943a0a648ae46bacb7d7b4bb084d387567aa9dc4273d7ae: Status 404 returned error can't find the container with id 51d18406426a9301a943a0a648ae46bacb7d7b4bb084d387567aa9dc4273d7ae Oct 09 15:32:34 crc kubenswrapper[4719]: I1009 15:32:34.008867 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64f84fcdbb-kxkjp"] Oct 09 15:32:34 crc 
kubenswrapper[4719]: I1009 15:32:34.087637 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-wzn6r" event={"ID":"f013ff43-3cb6-47f5-bc35-a4bf02143db0","Type":"ContainerStarted","Data":"77e00758dec22c045c87e23125da4cdd0e1413f3165b335a684e8d2f10b58add"} Oct 09 15:32:34 crc kubenswrapper[4719]: I1009 15:32:34.088694 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-jg6r2" event={"ID":"64ce70f3-641d-4dfd-811e-c786365c9859","Type":"ContainerStarted","Data":"a4615f8f4ccb7dc96b9574c29ed53fa49142b10942b3a550af2ce2e1c39bae74"} Oct 09 15:32:34 crc kubenswrapper[4719]: I1009 15:32:34.090460 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-59578bc799-tfk7f" event={"ID":"2292f494-d606-40b2-bb8b-7dcc6e9dfeb4","Type":"ContainerStarted","Data":"51d18406426a9301a943a0a648ae46bacb7d7b4bb084d387567aa9dc4273d7ae"} Oct 09 15:32:34 crc kubenswrapper[4719]: I1009 15:32:34.091473 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-pnb2g" event={"ID":"f4b78ea6-51d8-4a7a-b5d3-cc4bdc3b5ba4","Type":"ContainerStarted","Data":"aac3a21e0035828a3440fe96939b7df5d39c3ee817272f492ca4e6714b385d34"} Oct 09 15:32:34 crc kubenswrapper[4719]: I1009 15:32:34.092621 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-j945k" event={"ID":"8b59c5dc-f309-48cc-9c66-7a5c42050f8e","Type":"ContainerStarted","Data":"53e9329c549711d7ba0456370ac2d951b45fa6565f948c398b9633e9d66b9c37"} Oct 09 15:32:34 crc kubenswrapper[4719]: I1009 15:32:34.094646 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-kxkjp" 
event={"ID":"ab93ff28-c8ec-4514-bd82-dbab0fe25cee","Type":"ContainerStarted","Data":"258a6b7732936e979683150ae64a972634b24031b0bd94acd3dab04ecb39833f"} Oct 09 15:32:34 crc kubenswrapper[4719]: I1009 15:32:34.175232 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-6d9967f8dd-z4mpg"] Oct 09 15:32:34 crc kubenswrapper[4719]: W1009 15:32:34.177797 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod582c5c2a_a5b2_43bf_bbdb_4c3fb1b21c09.slice/crio-baef426319d6bc02d44672b1dd3cb762d9376e2b13258765635897176e85ca44 WatchSource:0}: Error finding container baef426319d6bc02d44672b1dd3cb762d9376e2b13258765635897176e85ca44: Status 404 returned error can't find the container with id baef426319d6bc02d44672b1dd3cb762d9376e2b13258765635897176e85ca44 Oct 09 15:32:34 crc kubenswrapper[4719]: I1009 15:32:34.423427 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-57bb74c7bf-tmxgv"] Oct 09 15:32:34 crc kubenswrapper[4719]: I1009 15:32:34.435623 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-797d478b46-qbtsl"] Oct 09 15:32:34 crc kubenswrapper[4719]: I1009 15:32:34.452515 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5777b4f897-zj22f"] Oct 09 15:32:34 crc kubenswrapper[4719]: I1009 15:32:34.458549 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-ddb98f99b-6zjhd"] Oct 09 15:32:34 crc kubenswrapper[4719]: W1009 15:32:34.463465 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14a3f87a_25c5_476e_8379_0b15d3511315.slice/crio-ee535b567b3f83a9088a5e1187316c816f8d26a41929cc8dce0dfcf5f2459ec2 WatchSource:0}: 
Error finding container ee535b567b3f83a9088a5e1187316c816f8d26a41929cc8dce0dfcf5f2459ec2: Status 404 returned error can't find the container with id ee535b567b3f83a9088a5e1187316c816f8d26a41929cc8dce0dfcf5f2459ec2 Oct 09 15:32:34 crc kubenswrapper[4719]: I1009 15:32:34.482700 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7bb46cd7d-pvjzc"] Oct 09 15:32:34 crc kubenswrapper[4719]: W1009 15:32:34.510767 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod651b9dd5_bce9_4ca0_b6f7_cca0c3fb30eb.slice/crio-96474acc63c1a199f012485dbc6f60431c0249ed35019766d5f91d5c5296846e WatchSource:0}: Error finding container 96474acc63c1a199f012485dbc6f60431c0249ed35019766d5f91d5c5296846e: Status 404 returned error can't find the container with id 96474acc63c1a199f012485dbc6f60431c0249ed35019766d5f91d5c5296846e Oct 09 15:32:34 crc kubenswrapper[4719]: I1009 15:32:34.626471 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-ffcdd6c94-q9cm2"] Oct 09 15:32:34 crc kubenswrapper[4719]: I1009 15:32:34.645151 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-664664cb68-2td8x"] Oct 09 15:32:34 crc kubenswrapper[4719]: W1009 15:32:34.647081 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod288a232e_38ff_44b7_9fda_738becefc8d7.slice/crio-7f1ca5b91ea55156d61b832abc5b3d614a479ee2a86e5a3d073c8493c0dde867 WatchSource:0}: Error finding container 7f1ca5b91ea55156d61b832abc5b3d614a479ee2a86e5a3d073c8493c0dde867: Status 404 returned error can't find the container with id 7f1ca5b91ea55156d61b832abc5b3d614a479ee2a86e5a3d073c8493c0dde867 Oct 09 15:32:34 crc kubenswrapper[4719]: I1009 15:32:34.673415 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/ovn-operator-controller-manager-869cc7797f-k8dck"] Oct 09 15:32:34 crc kubenswrapper[4719]: I1009 15:32:34.685002 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-r4m97"] Oct 09 15:32:34 crc kubenswrapper[4719]: I1009 15:32:34.693433 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-578874c84d-w6fj7"] Oct 09 15:32:34 crc kubenswrapper[4719]: I1009 15:32:34.726577 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-lh9st"] Oct 09 15:32:34 crc kubenswrapper[4719]: I1009 15:32:34.726635 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-cc79478c-885gj"] Oct 09 15:32:34 crc kubenswrapper[4719]: E1009 15:32:34.731382 4719 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:abe978f8da75223de5043cca50278ad4e28c8dd309883f502fe1e7a9998733b0,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w49nx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-578874c84d-w6fj7_openstack-operators(c56b1641-8023-4761-a55f-763dfe5f7c4f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 09 15:32:34 crc kubenswrapper[4719]: E1009 15:32:34.731681 4719 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:4b4a17fe08ce00e375afaaec6a28835f5c1784f03d11c4558376ac04130f3a9e,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bjpgv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
swift-operator-controller-manager-5f4d5dfdc6-lh9st_openstack-operators(6d60ce50-53c4-47c1-b222-88b92c43fd4d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 09 15:32:34 crc kubenswrapper[4719]: I1009 15:32:34.734530 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-d284h"] Oct 09 15:32:34 crc kubenswrapper[4719]: E1009 15:32:34.748114 4719 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2jwb6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-r4m97_openstack-operators(6776ccc8-9114-46e5-a2a2-699f8917bfac): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 09 15:32:34 crc kubenswrapper[4719]: E1009 15:32:34.748273 4719 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.66:5001/openstack-k8s-operators/watcher-operator:9c9ab71408802f8b5a792bcba160111ecd2c1247,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pt2wv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-cc79478c-885gj_openstack-operators(44ee4b27-7bdd-4a5e-98ed-1b8b5f01b54f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 09 15:32:34 crc kubenswrapper[4719]: E1009 15:32:34.748429 4719 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:09deecf840d38ff6af3c924729cf0a9444bc985848bfbe7c918019b88a6bc4d7,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ccwvb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
octavia-operator-controller-manager-6d7c7ddf95-d284h_openstack-operators(308fe096-8aff-4a3b-a83a-bb2b1ef8c5df): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 09 15:32:34 crc kubenswrapper[4719]: I1009 15:32:34.748965 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dx5hrk"] Oct 09 15:32:34 crc kubenswrapper[4719]: E1009 15:32:34.750461 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-r4m97" podUID="6776ccc8-9114-46e5-a2a2-699f8917bfac" Oct 09 15:32:34 crc kubenswrapper[4719]: I1009 15:32:34.753886 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-585fc5b659-jzwqf"] Oct 09 15:32:34 crc kubenswrapper[4719]: I1009 15:32:34.758864 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-67f69c4d95-5p5fq"] Oct 09 15:32:34 crc kubenswrapper[4719]: E1009 15:32:34.800309 4719 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:5cfb2ae1092445950b39dd59caa9a8c9367f42fb8353a8c3848d3bc729f24492,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2hlgg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-585fc5b659-jzwqf_openstack-operators(3bc5e8dd-bc95-4b65-afda-a821512a89dd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 09 15:32:35 crc kubenswrapper[4719]: I1009 15:32:35.145643 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dx5hrk" 
event={"ID":"5584dd28-d59b-41bf-b24a-ec18d01029e1","Type":"ContainerStarted","Data":"4124ce35d1745c19961cf3e9559e909afadd62b0fffeb3afecbbc4a9631c9bf0"} Oct 09 15:32:35 crc kubenswrapper[4719]: I1009 15:32:35.156614 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-pvjzc" event={"ID":"d5e1695b-e7fb-4c23-9848-c6abacde588c","Type":"ContainerStarted","Data":"dda9823e8d5c437c2471da934d52918bf53b477e6b251bcd8e72dae7c6832511"} Oct 09 15:32:35 crc kubenswrapper[4719]: E1009 15:32:35.189315 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-cc79478c-885gj" podUID="44ee4b27-7bdd-4a5e-98ed-1b8b5f01b54f" Oct 09 15:32:35 crc kubenswrapper[4719]: E1009 15:32:35.192214 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-d284h" podUID="308fe096-8aff-4a3b-a83a-bb2b1ef8c5df" Oct 09 15:32:35 crc kubenswrapper[4719]: E1009 15:32:35.203890 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-lh9st" podUID="6d60ce50-53c4-47c1-b222-88b92c43fd4d" Oct 09 15:32:35 crc kubenswrapper[4719]: I1009 15:32:35.226412 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-w6fj7" event={"ID":"c56b1641-8023-4761-a55f-763dfe5f7c4f","Type":"ContainerStarted","Data":"e9abe5fb96a350cc00571fef54048bfd9371d76411d1fe04c540735e93105d73"} Oct 09 15:32:35 crc kubenswrapper[4719]: I1009 15:32:35.226754 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/placement-operator-controller-manager-664664cb68-2td8x" event={"ID":"08380711-65b1-4957-80ba-36c2f064e618","Type":"ContainerStarted","Data":"8d5e765711b819e004bb4ec646745af2bcf8241e611e4562c3479049258cd014"} Oct 09 15:32:35 crc kubenswrapper[4719]: I1009 15:32:35.226767 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-d284h" event={"ID":"308fe096-8aff-4a3b-a83a-bb2b1ef8c5df","Type":"ContainerStarted","Data":"6b6fc571c22fc999b7133b4908fd4a92f52d92442b6ba87568f210f85d9113da"} Oct 09 15:32:35 crc kubenswrapper[4719]: I1009 15:32:35.226776 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-tmxgv" event={"ID":"6b82d858-736f-487f-ba35-c1478301b229","Type":"ContainerStarted","Data":"9afdcde1d91aacbb2b11fce508d0fe9cbda17408859ed9e83f9645a25903b083"} Oct 09 15:32:35 crc kubenswrapper[4719]: I1009 15:32:35.226787 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-qbtsl" event={"ID":"14a3f87a-25c5-476e-8379-0b15d3511315","Type":"ContainerStarted","Data":"ee535b567b3f83a9088a5e1187316c816f8d26a41929cc8dce0dfcf5f2459ec2"} Oct 09 15:32:35 crc kubenswrapper[4719]: I1009 15:32:35.226797 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-z4mpg" event={"ID":"582c5c2a-a5b2-43bf-bbdb-4c3fb1b21c09","Type":"ContainerStarted","Data":"baef426319d6bc02d44672b1dd3cb762d9376e2b13258765635897176e85ca44"} Oct 09 15:32:35 crc kubenswrapper[4719]: I1009 15:32:35.226806 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-q9cm2" event={"ID":"288a232e-38ff-44b7-9fda-738becefc8d7","Type":"ContainerStarted","Data":"7f1ca5b91ea55156d61b832abc5b3d614a479ee2a86e5a3d073c8493c0dde867"} Oct 09 15:32:35 crc 
kubenswrapper[4719]: I1009 15:32:35.226815 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-cc79478c-885gj" event={"ID":"44ee4b27-7bdd-4a5e-98ed-1b8b5f01b54f","Type":"ContainerStarted","Data":"e604d5da132af204e4667ccbad6f9b4561f78ec77f2b14ec13295279b9c89254"} Oct 09 15:32:35 crc kubenswrapper[4719]: I1009 15:32:35.226825 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-6zjhd" event={"ID":"651b9dd5-bce9-4ca0-b6f7-cca0c3fb30eb","Type":"ContainerStarted","Data":"96474acc63c1a199f012485dbc6f60431c0249ed35019766d5f91d5c5296846e"} Oct 09 15:32:35 crc kubenswrapper[4719]: I1009 15:32:35.226834 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-zj22f" event={"ID":"bc8d9b2a-7a74-40f1-8a70-e8f0013fad38","Type":"ContainerStarted","Data":"e12dab19a8d262457ad348c20d2db8bb382f315cd7a83713bcdf51a8a4ea2e2f"} Oct 09 15:32:35 crc kubenswrapper[4719]: I1009 15:32:35.226843 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-67f69c4d95-5p5fq" event={"ID":"406c7514-3092-45dc-abde-352acbfa0108","Type":"ContainerStarted","Data":"ff06c027db806637767a0d580c22122cf390d84fef114eea6b94ff6fb5592eef"} Oct 09 15:32:35 crc kubenswrapper[4719]: I1009 15:32:35.226852 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-jzwqf" event={"ID":"3bc5e8dd-bc95-4b65-afda-a821512a89dd","Type":"ContainerStarted","Data":"c4c34cfb36e10e25c257f6021fc4d7874111881a57d2012c59871b2715c8dce0"} Oct 09 15:32:35 crc kubenswrapper[4719]: I1009 15:32:35.226863 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-k8dck" 
event={"ID":"607972ec-63ef-43a7-a1ed-0aab9fffc680","Type":"ContainerStarted","Data":"bbbae01911d88898c1ad61baa705680c03f0be9706ff9eb6e872760d04c9356d"} Oct 09 15:32:35 crc kubenswrapper[4719]: E1009 15:32:35.231949 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.66:5001/openstack-k8s-operators/watcher-operator:9c9ab71408802f8b5a792bcba160111ecd2c1247\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-cc79478c-885gj" podUID="44ee4b27-7bdd-4a5e-98ed-1b8b5f01b54f" Oct 09 15:32:35 crc kubenswrapper[4719]: I1009 15:32:35.234195 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-r4m97" event={"ID":"6776ccc8-9114-46e5-a2a2-699f8917bfac","Type":"ContainerStarted","Data":"3ced096f1e41b2982e06c0fa0d216eddc05848d1c12eaceef9fca1cffdcb1ab9"} Oct 09 15:32:35 crc kubenswrapper[4719]: E1009 15:32:35.239602 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-r4m97" podUID="6776ccc8-9114-46e5-a2a2-699f8917bfac" Oct 09 15:32:35 crc kubenswrapper[4719]: E1009 15:32:35.239976 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-w6fj7" podUID="c56b1641-8023-4761-a55f-763dfe5f7c4f" Oct 09 15:32:35 crc kubenswrapper[4719]: I1009 15:32:35.247577 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-lh9st" 
event={"ID":"6d60ce50-53c4-47c1-b222-88b92c43fd4d","Type":"ContainerStarted","Data":"c2e01dca4e11cb5d5b12e2406b9fa75c05ae518226a72016f4094c4e72666118"} Oct 09 15:32:35 crc kubenswrapper[4719]: E1009 15:32:35.251627 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:4b4a17fe08ce00e375afaaec6a28835f5c1784f03d11c4558376ac04130f3a9e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-lh9st" podUID="6d60ce50-53c4-47c1-b222-88b92c43fd4d" Oct 09 15:32:35 crc kubenswrapper[4719]: E1009 15:32:35.311796 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-jzwqf" podUID="3bc5e8dd-bc95-4b65-afda-a821512a89dd" Oct 09 15:32:36 crc kubenswrapper[4719]: I1009 15:32:36.262161 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-lh9st" event={"ID":"6d60ce50-53c4-47c1-b222-88b92c43fd4d","Type":"ContainerStarted","Data":"8a06761b5dfabc09e71b3852eb1af2ab8a9643445a170427f31c7aba56b57a8a"} Oct 09 15:32:36 crc kubenswrapper[4719]: E1009 15:32:36.263782 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:4b4a17fe08ce00e375afaaec6a28835f5c1784f03d11c4558376ac04130f3a9e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-lh9st" podUID="6d60ce50-53c4-47c1-b222-88b92c43fd4d" Oct 09 15:32:36 crc kubenswrapper[4719]: I1009 15:32:36.266077 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-w6fj7" 
event={"ID":"c56b1641-8023-4761-a55f-763dfe5f7c4f","Type":"ContainerStarted","Data":"7f2b4897726e9b32c99437df8b78231bd163a7308b4d40aaf908d44dadfdb0bc"} Oct 09 15:32:36 crc kubenswrapper[4719]: E1009 15:32:36.269614 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:abe978f8da75223de5043cca50278ad4e28c8dd309883f502fe1e7a9998733b0\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-w6fj7" podUID="c56b1641-8023-4761-a55f-763dfe5f7c4f" Oct 09 15:32:36 crc kubenswrapper[4719]: I1009 15:32:36.282149 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-cc79478c-885gj" event={"ID":"44ee4b27-7bdd-4a5e-98ed-1b8b5f01b54f","Type":"ContainerStarted","Data":"aaf7321d0e7ed6321bed10efa5195150ffbf62159348edde085188f886b99cc3"} Oct 09 15:32:36 crc kubenswrapper[4719]: E1009 15:32:36.286180 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.66:5001/openstack-k8s-operators/watcher-operator:9c9ab71408802f8b5a792bcba160111ecd2c1247\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-cc79478c-885gj" podUID="44ee4b27-7bdd-4a5e-98ed-1b8b5f01b54f" Oct 09 15:32:36 crc kubenswrapper[4719]: I1009 15:32:36.288816 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-67f69c4d95-5p5fq" event={"ID":"406c7514-3092-45dc-abde-352acbfa0108","Type":"ContainerStarted","Data":"da3861fb4158f39c788a37e377a14ad57ca5690c25350de5f3ffb08d1b9dcac9"} Oct 09 15:32:36 crc kubenswrapper[4719]: I1009 15:32:36.288847 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-67f69c4d95-5p5fq" 
event={"ID":"406c7514-3092-45dc-abde-352acbfa0108","Type":"ContainerStarted","Data":"7eaaba389f42d8c3126177a8f91c49ddcff2b18b42456e171fa92e666377f74f"} Oct 09 15:32:36 crc kubenswrapper[4719]: I1009 15:32:36.289492 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-67f69c4d95-5p5fq" Oct 09 15:32:36 crc kubenswrapper[4719]: I1009 15:32:36.293635 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-jzwqf" event={"ID":"3bc5e8dd-bc95-4b65-afda-a821512a89dd","Type":"ContainerStarted","Data":"9aea2148364be3bab6455bda6d5bca1ba80e472386587c5978b4b1a4033aed72"} Oct 09 15:32:36 crc kubenswrapper[4719]: E1009 15:32:36.296966 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:5cfb2ae1092445950b39dd59caa9a8c9367f42fb8353a8c3848d3bc729f24492\\\"\"" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-jzwqf" podUID="3bc5e8dd-bc95-4b65-afda-a821512a89dd" Oct 09 15:32:36 crc kubenswrapper[4719]: I1009 15:32:36.338224 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-d284h" event={"ID":"308fe096-8aff-4a3b-a83a-bb2b1ef8c5df","Type":"ContainerStarted","Data":"f25cc302f676d898da990bb86cc3d75395cba75659d3839469160550430232b4"} Oct 09 15:32:36 crc kubenswrapper[4719]: E1009 15:32:36.338938 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:09deecf840d38ff6af3c924729cf0a9444bc985848bfbe7c918019b88a6bc4d7\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-d284h" 
podUID="308fe096-8aff-4a3b-a83a-bb2b1ef8c5df" Oct 09 15:32:36 crc kubenswrapper[4719]: E1009 15:32:36.361404 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-r4m97" podUID="6776ccc8-9114-46e5-a2a2-699f8917bfac" Oct 09 15:32:36 crc kubenswrapper[4719]: I1009 15:32:36.450175 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-67f69c4d95-5p5fq" podStartSLOduration=4.450148479 podStartE2EDuration="4.450148479s" podCreationTimestamp="2025-10-09 15:32:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:32:36.444834891 +0000 UTC m=+861.954546176" watchObservedRunningTime="2025-10-09 15:32:36.450148479 +0000 UTC m=+861.959859784" Oct 09 15:32:37 crc kubenswrapper[4719]: E1009 15:32:37.363682 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:5cfb2ae1092445950b39dd59caa9a8c9367f42fb8353a8c3848d3bc729f24492\\\"\"" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-jzwqf" podUID="3bc5e8dd-bc95-4b65-afda-a821512a89dd" Oct 09 15:32:37 crc kubenswrapper[4719]: E1009 15:32:37.365760 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:09deecf840d38ff6af3c924729cf0a9444bc985848bfbe7c918019b88a6bc4d7\\\"\"" 
pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-d284h" podUID="308fe096-8aff-4a3b-a83a-bb2b1ef8c5df" Oct 09 15:32:37 crc kubenswrapper[4719]: E1009 15:32:37.365830 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.66:5001/openstack-k8s-operators/watcher-operator:9c9ab71408802f8b5a792bcba160111ecd2c1247\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-cc79478c-885gj" podUID="44ee4b27-7bdd-4a5e-98ed-1b8b5f01b54f" Oct 09 15:32:37 crc kubenswrapper[4719]: E1009 15:32:37.365944 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:4b4a17fe08ce00e375afaaec6a28835f5c1784f03d11c4558376ac04130f3a9e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-lh9st" podUID="6d60ce50-53c4-47c1-b222-88b92c43fd4d" Oct 09 15:32:37 crc kubenswrapper[4719]: E1009 15:32:37.372675 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:abe978f8da75223de5043cca50278ad4e28c8dd309883f502fe1e7a9998733b0\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-w6fj7" podUID="c56b1641-8023-4761-a55f-763dfe5f7c4f" Oct 09 15:32:43 crc kubenswrapper[4719]: I1009 15:32:43.755577 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-67f69c4d95-5p5fq" Oct 09 15:32:47 crc kubenswrapper[4719]: I1009 15:32:47.447843 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-pvjzc" 
event={"ID":"d5e1695b-e7fb-4c23-9848-c6abacde588c","Type":"ContainerStarted","Data":"e38acbca71f8e0885685f44a09c1daa378b8217222121832ca9d65a506e45b38"} Oct 09 15:32:47 crc kubenswrapper[4719]: I1009 15:32:47.457998 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-qbtsl" event={"ID":"14a3f87a-25c5-476e-8379-0b15d3511315","Type":"ContainerStarted","Data":"2e780fed735a31a6ed2baada3f0ca1e7ff35307d0b6872acc1c90097bf4591fd"} Oct 09 15:32:47 crc kubenswrapper[4719]: I1009 15:32:47.479762 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-q9cm2" event={"ID":"288a232e-38ff-44b7-9fda-738becefc8d7","Type":"ContainerStarted","Data":"619bda1955003b1586f9a1552782a408f7fbc5e4a1b35755df7e6ca28cd037a8"} Oct 09 15:32:47 crc kubenswrapper[4719]: I1009 15:32:47.481741 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-zj22f" event={"ID":"bc8d9b2a-7a74-40f1-8a70-e8f0013fad38","Type":"ContainerStarted","Data":"4354aee04d74b1d30cb26343934e70700f07f5653a7fe397ccc600c028473c43"} Oct 09 15:32:47 crc kubenswrapper[4719]: I1009 15:32:47.483085 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-pnb2g" event={"ID":"f4b78ea6-51d8-4a7a-b5d3-cc4bdc3b5ba4","Type":"ContainerStarted","Data":"929cc71a7cc5e70db5d2c53a69d78c6e3dfca6db311959db8100a0600d9cac76"} Oct 09 15:32:47 crc kubenswrapper[4719]: I1009 15:32:47.498655 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-tmxgv" event={"ID":"6b82d858-736f-487f-ba35-c1478301b229","Type":"ContainerStarted","Data":"21f1250431bfc108df8b1f439b79bfc62b668e51c69f6c17ac455d9bb8b79ab5"} Oct 09 15:32:47 crc kubenswrapper[4719]: I1009 15:32:47.503412 4719 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-kxkjp" event={"ID":"ab93ff28-c8ec-4514-bd82-dbab0fe25cee","Type":"ContainerStarted","Data":"8ea88f765db41f10df112a34edd1b5f9d172fae32b82b966cee8f4ac57af8b86"} Oct 09 15:32:47 crc kubenswrapper[4719]: I1009 15:32:47.517226 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-jg6r2" event={"ID":"64ce70f3-641d-4dfd-811e-c786365c9859","Type":"ContainerStarted","Data":"781b1f1611a37177e77afb3a3c41dd85f3d090dad1e4051420964ab9d7127d20"} Oct 09 15:32:47 crc kubenswrapper[4719]: I1009 15:32:47.524994 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-6zjhd" event={"ID":"651b9dd5-bce9-4ca0-b6f7-cca0c3fb30eb","Type":"ContainerStarted","Data":"d7742e0c574bf300e3bbdaf86cd0e555e2bb1eec5d359fc741cac57d00f45cdd"} Oct 09 15:32:47 crc kubenswrapper[4719]: I1009 15:32:47.530181 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-j945k" event={"ID":"8b59c5dc-f309-48cc-9c66-7a5c42050f8e","Type":"ContainerStarted","Data":"b7c2dc6b715b14124ec468427e2dc314ed803b6f840e2231148b368f3500cb74"} Oct 09 15:32:47 crc kubenswrapper[4719]: I1009 15:32:47.532341 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-59578bc799-tfk7f" event={"ID":"2292f494-d606-40b2-bb8b-7dcc6e9dfeb4","Type":"ContainerStarted","Data":"ea0d8eda2dd3eaeaf4ca240ce0be9aade68f4d6bb3baba49b7b24366d14109c1"} Oct 09 15:32:47 crc kubenswrapper[4719]: I1009 15:32:47.533449 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-59578bc799-tfk7f" Oct 09 15:32:47 crc kubenswrapper[4719]: I1009 15:32:47.539174 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dx5hrk" event={"ID":"5584dd28-d59b-41bf-b24a-ec18d01029e1","Type":"ContainerStarted","Data":"ebc45a54760deab1f819b94ffc932a00c1cbba3cb9d0b487de83b06c0d2ea377"} Oct 09 15:32:47 crc kubenswrapper[4719]: I1009 15:32:47.542505 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-k8dck" event={"ID":"607972ec-63ef-43a7-a1ed-0aab9fffc680","Type":"ContainerStarted","Data":"aca76d991929375f23dd72eaf942a623567e3c0fcc517d5832a78ed6baea2a60"} Oct 09 15:32:47 crc kubenswrapper[4719]: I1009 15:32:47.559314 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-wzn6r" event={"ID":"f013ff43-3cb6-47f5-bc35-a4bf02143db0","Type":"ContainerStarted","Data":"4553efe83820e0b649244fe812dd46a532f5817e090e912e757355c789f10734"} Oct 09 15:32:47 crc kubenswrapper[4719]: I1009 15:32:47.559489 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-wzn6r" Oct 09 15:32:47 crc kubenswrapper[4719]: I1009 15:32:47.575604 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-z4mpg" event={"ID":"582c5c2a-a5b2-43bf-bbdb-4c3fb1b21c09","Type":"ContainerStarted","Data":"f4485c673947c3e102df911b137033d04975c3b025665e42b35cc8289784e485"} Oct 09 15:32:47 crc kubenswrapper[4719]: I1009 15:32:47.576064 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-59578bc799-tfk7f" podStartSLOduration=3.324419736 podStartE2EDuration="15.576043905s" podCreationTimestamp="2025-10-09 15:32:32 +0000 UTC" firstStartedPulling="2025-10-09 15:32:34.01557991 +0000 UTC m=+859.525291195" lastFinishedPulling="2025-10-09 15:32:46.267204079 +0000 UTC m=+871.776915364" 
observedRunningTime="2025-10-09 15:32:47.566947447 +0000 UTC m=+873.076658732" watchObservedRunningTime="2025-10-09 15:32:47.576043905 +0000 UTC m=+873.085755190" Oct 09 15:32:47 crc kubenswrapper[4719]: I1009 15:32:47.588638 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-664664cb68-2td8x" event={"ID":"08380711-65b1-4957-80ba-36c2f064e618","Type":"ContainerStarted","Data":"4849a02847c9b6fa96b38fdba310178f5aaaf66d6e361e33adb7a3e1821a2b54"} Oct 09 15:32:47 crc kubenswrapper[4719]: I1009 15:32:47.589256 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-wzn6r" podStartSLOduration=2.827516491 podStartE2EDuration="15.589240715s" podCreationTimestamp="2025-10-09 15:32:32 +0000 UTC" firstStartedPulling="2025-10-09 15:32:33.529437565 +0000 UTC m=+859.039148850" lastFinishedPulling="2025-10-09 15:32:46.291161789 +0000 UTC m=+871.800873074" observedRunningTime="2025-10-09 15:32:47.588280954 +0000 UTC m=+873.097992249" watchObservedRunningTime="2025-10-09 15:32:47.589240715 +0000 UTC m=+873.098952000" Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.166659 4719 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.596203 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-pvjzc" event={"ID":"d5e1695b-e7fb-4c23-9848-c6abacde588c","Type":"ContainerStarted","Data":"8c74b66aa86d5d24ce314c30712bbe88caf557c9db5ea84495a9eee460e408f9"} Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.596585 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-pvjzc" Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.597643 4719 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-wzn6r" event={"ID":"f013ff43-3cb6-47f5-bc35-a4bf02143db0","Type":"ContainerStarted","Data":"b9d213972021279761f0d1d91b034fa69842c09f5fad4b5c7272ccb41ab12c21"} Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.599523 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-59578bc799-tfk7f" event={"ID":"2292f494-d606-40b2-bb8b-7dcc6e9dfeb4","Type":"ContainerStarted","Data":"1c019869a2462bc0c25a9d1d306e5d222c563ddd3423606f18a8bf16fe8e6cc2"} Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.601193 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-6zjhd" event={"ID":"651b9dd5-bce9-4ca0-b6f7-cca0c3fb30eb","Type":"ContainerStarted","Data":"cbd2e516c426775dadc392448e7ba57552f9277fdf625c0146757b20f84865dc"} Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.601324 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-6zjhd" Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.602629 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dx5hrk" event={"ID":"5584dd28-d59b-41bf-b24a-ec18d01029e1","Type":"ContainerStarted","Data":"7144dac30c77836753dbaed9b9e2dbc76e950ca0ca5c58ce433d0136be6c28fd"} Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.602834 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dx5hrk" Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.604489 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-tmxgv" 
event={"ID":"6b82d858-736f-487f-ba35-c1478301b229","Type":"ContainerStarted","Data":"adbf861c1a0fed1b005ed9d92fb8ba9c89e7f9c7415c398308371cc6b4739176"} Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.604586 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-tmxgv" Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.606456 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-qbtsl" event={"ID":"14a3f87a-25c5-476e-8379-0b15d3511315","Type":"ContainerStarted","Data":"ffa087b46b41bb7d03b9e0ced71cbba4afd28960bc4855c0ae045d016039f033"} Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.606562 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-qbtsl" Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.608242 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-kxkjp" event={"ID":"ab93ff28-c8ec-4514-bd82-dbab0fe25cee","Type":"ContainerStarted","Data":"89a4e155c4aac61c3022d4f10806154784a795fae5a5bfc41b09636641b7e44b"} Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.608514 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-kxkjp" Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.609760 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-jg6r2" event={"ID":"64ce70f3-641d-4dfd-811e-c786365c9859","Type":"ContainerStarted","Data":"67afdbaac88d13b3bcf77061bbb94979402b693432acc466e66a74a8a3451d29"} Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.609926 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-jg6r2" Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.611887 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-q9cm2" event={"ID":"288a232e-38ff-44b7-9fda-738becefc8d7","Type":"ContainerStarted","Data":"235b0fa07eb5c63e7f16f404dd1919c803c7aa650b88c4d199424742fce98815"} Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.612023 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-q9cm2" Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.614285 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-664664cb68-2td8x" event={"ID":"08380711-65b1-4957-80ba-36c2f064e618","Type":"ContainerStarted","Data":"55bed6082451126e642350465daefea59458bc8cce590810e63cf0c6a8232295"} Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.614329 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-664664cb68-2td8x" Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.616633 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-pnb2g" event={"ID":"f4b78ea6-51d8-4a7a-b5d3-cc4bdc3b5ba4","Type":"ContainerStarted","Data":"871a8fe719a38a6ffac442bec271ab8e402bfb30cf5ca7e70907b1b5e4b5446f"} Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.616795 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-pnb2g" Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.618713 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-j945k" 
event={"ID":"8b59c5dc-f309-48cc-9c66-7a5c42050f8e","Type":"ContainerStarted","Data":"f0f529d781cb396a5197b67fc76bd61a2e1170e94ab1d100351c205d4743aded"} Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.621001 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-z4mpg" event={"ID":"582c5c2a-a5b2-43bf-bbdb-4c3fb1b21c09","Type":"ContainerStarted","Data":"21b916f16ca223f0c626e8f86dc4f13ace691b702546755747b4aed8a13d7a12"} Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.621248 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-z4mpg" Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.623237 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-pvjzc" podStartSLOduration=4.907024286 podStartE2EDuration="16.623224484s" podCreationTimestamp="2025-10-09 15:32:32 +0000 UTC" firstStartedPulling="2025-10-09 15:32:34.566344457 +0000 UTC m=+860.076055742" lastFinishedPulling="2025-10-09 15:32:46.282544655 +0000 UTC m=+871.792255940" observedRunningTime="2025-10-09 15:32:48.620546279 +0000 UTC m=+874.130257574" watchObservedRunningTime="2025-10-09 15:32:48.623224484 +0000 UTC m=+874.132935769" Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.623487 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-zj22f" event={"ID":"bc8d9b2a-7a74-40f1-8a70-e8f0013fad38","Type":"ContainerStarted","Data":"b9629b73df36965cf2c28a146e170fcac3716020950975756c39214c0cca7099"} Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.623624 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-zj22f" Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.625753 4719 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-k8dck" event={"ID":"607972ec-63ef-43a7-a1ed-0aab9fffc680","Type":"ContainerStarted","Data":"d7e7b748a4dbdc30018b75c3ab48506a970fa19818472e2931a0a79f676d08a3"} Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.625947 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-k8dck" Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.651022 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-j945k" podStartSLOduration=4.298943919 podStartE2EDuration="16.651004336s" podCreationTimestamp="2025-10-09 15:32:32 +0000 UTC" firstStartedPulling="2025-10-09 15:32:33.984548065 +0000 UTC m=+859.494259350" lastFinishedPulling="2025-10-09 15:32:46.336608492 +0000 UTC m=+871.846319767" observedRunningTime="2025-10-09 15:32:48.646326677 +0000 UTC m=+874.156037982" watchObservedRunningTime="2025-10-09 15:32:48.651004336 +0000 UTC m=+874.160715631" Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.696842 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-jg6r2" podStartSLOduration=4.392477748 podStartE2EDuration="16.696821771s" podCreationTimestamp="2025-10-09 15:32:32 +0000 UTC" firstStartedPulling="2025-10-09 15:32:33.977654476 +0000 UTC m=+859.487365751" lastFinishedPulling="2025-10-09 15:32:46.281998489 +0000 UTC m=+871.791709774" observedRunningTime="2025-10-09 15:32:48.692190434 +0000 UTC m=+874.201901739" watchObservedRunningTime="2025-10-09 15:32:48.696821771 +0000 UTC m=+874.206533056" Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.696951 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-q9cm2" podStartSLOduration=5.033641205 podStartE2EDuration="16.696946055s" podCreationTimestamp="2025-10-09 15:32:32 +0000 UTC" firstStartedPulling="2025-10-09 15:32:34.663316246 +0000 UTC m=+860.173027531" lastFinishedPulling="2025-10-09 15:32:46.326621106 +0000 UTC m=+871.836332381" observedRunningTime="2025-10-09 15:32:48.672871511 +0000 UTC m=+874.182582806" watchObservedRunningTime="2025-10-09 15:32:48.696946055 +0000 UTC m=+874.206657350" Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.729626 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dx5hrk" podStartSLOduration=5.18838766 podStartE2EDuration="16.729610692s" podCreationTimestamp="2025-10-09 15:32:32 +0000 UTC" firstStartedPulling="2025-10-09 15:32:34.783581735 +0000 UTC m=+860.293293020" lastFinishedPulling="2025-10-09 15:32:46.324804767 +0000 UTC m=+871.834516052" observedRunningTime="2025-10-09 15:32:48.726159672 +0000 UTC m=+874.235870957" watchObservedRunningTime="2025-10-09 15:32:48.729610692 +0000 UTC m=+874.239321977" Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.752271 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-664664cb68-2td8x" podStartSLOduration=5.195121123 podStartE2EDuration="16.752255451s" podCreationTimestamp="2025-10-09 15:32:32 +0000 UTC" firstStartedPulling="2025-10-09 15:32:34.726666908 +0000 UTC m=+860.236378193" lastFinishedPulling="2025-10-09 15:32:46.283801236 +0000 UTC m=+871.793512521" observedRunningTime="2025-10-09 15:32:48.748306116 +0000 UTC m=+874.258017411" watchObservedRunningTime="2025-10-09 15:32:48.752255451 +0000 UTC m=+874.261966736" Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.772120 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-tmxgv" podStartSLOduration=4.927518735 podStartE2EDuration="16.772103961s" podCreationTimestamp="2025-10-09 15:32:32 +0000 UTC" firstStartedPulling="2025-10-09 15:32:34.437419543 +0000 UTC m=+859.947130838" lastFinishedPulling="2025-10-09 15:32:46.282004779 +0000 UTC m=+871.791716064" observedRunningTime="2025-10-09 15:32:48.771309485 +0000 UTC m=+874.281020780" watchObservedRunningTime="2025-10-09 15:32:48.772103961 +0000 UTC m=+874.281815246" Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.790753 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-kxkjp" podStartSLOduration=4.472485088 podStartE2EDuration="16.790736952s" podCreationTimestamp="2025-10-09 15:32:32 +0000 UTC" firstStartedPulling="2025-10-09 15:32:34.018208483 +0000 UTC m=+859.527919768" lastFinishedPulling="2025-10-09 15:32:46.336460347 +0000 UTC m=+871.846171632" observedRunningTime="2025-10-09 15:32:48.787473759 +0000 UTC m=+874.297185044" watchObservedRunningTime="2025-10-09 15:32:48.790736952 +0000 UTC m=+874.300448237" Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.809275 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-pnb2g" podStartSLOduration=4.151328042 podStartE2EDuration="16.809254131s" podCreationTimestamp="2025-10-09 15:32:32 +0000 UTC" firstStartedPulling="2025-10-09 15:32:33.644677224 +0000 UTC m=+859.154388509" lastFinishedPulling="2025-10-09 15:32:46.302603303 +0000 UTC m=+871.812314598" observedRunningTime="2025-10-09 15:32:48.805665326 +0000 UTC m=+874.315376601" watchObservedRunningTime="2025-10-09 15:32:48.809254131 +0000 UTC m=+874.318965416" Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.829312 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-6zjhd" podStartSLOduration=5.07064999 podStartE2EDuration="16.829295427s" podCreationTimestamp="2025-10-09 15:32:32 +0000 UTC" firstStartedPulling="2025-10-09 15:32:34.543804091 +0000 UTC m=+860.053515376" lastFinishedPulling="2025-10-09 15:32:46.302449528 +0000 UTC m=+871.812160813" observedRunningTime="2025-10-09 15:32:48.827231591 +0000 UTC m=+874.336942886" watchObservedRunningTime="2025-10-09 15:32:48.829295427 +0000 UTC m=+874.339006722" Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.886686 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-qbtsl" podStartSLOduration=5.028801221 podStartE2EDuration="16.886670198s" podCreationTimestamp="2025-10-09 15:32:32 +0000 UTC" firstStartedPulling="2025-10-09 15:32:34.468714147 +0000 UTC m=+859.978425432" lastFinishedPulling="2025-10-09 15:32:46.326583114 +0000 UTC m=+871.836294409" observedRunningTime="2025-10-09 15:32:48.859479275 +0000 UTC m=+874.369190560" watchObservedRunningTime="2025-10-09 15:32:48.886670198 +0000 UTC m=+874.396381483" Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.888900 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-k8dck" podStartSLOduration=5.281889558 podStartE2EDuration="16.888893879s" podCreationTimestamp="2025-10-09 15:32:32 +0000 UTC" firstStartedPulling="2025-10-09 15:32:34.719664566 +0000 UTC m=+860.229375851" lastFinishedPulling="2025-10-09 15:32:46.326668887 +0000 UTC m=+871.836380172" observedRunningTime="2025-10-09 15:32:48.882238558 +0000 UTC m=+874.391949843" watchObservedRunningTime="2025-10-09 15:32:48.888893879 +0000 UTC m=+874.398605164" Oct 09 15:32:48 crc kubenswrapper[4719]: I1009 15:32:48.906661 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-zj22f" podStartSLOduration=5.043671094 podStartE2EDuration="16.906641503s" podCreationTimestamp="2025-10-09 15:32:32 +0000 UTC" firstStartedPulling="2025-10-09 15:32:34.463554914 +0000 UTC m=+859.973266199" lastFinishedPulling="2025-10-09 15:32:46.326525323 +0000 UTC m=+871.836236608" observedRunningTime="2025-10-09 15:32:48.900246789 +0000 UTC m=+874.409958074" watchObservedRunningTime="2025-10-09 15:32:48.906641503 +0000 UTC m=+874.416352788" Oct 09 15:32:49 crc kubenswrapper[4719]: I1009 15:32:49.640222 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-j945k" Oct 09 15:32:50 crc kubenswrapper[4719]: I1009 15:32:50.648318 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-lh9st" event={"ID":"6d60ce50-53c4-47c1-b222-88b92c43fd4d","Type":"ContainerStarted","Data":"a981d1ba5d40cb301ce239cb07c7fc787c9d13030410384154ad8d7b3e7777f4"} Oct 09 15:32:50 crc kubenswrapper[4719]: I1009 15:32:50.648936 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-lh9st" Oct 09 15:32:50 crc kubenswrapper[4719]: I1009 15:32:50.650902 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-jzwqf" event={"ID":"3bc5e8dd-bc95-4b65-afda-a821512a89dd","Type":"ContainerStarted","Data":"02daf47268849a2c46417b5d37ad46c1863ba5f788f97a924577d2f65a0e975d"} Oct 09 15:32:50 crc kubenswrapper[4719]: I1009 15:32:50.666407 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-lh9st" podStartSLOduration=3.019507435 podStartE2EDuration="18.666387066s" podCreationTimestamp="2025-10-09 15:32:32 +0000 UTC" firstStartedPulling="2025-10-09 
15:32:34.731517371 +0000 UTC m=+860.241228656" lastFinishedPulling="2025-10-09 15:32:50.378397002 +0000 UTC m=+875.888108287" observedRunningTime="2025-10-09 15:32:50.664704252 +0000 UTC m=+876.174415557" watchObservedRunningTime="2025-10-09 15:32:50.666387066 +0000 UTC m=+876.176098351" Oct 09 15:32:50 crc kubenswrapper[4719]: I1009 15:32:50.670421 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-z4mpg" podStartSLOduration=6.559256205 podStartE2EDuration="18.670406084s" podCreationTimestamp="2025-10-09 15:32:32 +0000 UTC" firstStartedPulling="2025-10-09 15:32:34.17999314 +0000 UTC m=+859.689704425" lastFinishedPulling="2025-10-09 15:32:46.291143019 +0000 UTC m=+871.800854304" observedRunningTime="2025-10-09 15:32:48.923799638 +0000 UTC m=+874.433510943" watchObservedRunningTime="2025-10-09 15:32:50.670406084 +0000 UTC m=+876.180117369" Oct 09 15:32:50 crc kubenswrapper[4719]: I1009 15:32:50.687194 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-jzwqf" podStartSLOduration=3.10598397 podStartE2EDuration="18.687151115s" podCreationTimestamp="2025-10-09 15:32:32 +0000 UTC" firstStartedPulling="2025-10-09 15:32:34.799249482 +0000 UTC m=+860.308961217" lastFinishedPulling="2025-10-09 15:32:50.380417077 +0000 UTC m=+875.890128362" observedRunningTime="2025-10-09 15:32:50.683274292 +0000 UTC m=+876.192985577" watchObservedRunningTime="2025-10-09 15:32:50.687151115 +0000 UTC m=+876.196862400" Oct 09 15:32:52 crc kubenswrapper[4719]: I1009 15:32:52.399750 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-pnb2g" Oct 09 15:32:52 crc kubenswrapper[4719]: I1009 15:32:52.422128 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-kxkjp" Oct 09 15:32:52 crc kubenswrapper[4719]: I1009 15:32:52.436654 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-wzn6r" Oct 09 15:32:52 crc kubenswrapper[4719]: I1009 15:32:52.519599 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-z4mpg" Oct 09 15:32:52 crc kubenswrapper[4719]: I1009 15:32:52.544889 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-jg6r2" Oct 09 15:32:52 crc kubenswrapper[4719]: I1009 15:32:52.612701 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-qbtsl" Oct 09 15:32:52 crc kubenswrapper[4719]: I1009 15:32:52.646717 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-j945k" Oct 09 15:32:52 crc kubenswrapper[4719]: I1009 15:32:52.686274 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-w6fj7" event={"ID":"c56b1641-8023-4761-a55f-763dfe5f7c4f","Type":"ContainerStarted","Data":"76b0234da0bff8282aeef22f2fcc9238de3d9914bdabdbf554b11049b23ff581"} Oct 09 15:32:52 crc kubenswrapper[4719]: I1009 15:32:52.687175 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-w6fj7" Oct 09 15:32:52 crc kubenswrapper[4719]: I1009 15:32:52.687451 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-6zjhd" Oct 09 15:32:52 crc kubenswrapper[4719]: I1009 
15:32:52.689624 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-d284h" event={"ID":"308fe096-8aff-4a3b-a83a-bb2b1ef8c5df","Type":"ContainerStarted","Data":"c7041980839e93248ed893a249cc551b451c7f29680ef96e1cd29c8b1e8d1de2"} Oct 09 15:32:52 crc kubenswrapper[4719]: I1009 15:32:52.690217 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-d284h" Oct 09 15:32:52 crc kubenswrapper[4719]: I1009 15:32:52.713742 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-59578bc799-tfk7f" Oct 09 15:32:52 crc kubenswrapper[4719]: I1009 15:32:52.762424 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-w6fj7" podStartSLOduration=3.374426594 podStartE2EDuration="20.762403746s" podCreationTimestamp="2025-10-09 15:32:32 +0000 UTC" firstStartedPulling="2025-10-09 15:32:34.730919533 +0000 UTC m=+860.240630818" lastFinishedPulling="2025-10-09 15:32:52.118896685 +0000 UTC m=+877.628607970" observedRunningTime="2025-10-09 15:32:52.758666227 +0000 UTC m=+878.268377512" watchObservedRunningTime="2025-10-09 15:32:52.762403746 +0000 UTC m=+878.272115041" Oct 09 15:32:52 crc kubenswrapper[4719]: I1009 15:32:52.777591 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-d284h" podStartSLOduration=3.404370693 podStartE2EDuration="20.777574607s" podCreationTimestamp="2025-10-09 15:32:32 +0000 UTC" firstStartedPulling="2025-10-09 15:32:34.748341905 +0000 UTC m=+860.258053190" lastFinishedPulling="2025-10-09 15:32:52.121545829 +0000 UTC m=+877.631257104" observedRunningTime="2025-10-09 15:32:52.776716191 +0000 UTC m=+878.286427476" watchObservedRunningTime="2025-10-09 
15:32:52.777574607 +0000 UTC m=+878.287285892" Oct 09 15:32:52 crc kubenswrapper[4719]: I1009 15:32:52.789224 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-pvjzc" Oct 09 15:32:52 crc kubenswrapper[4719]: I1009 15:32:52.864393 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-zj22f" Oct 09 15:32:52 crc kubenswrapper[4719]: I1009 15:32:52.885437 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-tmxgv" Oct 09 15:32:52 crc kubenswrapper[4719]: I1009 15:32:52.974580 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-664664cb68-2td8x" Oct 09 15:32:53 crc kubenswrapper[4719]: I1009 15:32:53.198405 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-jzwqf" Oct 09 15:32:53 crc kubenswrapper[4719]: I1009 15:32:53.242919 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-k8dck" Oct 09 15:32:53 crc kubenswrapper[4719]: I1009 15:32:53.301849 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-q9cm2" Oct 09 15:32:53 crc kubenswrapper[4719]: I1009 15:32:53.575488 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dx5hrk" Oct 09 15:32:54 crc kubenswrapper[4719]: I1009 15:32:54.705495 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-cc79478c-885gj" 
event={"ID":"44ee4b27-7bdd-4a5e-98ed-1b8b5f01b54f","Type":"ContainerStarted","Data":"4350982b16f7dc10bf5779bdf60a68ee4cc838380815f0603b110ef6321e91c0"} Oct 09 15:32:54 crc kubenswrapper[4719]: I1009 15:32:54.706157 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-cc79478c-885gj" Oct 09 15:32:54 crc kubenswrapper[4719]: I1009 15:32:54.708029 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-r4m97" event={"ID":"6776ccc8-9114-46e5-a2a2-699f8917bfac","Type":"ContainerStarted","Data":"9bf1bdd3153026d01211c5bedc7f2b3388b7d19cf3b5995b0fa42dd0f29d1c19"} Oct 09 15:32:54 crc kubenswrapper[4719]: I1009 15:32:54.723983 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-cc79478c-885gj" podStartSLOduration=3.300231509 podStartE2EDuration="22.723964587s" podCreationTimestamp="2025-10-09 15:32:32 +0000 UTC" firstStartedPulling="2025-10-09 15:32:34.748137849 +0000 UTC m=+860.257849134" lastFinishedPulling="2025-10-09 15:32:54.171870917 +0000 UTC m=+879.681582212" observedRunningTime="2025-10-09 15:32:54.723318996 +0000 UTC m=+880.233030281" watchObservedRunningTime="2025-10-09 15:32:54.723964587 +0000 UTC m=+880.233675872" Oct 09 15:32:54 crc kubenswrapper[4719]: I1009 15:32:54.745179 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-r4m97" podStartSLOduration=3.318736057 podStartE2EDuration="22.745162381s" podCreationTimestamp="2025-10-09 15:32:32 +0000 UTC" firstStartedPulling="2025-10-09 15:32:34.747975174 +0000 UTC m=+860.257686459" lastFinishedPulling="2025-10-09 15:32:54.174401498 +0000 UTC m=+879.684112783" observedRunningTime="2025-10-09 15:32:54.741156393 +0000 UTC m=+880.250867688" watchObservedRunningTime="2025-10-09 15:32:54.745162381 +0000 UTC 
m=+880.254873666" Oct 09 15:33:02 crc kubenswrapper[4719]: I1009 15:33:02.904835 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-d284h" Oct 09 15:33:03 crc kubenswrapper[4719]: I1009 15:33:03.200177 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-lh9st" Oct 09 15:33:03 crc kubenswrapper[4719]: I1009 15:33:03.203346 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-jzwqf" Oct 09 15:33:03 crc kubenswrapper[4719]: I1009 15:33:03.282710 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-w6fj7" Oct 09 15:33:03 crc kubenswrapper[4719]: I1009 15:33:03.447551 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-cc79478c-885gj" Oct 09 15:33:21 crc kubenswrapper[4719]: I1009 15:33:21.584300 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59964f465-shbzw"] Oct 09 15:33:21 crc kubenswrapper[4719]: I1009 15:33:21.591119 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59964f465-shbzw" Oct 09 15:33:21 crc kubenswrapper[4719]: I1009 15:33:21.598498 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 09 15:33:21 crc kubenswrapper[4719]: I1009 15:33:21.598873 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 09 15:33:21 crc kubenswrapper[4719]: I1009 15:33:21.599026 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 09 15:33:21 crc kubenswrapper[4719]: I1009 15:33:21.602341 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59964f465-shbzw"] Oct 09 15:33:21 crc kubenswrapper[4719]: I1009 15:33:21.609568 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-2kdbg" Oct 09 15:33:21 crc kubenswrapper[4719]: I1009 15:33:21.648536 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-559d4fdc95-rm968"] Oct 09 15:33:21 crc kubenswrapper[4719]: I1009 15:33:21.650619 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-559d4fdc95-rm968" Oct 09 15:33:21 crc kubenswrapper[4719]: I1009 15:33:21.659468 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 09 15:33:21 crc kubenswrapper[4719]: I1009 15:33:21.660553 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-559d4fdc95-rm968"] Oct 09 15:33:21 crc kubenswrapper[4719]: I1009 15:33:21.705613 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a1f494c-3a8c-4f29-9fae-abf4598e90ab-config\") pod \"dnsmasq-dns-59964f465-shbzw\" (UID: \"5a1f494c-3a8c-4f29-9fae-abf4598e90ab\") " pod="openstack/dnsmasq-dns-59964f465-shbzw" Oct 09 15:33:21 crc kubenswrapper[4719]: I1009 15:33:21.705692 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqq2v\" (UniqueName: \"kubernetes.io/projected/5a1f494c-3a8c-4f29-9fae-abf4598e90ab-kube-api-access-jqq2v\") pod \"dnsmasq-dns-59964f465-shbzw\" (UID: \"5a1f494c-3a8c-4f29-9fae-abf4598e90ab\") " pod="openstack/dnsmasq-dns-59964f465-shbzw" Oct 09 15:33:21 crc kubenswrapper[4719]: I1009 15:33:21.807012 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a1f494c-3a8c-4f29-9fae-abf4598e90ab-config\") pod \"dnsmasq-dns-59964f465-shbzw\" (UID: \"5a1f494c-3a8c-4f29-9fae-abf4598e90ab\") " pod="openstack/dnsmasq-dns-59964f465-shbzw" Oct 09 15:33:21 crc kubenswrapper[4719]: I1009 15:33:21.807101 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9457534e-d349-4851-820f-95d1261c44ac-config\") pod \"dnsmasq-dns-559d4fdc95-rm968\" (UID: \"9457534e-d349-4851-820f-95d1261c44ac\") " pod="openstack/dnsmasq-dns-559d4fdc95-rm968" Oct 09 15:33:21 crc 
kubenswrapper[4719]: I1009 15:33:21.807123 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9457534e-d349-4851-820f-95d1261c44ac-dns-svc\") pod \"dnsmasq-dns-559d4fdc95-rm968\" (UID: \"9457534e-d349-4851-820f-95d1261c44ac\") " pod="openstack/dnsmasq-dns-559d4fdc95-rm968" Oct 09 15:33:21 crc kubenswrapper[4719]: I1009 15:33:21.807167 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqq2v\" (UniqueName: \"kubernetes.io/projected/5a1f494c-3a8c-4f29-9fae-abf4598e90ab-kube-api-access-jqq2v\") pod \"dnsmasq-dns-59964f465-shbzw\" (UID: \"5a1f494c-3a8c-4f29-9fae-abf4598e90ab\") " pod="openstack/dnsmasq-dns-59964f465-shbzw" Oct 09 15:33:21 crc kubenswrapper[4719]: I1009 15:33:21.807404 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44wsf\" (UniqueName: \"kubernetes.io/projected/9457534e-d349-4851-820f-95d1261c44ac-kube-api-access-44wsf\") pod \"dnsmasq-dns-559d4fdc95-rm968\" (UID: \"9457534e-d349-4851-820f-95d1261c44ac\") " pod="openstack/dnsmasq-dns-559d4fdc95-rm968" Oct 09 15:33:21 crc kubenswrapper[4719]: I1009 15:33:21.808080 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a1f494c-3a8c-4f29-9fae-abf4598e90ab-config\") pod \"dnsmasq-dns-59964f465-shbzw\" (UID: \"5a1f494c-3a8c-4f29-9fae-abf4598e90ab\") " pod="openstack/dnsmasq-dns-59964f465-shbzw" Oct 09 15:33:21 crc kubenswrapper[4719]: I1009 15:33:21.827725 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqq2v\" (UniqueName: \"kubernetes.io/projected/5a1f494c-3a8c-4f29-9fae-abf4598e90ab-kube-api-access-jqq2v\") pod \"dnsmasq-dns-59964f465-shbzw\" (UID: \"5a1f494c-3a8c-4f29-9fae-abf4598e90ab\") " pod="openstack/dnsmasq-dns-59964f465-shbzw" Oct 09 15:33:21 crc kubenswrapper[4719]: I1009 
15:33:21.909472 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9457534e-d349-4851-820f-95d1261c44ac-config\") pod \"dnsmasq-dns-559d4fdc95-rm968\" (UID: \"9457534e-d349-4851-820f-95d1261c44ac\") " pod="openstack/dnsmasq-dns-559d4fdc95-rm968" Oct 09 15:33:21 crc kubenswrapper[4719]: I1009 15:33:21.909523 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9457534e-d349-4851-820f-95d1261c44ac-dns-svc\") pod \"dnsmasq-dns-559d4fdc95-rm968\" (UID: \"9457534e-d349-4851-820f-95d1261c44ac\") " pod="openstack/dnsmasq-dns-559d4fdc95-rm968" Oct 09 15:33:21 crc kubenswrapper[4719]: I1009 15:33:21.909579 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44wsf\" (UniqueName: \"kubernetes.io/projected/9457534e-d349-4851-820f-95d1261c44ac-kube-api-access-44wsf\") pod \"dnsmasq-dns-559d4fdc95-rm968\" (UID: \"9457534e-d349-4851-820f-95d1261c44ac\") " pod="openstack/dnsmasq-dns-559d4fdc95-rm968" Oct 09 15:33:21 crc kubenswrapper[4719]: I1009 15:33:21.910589 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9457534e-d349-4851-820f-95d1261c44ac-config\") pod \"dnsmasq-dns-559d4fdc95-rm968\" (UID: \"9457534e-d349-4851-820f-95d1261c44ac\") " pod="openstack/dnsmasq-dns-559d4fdc95-rm968" Oct 09 15:33:21 crc kubenswrapper[4719]: I1009 15:33:21.910625 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9457534e-d349-4851-820f-95d1261c44ac-dns-svc\") pod \"dnsmasq-dns-559d4fdc95-rm968\" (UID: \"9457534e-d349-4851-820f-95d1261c44ac\") " pod="openstack/dnsmasq-dns-559d4fdc95-rm968" Oct 09 15:33:21 crc kubenswrapper[4719]: I1009 15:33:21.914391 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59964f465-shbzw" Oct 09 15:33:21 crc kubenswrapper[4719]: I1009 15:33:21.943237 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44wsf\" (UniqueName: \"kubernetes.io/projected/9457534e-d349-4851-820f-95d1261c44ac-kube-api-access-44wsf\") pod \"dnsmasq-dns-559d4fdc95-rm968\" (UID: \"9457534e-d349-4851-820f-95d1261c44ac\") " pod="openstack/dnsmasq-dns-559d4fdc95-rm968" Oct 09 15:33:21 crc kubenswrapper[4719]: I1009 15:33:21.977816 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-559d4fdc95-rm968" Oct 09 15:33:22 crc kubenswrapper[4719]: I1009 15:33:22.433477 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-559d4fdc95-rm968"] Oct 09 15:33:22 crc kubenswrapper[4719]: I1009 15:33:22.494254 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59964f465-shbzw"] Oct 09 15:33:22 crc kubenswrapper[4719]: W1009 15:33:22.500816 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a1f494c_3a8c_4f29_9fae_abf4598e90ab.slice/crio-270ff36aeb4f323e309a2268783e13def334d415cff7c86b30cf0ccd40593372 WatchSource:0}: Error finding container 270ff36aeb4f323e309a2268783e13def334d415cff7c86b30cf0ccd40593372: Status 404 returned error can't find the container with id 270ff36aeb4f323e309a2268783e13def334d415cff7c86b30cf0ccd40593372 Oct 09 15:33:22 crc kubenswrapper[4719]: I1009 15:33:22.925556 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59964f465-shbzw" event={"ID":"5a1f494c-3a8c-4f29-9fae-abf4598e90ab","Type":"ContainerStarted","Data":"270ff36aeb4f323e309a2268783e13def334d415cff7c86b30cf0ccd40593372"} Oct 09 15:33:22 crc kubenswrapper[4719]: I1009 15:33:22.926723 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-559d4fdc95-rm968" 
event={"ID":"9457534e-d349-4851-820f-95d1261c44ac","Type":"ContainerStarted","Data":"9818280dc9835c3ceae96b933cb387e7ea706fd96e772c7990330f9a574579b9"} Oct 09 15:33:25 crc kubenswrapper[4719]: I1009 15:33:25.670327 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-559d4fdc95-rm968"] Oct 09 15:33:25 crc kubenswrapper[4719]: I1009 15:33:25.689416 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-596d6b547-grdml"] Oct 09 15:33:25 crc kubenswrapper[4719]: I1009 15:33:25.690939 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-596d6b547-grdml" Oct 09 15:33:25 crc kubenswrapper[4719]: I1009 15:33:25.724934 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-596d6b547-grdml"] Oct 09 15:33:25 crc kubenswrapper[4719]: I1009 15:33:25.880018 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f700697-a8c2-4887-94ce-3c2c2b67efc3-config\") pod \"dnsmasq-dns-596d6b547-grdml\" (UID: \"9f700697-a8c2-4887-94ce-3c2c2b67efc3\") " pod="openstack/dnsmasq-dns-596d6b547-grdml" Oct 09 15:33:25 crc kubenswrapper[4719]: I1009 15:33:25.880077 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk8sc\" (UniqueName: \"kubernetes.io/projected/9f700697-a8c2-4887-94ce-3c2c2b67efc3-kube-api-access-vk8sc\") pod \"dnsmasq-dns-596d6b547-grdml\" (UID: \"9f700697-a8c2-4887-94ce-3c2c2b67efc3\") " pod="openstack/dnsmasq-dns-596d6b547-grdml" Oct 09 15:33:25 crc kubenswrapper[4719]: I1009 15:33:25.880114 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f700697-a8c2-4887-94ce-3c2c2b67efc3-dns-svc\") pod \"dnsmasq-dns-596d6b547-grdml\" (UID: \"9f700697-a8c2-4887-94ce-3c2c2b67efc3\") " 
pod="openstack/dnsmasq-dns-596d6b547-grdml" Oct 09 15:33:25 crc kubenswrapper[4719]: I1009 15:33:25.981106 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59964f465-shbzw"] Oct 09 15:33:25 crc kubenswrapper[4719]: I1009 15:33:25.981293 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk8sc\" (UniqueName: \"kubernetes.io/projected/9f700697-a8c2-4887-94ce-3c2c2b67efc3-kube-api-access-vk8sc\") pod \"dnsmasq-dns-596d6b547-grdml\" (UID: \"9f700697-a8c2-4887-94ce-3c2c2b67efc3\") " pod="openstack/dnsmasq-dns-596d6b547-grdml" Oct 09 15:33:25 crc kubenswrapper[4719]: I1009 15:33:25.981387 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f700697-a8c2-4887-94ce-3c2c2b67efc3-dns-svc\") pod \"dnsmasq-dns-596d6b547-grdml\" (UID: \"9f700697-a8c2-4887-94ce-3c2c2b67efc3\") " pod="openstack/dnsmasq-dns-596d6b547-grdml" Oct 09 15:33:25 crc kubenswrapper[4719]: I1009 15:33:25.981475 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f700697-a8c2-4887-94ce-3c2c2b67efc3-config\") pod \"dnsmasq-dns-596d6b547-grdml\" (UID: \"9f700697-a8c2-4887-94ce-3c2c2b67efc3\") " pod="openstack/dnsmasq-dns-596d6b547-grdml" Oct 09 15:33:25 crc kubenswrapper[4719]: I1009 15:33:25.982572 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f700697-a8c2-4887-94ce-3c2c2b67efc3-config\") pod \"dnsmasq-dns-596d6b547-grdml\" (UID: \"9f700697-a8c2-4887-94ce-3c2c2b67efc3\") " pod="openstack/dnsmasq-dns-596d6b547-grdml" Oct 09 15:33:25 crc kubenswrapper[4719]: I1009 15:33:25.982715 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f700697-a8c2-4887-94ce-3c2c2b67efc3-dns-svc\") pod \"dnsmasq-dns-596d6b547-grdml\" (UID: 
\"9f700697-a8c2-4887-94ce-3c2c2b67efc3\") " pod="openstack/dnsmasq-dns-596d6b547-grdml" Oct 09 15:33:26 crc kubenswrapper[4719]: I1009 15:33:26.020283 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d6d58d699-6vpbr"] Oct 09 15:33:26 crc kubenswrapper[4719]: I1009 15:33:26.024334 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk8sc\" (UniqueName: \"kubernetes.io/projected/9f700697-a8c2-4887-94ce-3c2c2b67efc3-kube-api-access-vk8sc\") pod \"dnsmasq-dns-596d6b547-grdml\" (UID: \"9f700697-a8c2-4887-94ce-3c2c2b67efc3\") " pod="openstack/dnsmasq-dns-596d6b547-grdml" Oct 09 15:33:26 crc kubenswrapper[4719]: I1009 15:33:26.035536 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d6d58d699-6vpbr" Oct 09 15:33:26 crc kubenswrapper[4719]: I1009 15:33:26.037337 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d6d58d699-6vpbr"] Oct 09 15:33:26 crc kubenswrapper[4719]: I1009 15:33:26.185615 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b13a1a4-8f28-426c-951c-be1bbc5229bf-dns-svc\") pod \"dnsmasq-dns-5d6d58d699-6vpbr\" (UID: \"0b13a1a4-8f28-426c-951c-be1bbc5229bf\") " pod="openstack/dnsmasq-dns-5d6d58d699-6vpbr" Oct 09 15:33:26 crc kubenswrapper[4719]: I1009 15:33:26.185682 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b13a1a4-8f28-426c-951c-be1bbc5229bf-config\") pod \"dnsmasq-dns-5d6d58d699-6vpbr\" (UID: \"0b13a1a4-8f28-426c-951c-be1bbc5229bf\") " pod="openstack/dnsmasq-dns-5d6d58d699-6vpbr" Oct 09 15:33:26 crc kubenswrapper[4719]: I1009 15:33:26.185717 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48wrp\" (UniqueName: 
\"kubernetes.io/projected/0b13a1a4-8f28-426c-951c-be1bbc5229bf-kube-api-access-48wrp\") pod \"dnsmasq-dns-5d6d58d699-6vpbr\" (UID: \"0b13a1a4-8f28-426c-951c-be1bbc5229bf\") " pod="openstack/dnsmasq-dns-5d6d58d699-6vpbr" Oct 09 15:33:26 crc kubenswrapper[4719]: I1009 15:33:26.287693 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b13a1a4-8f28-426c-951c-be1bbc5229bf-dns-svc\") pod \"dnsmasq-dns-5d6d58d699-6vpbr\" (UID: \"0b13a1a4-8f28-426c-951c-be1bbc5229bf\") " pod="openstack/dnsmasq-dns-5d6d58d699-6vpbr" Oct 09 15:33:26 crc kubenswrapper[4719]: I1009 15:33:26.287802 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b13a1a4-8f28-426c-951c-be1bbc5229bf-config\") pod \"dnsmasq-dns-5d6d58d699-6vpbr\" (UID: \"0b13a1a4-8f28-426c-951c-be1bbc5229bf\") " pod="openstack/dnsmasq-dns-5d6d58d699-6vpbr" Oct 09 15:33:26 crc kubenswrapper[4719]: I1009 15:33:26.287879 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48wrp\" (UniqueName: \"kubernetes.io/projected/0b13a1a4-8f28-426c-951c-be1bbc5229bf-kube-api-access-48wrp\") pod \"dnsmasq-dns-5d6d58d699-6vpbr\" (UID: \"0b13a1a4-8f28-426c-951c-be1bbc5229bf\") " pod="openstack/dnsmasq-dns-5d6d58d699-6vpbr" Oct 09 15:33:26 crc kubenswrapper[4719]: I1009 15:33:26.288716 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b13a1a4-8f28-426c-951c-be1bbc5229bf-dns-svc\") pod \"dnsmasq-dns-5d6d58d699-6vpbr\" (UID: \"0b13a1a4-8f28-426c-951c-be1bbc5229bf\") " pod="openstack/dnsmasq-dns-5d6d58d699-6vpbr" Oct 09 15:33:26 crc kubenswrapper[4719]: I1009 15:33:26.288901 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b13a1a4-8f28-426c-951c-be1bbc5229bf-config\") pod 
\"dnsmasq-dns-5d6d58d699-6vpbr\" (UID: \"0b13a1a4-8f28-426c-951c-be1bbc5229bf\") " pod="openstack/dnsmasq-dns-5d6d58d699-6vpbr" Oct 09 15:33:26 crc kubenswrapper[4719]: I1009 15:33:26.308273 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48wrp\" (UniqueName: \"kubernetes.io/projected/0b13a1a4-8f28-426c-951c-be1bbc5229bf-kube-api-access-48wrp\") pod \"dnsmasq-dns-5d6d58d699-6vpbr\" (UID: \"0b13a1a4-8f28-426c-951c-be1bbc5229bf\") " pod="openstack/dnsmasq-dns-5d6d58d699-6vpbr" Oct 09 15:33:26 crc kubenswrapper[4719]: I1009 15:33:26.315058 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-596d6b547-grdml" Oct 09 15:33:26 crc kubenswrapper[4719]: I1009 15:33:26.369584 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d6d58d699-6vpbr" Oct 09 15:33:26 crc kubenswrapper[4719]: I1009 15:33:26.375970 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-596d6b547-grdml"] Oct 09 15:33:26 crc kubenswrapper[4719]: I1009 15:33:26.404629 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6dd68b64f-t69c6"] Oct 09 15:33:26 crc kubenswrapper[4719]: I1009 15:33:26.406461 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dd68b64f-t69c6" Oct 09 15:33:26 crc kubenswrapper[4719]: I1009 15:33:26.429439 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dd68b64f-t69c6"] Oct 09 15:33:26 crc kubenswrapper[4719]: I1009 15:33:26.592433 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6-dns-svc\") pod \"dnsmasq-dns-6dd68b64f-t69c6\" (UID: \"4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6\") " pod="openstack/dnsmasq-dns-6dd68b64f-t69c6" Oct 09 15:33:26 crc kubenswrapper[4719]: I1009 15:33:26.592521 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6-config\") pod \"dnsmasq-dns-6dd68b64f-t69c6\" (UID: \"4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6\") " pod="openstack/dnsmasq-dns-6dd68b64f-t69c6" Oct 09 15:33:26 crc kubenswrapper[4719]: I1009 15:33:26.592578 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvhcc\" (UniqueName: \"kubernetes.io/projected/4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6-kube-api-access-gvhcc\") pod \"dnsmasq-dns-6dd68b64f-t69c6\" (UID: \"4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6\") " pod="openstack/dnsmasq-dns-6dd68b64f-t69c6" Oct 09 15:33:26 crc kubenswrapper[4719]: I1009 15:33:26.694623 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6-dns-svc\") pod \"dnsmasq-dns-6dd68b64f-t69c6\" (UID: \"4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6\") " pod="openstack/dnsmasq-dns-6dd68b64f-t69c6" Oct 09 15:33:26 crc kubenswrapper[4719]: I1009 15:33:26.694751 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6-config\") pod \"dnsmasq-dns-6dd68b64f-t69c6\" (UID: \"4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6\") " pod="openstack/dnsmasq-dns-6dd68b64f-t69c6" Oct 09 15:33:26 crc kubenswrapper[4719]: I1009 15:33:26.694803 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvhcc\" (UniqueName: \"kubernetes.io/projected/4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6-kube-api-access-gvhcc\") pod \"dnsmasq-dns-6dd68b64f-t69c6\" (UID: \"4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6\") " pod="openstack/dnsmasq-dns-6dd68b64f-t69c6" Oct 09 15:33:26 crc kubenswrapper[4719]: I1009 15:33:26.695700 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6-dns-svc\") pod \"dnsmasq-dns-6dd68b64f-t69c6\" (UID: \"4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6\") " pod="openstack/dnsmasq-dns-6dd68b64f-t69c6" Oct 09 15:33:26 crc kubenswrapper[4719]: I1009 15:33:26.696293 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6-config\") pod \"dnsmasq-dns-6dd68b64f-t69c6\" (UID: \"4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6\") " pod="openstack/dnsmasq-dns-6dd68b64f-t69c6" Oct 09 15:33:26 crc kubenswrapper[4719]: I1009 15:33:26.714676 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvhcc\" (UniqueName: \"kubernetes.io/projected/4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6-kube-api-access-gvhcc\") pod \"dnsmasq-dns-6dd68b64f-t69c6\" (UID: \"4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6\") " pod="openstack/dnsmasq-dns-6dd68b64f-t69c6" Oct 09 15:33:26 crc kubenswrapper[4719]: I1009 15:33:26.737569 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dd68b64f-t69c6" Oct 09 15:33:26 crc kubenswrapper[4719]: I1009 15:33:26.859931 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 15:33:26 crc kubenswrapper[4719]: I1009 15:33:26.862918 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:33:26 crc kubenswrapper[4719]: I1009 15:33:26.868109 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 09 15:33:26 crc kubenswrapper[4719]: I1009 15:33:26.870456 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 09 15:33:26 crc kubenswrapper[4719]: I1009 15:33:26.871367 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 09 15:33:26 crc kubenswrapper[4719]: I1009 15:33:26.871846 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 09 15:33:26 crc kubenswrapper[4719]: I1009 15:33:26.871961 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 09 15:33:26 crc kubenswrapper[4719]: I1009 15:33:26.872169 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 09 15:33:26 crc kubenswrapper[4719]: I1009 15:33:26.872303 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-4c4vp" Oct 09 15:33:26 crc kubenswrapper[4719]: I1009 15:33:26.879398 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.001762 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/c8f5a6f9-5554-485d-9aee-47449402e37b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.002365 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c8f5a6f9-5554-485d-9aee-47449402e37b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.002402 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c8f5a6f9-5554-485d-9aee-47449402e37b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.002477 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c8f5a6f9-5554-485d-9aee-47449402e37b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.002518 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c8f5a6f9-5554-485d-9aee-47449402e37b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.002714 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/c8f5a6f9-5554-485d-9aee-47449402e37b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.002784 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.002822 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-565ml\" (UniqueName: \"kubernetes.io/projected/c8f5a6f9-5554-485d-9aee-47449402e37b-kube-api-access-565ml\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.002870 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c8f5a6f9-5554-485d-9aee-47449402e37b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.002896 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c8f5a6f9-5554-485d-9aee-47449402e37b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.002925 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/c8f5a6f9-5554-485d-9aee-47449402e37b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.103892 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c8f5a6f9-5554-485d-9aee-47449402e37b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.103954 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.103997 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-565ml\" (UniqueName: \"kubernetes.io/projected/c8f5a6f9-5554-485d-9aee-47449402e37b-kube-api-access-565ml\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.104026 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c8f5a6f9-5554-485d-9aee-47449402e37b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.104051 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c8f5a6f9-5554-485d-9aee-47449402e37b-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.104073 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c8f5a6f9-5554-485d-9aee-47449402e37b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.104118 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c8f5a6f9-5554-485d-9aee-47449402e37b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.104140 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c8f5a6f9-5554-485d-9aee-47449402e37b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.104163 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c8f5a6f9-5554-485d-9aee-47449402e37b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.104183 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c8f5a6f9-5554-485d-9aee-47449402e37b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " pod="openstack/rabbitmq-cell1-server-0" 
Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.104207 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c8f5a6f9-5554-485d-9aee-47449402e37b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.104940 4719 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.105298 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c8f5a6f9-5554-485d-9aee-47449402e37b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.105608 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c8f5a6f9-5554-485d-9aee-47449402e37b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.105802 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c8f5a6f9-5554-485d-9aee-47449402e37b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.105821 4719 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c8f5a6f9-5554-485d-9aee-47449402e37b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.106566 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c8f5a6f9-5554-485d-9aee-47449402e37b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.109803 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c8f5a6f9-5554-485d-9aee-47449402e37b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.110670 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c8f5a6f9-5554-485d-9aee-47449402e37b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.120222 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c8f5a6f9-5554-485d-9aee-47449402e37b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.120232 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/c8f5a6f9-5554-485d-9aee-47449402e37b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.125878 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-565ml\" (UniqueName: \"kubernetes.io/projected/c8f5a6f9-5554-485d-9aee-47449402e37b-kube-api-access-565ml\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.157820 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.192166 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.232635 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.233973 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.236173 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.236589 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-rhngj" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.236741 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.236926 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.237083 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.237199 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.237331 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.239831 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.308847 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d3a820d9-3c13-47ec-a39e-dea4d60b7536-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " pod="openstack/rabbitmq-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.308897 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " pod="openstack/rabbitmq-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.308922 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d3a820d9-3c13-47ec-a39e-dea4d60b7536-config-data\") pod \"rabbitmq-server-0\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " pod="openstack/rabbitmq-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.308972 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d3a820d9-3c13-47ec-a39e-dea4d60b7536-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " pod="openstack/rabbitmq-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.309057 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlpkb\" (UniqueName: \"kubernetes.io/projected/d3a820d9-3c13-47ec-a39e-dea4d60b7536-kube-api-access-rlpkb\") pod \"rabbitmq-server-0\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " pod="openstack/rabbitmq-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.309118 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d3a820d9-3c13-47ec-a39e-dea4d60b7536-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " pod="openstack/rabbitmq-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.309182 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/d3a820d9-3c13-47ec-a39e-dea4d60b7536-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " pod="openstack/rabbitmq-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.309275 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d3a820d9-3c13-47ec-a39e-dea4d60b7536-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " pod="openstack/rabbitmq-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.309297 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d3a820d9-3c13-47ec-a39e-dea4d60b7536-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " pod="openstack/rabbitmq-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.309312 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d3a820d9-3c13-47ec-a39e-dea4d60b7536-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " pod="openstack/rabbitmq-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.309390 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d3a820d9-3c13-47ec-a39e-dea4d60b7536-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " pod="openstack/rabbitmq-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.410934 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d3a820d9-3c13-47ec-a39e-dea4d60b7536-rabbitmq-confd\") pod 
\"rabbitmq-server-0\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " pod="openstack/rabbitmq-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.411001 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " pod="openstack/rabbitmq-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.411030 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d3a820d9-3c13-47ec-a39e-dea4d60b7536-config-data\") pod \"rabbitmq-server-0\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " pod="openstack/rabbitmq-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.411051 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d3a820d9-3c13-47ec-a39e-dea4d60b7536-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " pod="openstack/rabbitmq-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.411082 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlpkb\" (UniqueName: \"kubernetes.io/projected/d3a820d9-3c13-47ec-a39e-dea4d60b7536-kube-api-access-rlpkb\") pod \"rabbitmq-server-0\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " pod="openstack/rabbitmq-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.411119 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d3a820d9-3c13-47ec-a39e-dea4d60b7536-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " pod="openstack/rabbitmq-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 
15:33:27.411147 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d3a820d9-3c13-47ec-a39e-dea4d60b7536-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " pod="openstack/rabbitmq-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.411175 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d3a820d9-3c13-47ec-a39e-dea4d60b7536-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " pod="openstack/rabbitmq-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.411197 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d3a820d9-3c13-47ec-a39e-dea4d60b7536-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " pod="openstack/rabbitmq-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.411223 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d3a820d9-3c13-47ec-a39e-dea4d60b7536-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " pod="openstack/rabbitmq-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.411246 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d3a820d9-3c13-47ec-a39e-dea4d60b7536-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " pod="openstack/rabbitmq-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.411887 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/d3a820d9-3c13-47ec-a39e-dea4d60b7536-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " pod="openstack/rabbitmq-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.412005 4719 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.413030 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d3a820d9-3c13-47ec-a39e-dea4d60b7536-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " pod="openstack/rabbitmq-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.413728 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d3a820d9-3c13-47ec-a39e-dea4d60b7536-config-data\") pod \"rabbitmq-server-0\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " pod="openstack/rabbitmq-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.414621 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d3a820d9-3c13-47ec-a39e-dea4d60b7536-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " pod="openstack/rabbitmq-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.415871 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d3a820d9-3c13-47ec-a39e-dea4d60b7536-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " 
pod="openstack/rabbitmq-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.434119 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d3a820d9-3c13-47ec-a39e-dea4d60b7536-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " pod="openstack/rabbitmq-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.434344 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d3a820d9-3c13-47ec-a39e-dea4d60b7536-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " pod="openstack/rabbitmq-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.434486 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d3a820d9-3c13-47ec-a39e-dea4d60b7536-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " pod="openstack/rabbitmq-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.435140 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d3a820d9-3c13-47ec-a39e-dea4d60b7536-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " pod="openstack/rabbitmq-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.435549 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " pod="openstack/rabbitmq-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.436305 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlpkb\" (UniqueName: 
\"kubernetes.io/projected/d3a820d9-3c13-47ec-a39e-dea4d60b7536-kube-api-access-rlpkb\") pod \"rabbitmq-server-0\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " pod="openstack/rabbitmq-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.539941 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.541303 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-notifications-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.544134 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-default-user" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.545958 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-erlang-cookie" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.546115 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-plugins-conf" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.546224 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-server-dockercfg-fk4c5" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.546331 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-notifications-svc" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.546484 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-config-data" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.546616 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-server-conf" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.555610 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Oct 09 
15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.563525 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.614474 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1df540c9-8b54-44a5-9c5d-03cf736ee67a-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"1df540c9-8b54-44a5-9c5d-03cf736ee67a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.614547 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1df540c9-8b54-44a5-9c5d-03cf736ee67a-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"1df540c9-8b54-44a5-9c5d-03cf736ee67a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.614578 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtw86\" (UniqueName: \"kubernetes.io/projected/1df540c9-8b54-44a5-9c5d-03cf736ee67a-kube-api-access-mtw86\") pod \"rabbitmq-notifications-server-0\" (UID: \"1df540c9-8b54-44a5-9c5d-03cf736ee67a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.614620 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1df540c9-8b54-44a5-9c5d-03cf736ee67a-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"1df540c9-8b54-44a5-9c5d-03cf736ee67a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.614670 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/1df540c9-8b54-44a5-9c5d-03cf736ee67a-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"1df540c9-8b54-44a5-9c5d-03cf736ee67a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.614730 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1df540c9-8b54-44a5-9c5d-03cf736ee67a-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"1df540c9-8b54-44a5-9c5d-03cf736ee67a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.614760 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"1df540c9-8b54-44a5-9c5d-03cf736ee67a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.614821 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1df540c9-8b54-44a5-9c5d-03cf736ee67a-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"1df540c9-8b54-44a5-9c5d-03cf736ee67a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.614882 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1df540c9-8b54-44a5-9c5d-03cf736ee67a-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"1df540c9-8b54-44a5-9c5d-03cf736ee67a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.614980 4719 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1df540c9-8b54-44a5-9c5d-03cf736ee67a-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"1df540c9-8b54-44a5-9c5d-03cf736ee67a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.615042 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1df540c9-8b54-44a5-9c5d-03cf736ee67a-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"1df540c9-8b54-44a5-9c5d-03cf736ee67a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.716835 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1df540c9-8b54-44a5-9c5d-03cf736ee67a-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"1df540c9-8b54-44a5-9c5d-03cf736ee67a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.716911 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1df540c9-8b54-44a5-9c5d-03cf736ee67a-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"1df540c9-8b54-44a5-9c5d-03cf736ee67a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.716981 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1df540c9-8b54-44a5-9c5d-03cf736ee67a-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"1df540c9-8b54-44a5-9c5d-03cf736ee67a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.717018 4719 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1df540c9-8b54-44a5-9c5d-03cf736ee67a-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"1df540c9-8b54-44a5-9c5d-03cf736ee67a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.717038 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtw86\" (UniqueName: \"kubernetes.io/projected/1df540c9-8b54-44a5-9c5d-03cf736ee67a-kube-api-access-mtw86\") pod \"rabbitmq-notifications-server-0\" (UID: \"1df540c9-8b54-44a5-9c5d-03cf736ee67a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.717057 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1df540c9-8b54-44a5-9c5d-03cf736ee67a-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"1df540c9-8b54-44a5-9c5d-03cf736ee67a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.717139 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1df540c9-8b54-44a5-9c5d-03cf736ee67a-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"1df540c9-8b54-44a5-9c5d-03cf736ee67a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.717206 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1df540c9-8b54-44a5-9c5d-03cf736ee67a-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"1df540c9-8b54-44a5-9c5d-03cf736ee67a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.717246 4719 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"1df540c9-8b54-44a5-9c5d-03cf736ee67a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.717273 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1df540c9-8b54-44a5-9c5d-03cf736ee67a-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"1df540c9-8b54-44a5-9c5d-03cf736ee67a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.717295 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1df540c9-8b54-44a5-9c5d-03cf736ee67a-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"1df540c9-8b54-44a5-9c5d-03cf736ee67a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.717849 4719 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"1df540c9-8b54-44a5-9c5d-03cf736ee67a\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-notifications-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.718728 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1df540c9-8b54-44a5-9c5d-03cf736ee67a-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"1df540c9-8b54-44a5-9c5d-03cf736ee67a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.718787 4719 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1df540c9-8b54-44a5-9c5d-03cf736ee67a-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"1df540c9-8b54-44a5-9c5d-03cf736ee67a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.720219 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1df540c9-8b54-44a5-9c5d-03cf736ee67a-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"1df540c9-8b54-44a5-9c5d-03cf736ee67a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.720466 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1df540c9-8b54-44a5-9c5d-03cf736ee67a-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"1df540c9-8b54-44a5-9c5d-03cf736ee67a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.720975 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1df540c9-8b54-44a5-9c5d-03cf736ee67a-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"1df540c9-8b54-44a5-9c5d-03cf736ee67a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.721610 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1df540c9-8b54-44a5-9c5d-03cf736ee67a-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"1df540c9-8b54-44a5-9c5d-03cf736ee67a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.721638 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1df540c9-8b54-44a5-9c5d-03cf736ee67a-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"1df540c9-8b54-44a5-9c5d-03cf736ee67a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.722226 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1df540c9-8b54-44a5-9c5d-03cf736ee67a-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"1df540c9-8b54-44a5-9c5d-03cf736ee67a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.724876 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1df540c9-8b54-44a5-9c5d-03cf736ee67a-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"1df540c9-8b54-44a5-9c5d-03cf736ee67a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.734469 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtw86\" (UniqueName: \"kubernetes.io/projected/1df540c9-8b54-44a5-9c5d-03cf736ee67a-kube-api-access-mtw86\") pod \"rabbitmq-notifications-server-0\" (UID: \"1df540c9-8b54-44a5-9c5d-03cf736ee67a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.741874 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"1df540c9-8b54-44a5-9c5d-03cf736ee67a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 09 15:33:27 crc kubenswrapper[4719]: I1009 15:33:27.868570 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-notifications-server-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.393075 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.395150 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.397979 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.398142 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.398231 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.398338 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.401412 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.403796 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-c4r8v" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.431794 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.471428 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.472791 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/05ff8a95-a910-4095-930b-e42c575bf4b8-kolla-config\") pod 
\"openstack-galera-0\" (UID: \"05ff8a95-a910-4095-930b-e42c575bf4b8\") " pod="openstack/openstack-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.472850 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"05ff8a95-a910-4095-930b-e42c575bf4b8\") " pod="openstack/openstack-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.473284 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/05ff8a95-a910-4095-930b-e42c575bf4b8-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"05ff8a95-a910-4095-930b-e42c575bf4b8\") " pod="openstack/openstack-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.473360 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05ff8a95-a910-4095-930b-e42c575bf4b8-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"05ff8a95-a910-4095-930b-e42c575bf4b8\") " pod="openstack/openstack-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.473396 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05ff8a95-a910-4095-930b-e42c575bf4b8-operator-scripts\") pod \"openstack-galera-0\" (UID: \"05ff8a95-a910-4095-930b-e42c575bf4b8\") " pod="openstack/openstack-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.473788 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.478995 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.479926 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.480127 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-64lzc" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.480286 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.480538 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/05ff8a95-a910-4095-930b-e42c575bf4b8-config-data-default\") pod \"openstack-galera-0\" (UID: \"05ff8a95-a910-4095-930b-e42c575bf4b8\") " pod="openstack/openstack-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.480574 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/05ff8a95-a910-4095-930b-e42c575bf4b8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"05ff8a95-a910-4095-930b-e42c575bf4b8\") " pod="openstack/openstack-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.480627 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66bqz\" (UniqueName: \"kubernetes.io/projected/05ff8a95-a910-4095-930b-e42c575bf4b8-kube-api-access-66bqz\") pod \"openstack-galera-0\" (UID: \"05ff8a95-a910-4095-930b-e42c575bf4b8\") " pod="openstack/openstack-galera-0" Oct 09 15:33:30 crc 
kubenswrapper[4719]: I1009 15:33:30.480941 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/05ff8a95-a910-4095-930b-e42c575bf4b8-secrets\") pod \"openstack-galera-0\" (UID: \"05ff8a95-a910-4095-930b-e42c575bf4b8\") " pod="openstack/openstack-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.481160 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.582511 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/05ff8a95-a910-4095-930b-e42c575bf4b8-secrets\") pod \"openstack-galera-0\" (UID: \"05ff8a95-a910-4095-930b-e42c575bf4b8\") " pod="openstack/openstack-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.582568 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/05ff8a95-a910-4095-930b-e42c575bf4b8-kolla-config\") pod \"openstack-galera-0\" (UID: \"05ff8a95-a910-4095-930b-e42c575bf4b8\") " pod="openstack/openstack-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.582596 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"05ff8a95-a910-4095-930b-e42c575bf4b8\") " pod="openstack/openstack-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.582621 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6970b67-4ebd-401d-838b-8be92b8ba72f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"d6970b67-4ebd-401d-838b-8be92b8ba72f\") " pod="openstack/openstack-cell1-galera-0" Oct 09 
15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.582647 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05ff8a95-a910-4095-930b-e42c575bf4b8-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"05ff8a95-a910-4095-930b-e42c575bf4b8\") " pod="openstack/openstack-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.582663 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/05ff8a95-a910-4095-930b-e42c575bf4b8-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"05ff8a95-a910-4095-930b-e42c575bf4b8\") " pod="openstack/openstack-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.582683 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05ff8a95-a910-4095-930b-e42c575bf4b8-operator-scripts\") pod \"openstack-galera-0\" (UID: \"05ff8a95-a910-4095-930b-e42c575bf4b8\") " pod="openstack/openstack-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.582703 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfkhh\" (UniqueName: \"kubernetes.io/projected/d6970b67-4ebd-401d-838b-8be92b8ba72f-kube-api-access-qfkhh\") pod \"openstack-cell1-galera-0\" (UID: \"d6970b67-4ebd-401d-838b-8be92b8ba72f\") " pod="openstack/openstack-cell1-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.582725 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/05ff8a95-a910-4095-930b-e42c575bf4b8-config-data-default\") pod \"openstack-galera-0\" (UID: \"05ff8a95-a910-4095-930b-e42c575bf4b8\") " pod="openstack/openstack-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.582742 4719 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/05ff8a95-a910-4095-930b-e42c575bf4b8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"05ff8a95-a910-4095-930b-e42c575bf4b8\") " pod="openstack/openstack-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.582764 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66bqz\" (UniqueName: \"kubernetes.io/projected/05ff8a95-a910-4095-930b-e42c575bf4b8-kube-api-access-66bqz\") pod \"openstack-galera-0\" (UID: \"05ff8a95-a910-4095-930b-e42c575bf4b8\") " pod="openstack/openstack-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.582785 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d6970b67-4ebd-401d-838b-8be92b8ba72f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"d6970b67-4ebd-401d-838b-8be92b8ba72f\") " pod="openstack/openstack-cell1-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.582812 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/d6970b67-4ebd-401d-838b-8be92b8ba72f-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"d6970b67-4ebd-401d-838b-8be92b8ba72f\") " pod="openstack/openstack-cell1-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.582827 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6970b67-4ebd-401d-838b-8be92b8ba72f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"d6970b67-4ebd-401d-838b-8be92b8ba72f\") " pod="openstack/openstack-cell1-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.582845 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d6970b67-4ebd-401d-838b-8be92b8ba72f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"d6970b67-4ebd-401d-838b-8be92b8ba72f\") " pod="openstack/openstack-cell1-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.582865 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d6970b67-4ebd-401d-838b-8be92b8ba72f\") " pod="openstack/openstack-cell1-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.582886 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d6970b67-4ebd-401d-838b-8be92b8ba72f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"d6970b67-4ebd-401d-838b-8be92b8ba72f\") " pod="openstack/openstack-cell1-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.582903 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6970b67-4ebd-401d-838b-8be92b8ba72f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d6970b67-4ebd-401d-838b-8be92b8ba72f\") " pod="openstack/openstack-cell1-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.583849 4719 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"05ff8a95-a910-4095-930b-e42c575bf4b8\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.584341 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-default\" (UniqueName: \"kubernetes.io/configmap/05ff8a95-a910-4095-930b-e42c575bf4b8-config-data-default\") pod \"openstack-galera-0\" (UID: \"05ff8a95-a910-4095-930b-e42c575bf4b8\") " pod="openstack/openstack-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.584763 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05ff8a95-a910-4095-930b-e42c575bf4b8-operator-scripts\") pod \"openstack-galera-0\" (UID: \"05ff8a95-a910-4095-930b-e42c575bf4b8\") " pod="openstack/openstack-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.585128 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/05ff8a95-a910-4095-930b-e42c575bf4b8-kolla-config\") pod \"openstack-galera-0\" (UID: \"05ff8a95-a910-4095-930b-e42c575bf4b8\") " pod="openstack/openstack-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.585367 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/05ff8a95-a910-4095-930b-e42c575bf4b8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"05ff8a95-a910-4095-930b-e42c575bf4b8\") " pod="openstack/openstack-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.595975 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/05ff8a95-a910-4095-930b-e42c575bf4b8-secrets\") pod \"openstack-galera-0\" (UID: \"05ff8a95-a910-4095-930b-e42c575bf4b8\") " pod="openstack/openstack-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.597288 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/05ff8a95-a910-4095-930b-e42c575bf4b8-galera-tls-certs\") pod \"openstack-galera-0\" (UID: 
\"05ff8a95-a910-4095-930b-e42c575bf4b8\") " pod="openstack/openstack-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.608922 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05ff8a95-a910-4095-930b-e42c575bf4b8-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"05ff8a95-a910-4095-930b-e42c575bf4b8\") " pod="openstack/openstack-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.610322 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66bqz\" (UniqueName: \"kubernetes.io/projected/05ff8a95-a910-4095-930b-e42c575bf4b8-kube-api-access-66bqz\") pod \"openstack-galera-0\" (UID: \"05ff8a95-a910-4095-930b-e42c575bf4b8\") " pod="openstack/openstack-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.618161 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"05ff8a95-a910-4095-930b-e42c575bf4b8\") " pod="openstack/openstack-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.683958 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6970b67-4ebd-401d-838b-8be92b8ba72f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"d6970b67-4ebd-401d-838b-8be92b8ba72f\") " pod="openstack/openstack-cell1-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.684256 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfkhh\" (UniqueName: \"kubernetes.io/projected/d6970b67-4ebd-401d-838b-8be92b8ba72f-kube-api-access-qfkhh\") pod \"openstack-cell1-galera-0\" (UID: \"d6970b67-4ebd-401d-838b-8be92b8ba72f\") " pod="openstack/openstack-cell1-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 
15:33:30.684302 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d6970b67-4ebd-401d-838b-8be92b8ba72f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"d6970b67-4ebd-401d-838b-8be92b8ba72f\") " pod="openstack/openstack-cell1-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.684333 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/d6970b67-4ebd-401d-838b-8be92b8ba72f-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"d6970b67-4ebd-401d-838b-8be92b8ba72f\") " pod="openstack/openstack-cell1-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.684362 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6970b67-4ebd-401d-838b-8be92b8ba72f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"d6970b67-4ebd-401d-838b-8be92b8ba72f\") " pod="openstack/openstack-cell1-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.684381 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d6970b67-4ebd-401d-838b-8be92b8ba72f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"d6970b67-4ebd-401d-838b-8be92b8ba72f\") " pod="openstack/openstack-cell1-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.684407 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d6970b67-4ebd-401d-838b-8be92b8ba72f\") " pod="openstack/openstack-cell1-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.684429 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" 
(UniqueName: \"kubernetes.io/configmap/d6970b67-4ebd-401d-838b-8be92b8ba72f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"d6970b67-4ebd-401d-838b-8be92b8ba72f\") " pod="openstack/openstack-cell1-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.684470 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6970b67-4ebd-401d-838b-8be92b8ba72f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d6970b67-4ebd-401d-838b-8be92b8ba72f\") " pod="openstack/openstack-cell1-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.684811 4719 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d6970b67-4ebd-401d-838b-8be92b8ba72f\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-cell1-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.685456 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d6970b67-4ebd-401d-838b-8be92b8ba72f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"d6970b67-4ebd-401d-838b-8be92b8ba72f\") " pod="openstack/openstack-cell1-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.686144 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d6970b67-4ebd-401d-838b-8be92b8ba72f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"d6970b67-4ebd-401d-838b-8be92b8ba72f\") " pod="openstack/openstack-cell1-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.686241 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/d6970b67-4ebd-401d-838b-8be92b8ba72f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"d6970b67-4ebd-401d-838b-8be92b8ba72f\") " pod="openstack/openstack-cell1-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.686826 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6970b67-4ebd-401d-838b-8be92b8ba72f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"d6970b67-4ebd-401d-838b-8be92b8ba72f\") " pod="openstack/openstack-cell1-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.689038 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/d6970b67-4ebd-401d-838b-8be92b8ba72f-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"d6970b67-4ebd-401d-838b-8be92b8ba72f\") " pod="openstack/openstack-cell1-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.689055 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6970b67-4ebd-401d-838b-8be92b8ba72f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"d6970b67-4ebd-401d-838b-8be92b8ba72f\") " pod="openstack/openstack-cell1-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.689496 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6970b67-4ebd-401d-838b-8be92b8ba72f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d6970b67-4ebd-401d-838b-8be92b8ba72f\") " pod="openstack/openstack-cell1-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.710932 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfkhh\" (UniqueName: \"kubernetes.io/projected/d6970b67-4ebd-401d-838b-8be92b8ba72f-kube-api-access-qfkhh\") pod \"openstack-cell1-galera-0\" (UID: 
\"d6970b67-4ebd-401d-838b-8be92b8ba72f\") " pod="openstack/openstack-cell1-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.715042 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d6970b67-4ebd-401d-838b-8be92b8ba72f\") " pod="openstack/openstack-cell1-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.716560 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.783680 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.791464 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.796085 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.796152 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-c4gbb" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.796305 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.796637 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.802163 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.893161 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5kkj\" (UniqueName: \"kubernetes.io/projected/c7495027-5c56-46e2-9947-1ad2d6bcaf28-kube-api-access-r5kkj\") pod \"memcached-0\" (UID: \"c7495027-5c56-46e2-9947-1ad2d6bcaf28\") " pod="openstack/memcached-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.893343 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7495027-5c56-46e2-9947-1ad2d6bcaf28-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c7495027-5c56-46e2-9947-1ad2d6bcaf28\") " pod="openstack/memcached-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.893405 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7495027-5c56-46e2-9947-1ad2d6bcaf28-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c7495027-5c56-46e2-9947-1ad2d6bcaf28\") " pod="openstack/memcached-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.893465 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c7495027-5c56-46e2-9947-1ad2d6bcaf28-config-data\") pod \"memcached-0\" (UID: \"c7495027-5c56-46e2-9947-1ad2d6bcaf28\") " pod="openstack/memcached-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.893512 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/c7495027-5c56-46e2-9947-1ad2d6bcaf28-kolla-config\") pod \"memcached-0\" (UID: \"c7495027-5c56-46e2-9947-1ad2d6bcaf28\") " pod="openstack/memcached-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.995526 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c7495027-5c56-46e2-9947-1ad2d6bcaf28-config-data\") pod \"memcached-0\" (UID: \"c7495027-5c56-46e2-9947-1ad2d6bcaf28\") " pod="openstack/memcached-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.995595 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c7495027-5c56-46e2-9947-1ad2d6bcaf28-kolla-config\") pod \"memcached-0\" (UID: \"c7495027-5c56-46e2-9947-1ad2d6bcaf28\") " pod="openstack/memcached-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.995633 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5kkj\" (UniqueName: \"kubernetes.io/projected/c7495027-5c56-46e2-9947-1ad2d6bcaf28-kube-api-access-r5kkj\") pod \"memcached-0\" (UID: \"c7495027-5c56-46e2-9947-1ad2d6bcaf28\") " pod="openstack/memcached-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.995710 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7495027-5c56-46e2-9947-1ad2d6bcaf28-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c7495027-5c56-46e2-9947-1ad2d6bcaf28\") " pod="openstack/memcached-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.995735 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7495027-5c56-46e2-9947-1ad2d6bcaf28-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c7495027-5c56-46e2-9947-1ad2d6bcaf28\") " pod="openstack/memcached-0" Oct 09 15:33:30 crc 
kubenswrapper[4719]: I1009 15:33:30.996722 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c7495027-5c56-46e2-9947-1ad2d6bcaf28-config-data\") pod \"memcached-0\" (UID: \"c7495027-5c56-46e2-9947-1ad2d6bcaf28\") " pod="openstack/memcached-0" Oct 09 15:33:30 crc kubenswrapper[4719]: I1009 15:33:30.996791 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c7495027-5c56-46e2-9947-1ad2d6bcaf28-kolla-config\") pod \"memcached-0\" (UID: \"c7495027-5c56-46e2-9947-1ad2d6bcaf28\") " pod="openstack/memcached-0" Oct 09 15:33:31 crc kubenswrapper[4719]: I1009 15:33:31.007299 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7495027-5c56-46e2-9947-1ad2d6bcaf28-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c7495027-5c56-46e2-9947-1ad2d6bcaf28\") " pod="openstack/memcached-0" Oct 09 15:33:31 crc kubenswrapper[4719]: I1009 15:33:31.008146 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7495027-5c56-46e2-9947-1ad2d6bcaf28-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c7495027-5c56-46e2-9947-1ad2d6bcaf28\") " pod="openstack/memcached-0" Oct 09 15:33:31 crc kubenswrapper[4719]: I1009 15:33:31.036752 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5kkj\" (UniqueName: \"kubernetes.io/projected/c7495027-5c56-46e2-9947-1ad2d6bcaf28-kube-api-access-r5kkj\") pod \"memcached-0\" (UID: \"c7495027-5c56-46e2-9947-1ad2d6bcaf28\") " pod="openstack/memcached-0" Oct 09 15:33:31 crc kubenswrapper[4719]: I1009 15:33:31.112389 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 09 15:33:32 crc kubenswrapper[4719]: I1009 15:33:32.472193 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 15:33:32 crc kubenswrapper[4719]: I1009 15:33:32.473405 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 09 15:33:32 crc kubenswrapper[4719]: I1009 15:33:32.476774 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-kjpzv" Oct 09 15:33:32 crc kubenswrapper[4719]: I1009 15:33:32.488813 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 15:33:32 crc kubenswrapper[4719]: I1009 15:33:32.533628 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7hlp\" (UniqueName: \"kubernetes.io/projected/6b96d4d6-d2f4-4e82-9a9e-4c95e6f5389a-kube-api-access-x7hlp\") pod \"kube-state-metrics-0\" (UID: \"6b96d4d6-d2f4-4e82-9a9e-4c95e6f5389a\") " pod="openstack/kube-state-metrics-0" Oct 09 15:33:32 crc kubenswrapper[4719]: I1009 15:33:32.635701 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7hlp\" (UniqueName: \"kubernetes.io/projected/6b96d4d6-d2f4-4e82-9a9e-4c95e6f5389a-kube-api-access-x7hlp\") pod \"kube-state-metrics-0\" (UID: \"6b96d4d6-d2f4-4e82-9a9e-4c95e6f5389a\") " pod="openstack/kube-state-metrics-0" Oct 09 15:33:32 crc kubenswrapper[4719]: I1009 15:33:32.655267 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7hlp\" (UniqueName: \"kubernetes.io/projected/6b96d4d6-d2f4-4e82-9a9e-4c95e6f5389a-kube-api-access-x7hlp\") pod \"kube-state-metrics-0\" (UID: \"6b96d4d6-d2f4-4e82-9a9e-4c95e6f5389a\") " pod="openstack/kube-state-metrics-0" Oct 09 15:33:32 crc kubenswrapper[4719]: I1009 15:33:32.794663 4719 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 09 15:33:33 crc kubenswrapper[4719]: I1009 15:33:33.811222 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 09 15:33:33 crc kubenswrapper[4719]: I1009 15:33:33.836606 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 09 15:33:33 crc kubenswrapper[4719]: I1009 15:33:33.836895 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 09 15:33:33 crc kubenswrapper[4719]: I1009 15:33:33.840457 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 09 15:33:33 crc kubenswrapper[4719]: I1009 15:33:33.840780 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 09 15:33:33 crc kubenswrapper[4719]: I1009 15:33:33.841129 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 09 15:33:33 crc kubenswrapper[4719]: I1009 15:33:33.841488 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 09 15:33:33 crc kubenswrapper[4719]: I1009 15:33:33.841716 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-4npj7" Oct 09 15:33:33 crc kubenswrapper[4719]: I1009 15:33:33.851881 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 09 15:33:33 crc kubenswrapper[4719]: I1009 15:33:33.962240 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/95b09495-a75f-42db-ae2c-99ac6e46f039-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"95b09495-a75f-42db-ae2c-99ac6e46f039\") " pod="openstack/prometheus-metric-storage-0" Oct 09 15:33:33 crc kubenswrapper[4719]: I1009 15:33:33.962655 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tj8j6\" (UniqueName: \"kubernetes.io/projected/95b09495-a75f-42db-ae2c-99ac6e46f039-kube-api-access-tj8j6\") pod \"prometheus-metric-storage-0\" (UID: \"95b09495-a75f-42db-ae2c-99ac6e46f039\") " pod="openstack/prometheus-metric-storage-0" Oct 09 15:33:33 crc kubenswrapper[4719]: I1009 15:33:33.962768 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/95b09495-a75f-42db-ae2c-99ac6e46f039-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"95b09495-a75f-42db-ae2c-99ac6e46f039\") " pod="openstack/prometheus-metric-storage-0" Oct 09 15:33:33 crc kubenswrapper[4719]: I1009 15:33:33.962799 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/95b09495-a75f-42db-ae2c-99ac6e46f039-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"95b09495-a75f-42db-ae2c-99ac6e46f039\") " pod="openstack/prometheus-metric-storage-0" Oct 09 15:33:33 crc kubenswrapper[4719]: I1009 15:33:33.962830 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/95b09495-a75f-42db-ae2c-99ac6e46f039-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"95b09495-a75f-42db-ae2c-99ac6e46f039\") " pod="openstack/prometheus-metric-storage-0" Oct 09 15:33:33 crc kubenswrapper[4719]: I1009 15:33:33.962854 4719 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1\") pod \"prometheus-metric-storage-0\" (UID: \"95b09495-a75f-42db-ae2c-99ac6e46f039\") " pod="openstack/prometheus-metric-storage-0" Oct 09 15:33:33 crc kubenswrapper[4719]: I1009 15:33:33.962943 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/95b09495-a75f-42db-ae2c-99ac6e46f039-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"95b09495-a75f-42db-ae2c-99ac6e46f039\") " pod="openstack/prometheus-metric-storage-0" Oct 09 15:33:33 crc kubenswrapper[4719]: I1009 15:33:33.962971 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/95b09495-a75f-42db-ae2c-99ac6e46f039-config\") pod \"prometheus-metric-storage-0\" (UID: \"95b09495-a75f-42db-ae2c-99ac6e46f039\") " pod="openstack/prometheus-metric-storage-0" Oct 09 15:33:34 crc kubenswrapper[4719]: I1009 15:33:34.064801 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/95b09495-a75f-42db-ae2c-99ac6e46f039-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"95b09495-a75f-42db-ae2c-99ac6e46f039\") " pod="openstack/prometheus-metric-storage-0" Oct 09 15:33:34 crc kubenswrapper[4719]: I1009 15:33:34.064865 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1\") pod \"prometheus-metric-storage-0\" (UID: \"95b09495-a75f-42db-ae2c-99ac6e46f039\") " pod="openstack/prometheus-metric-storage-0" Oct 09 15:33:34 crc kubenswrapper[4719]: I1009 
15:33:34.064890 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/95b09495-a75f-42db-ae2c-99ac6e46f039-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"95b09495-a75f-42db-ae2c-99ac6e46f039\") " pod="openstack/prometheus-metric-storage-0" Oct 09 15:33:34 crc kubenswrapper[4719]: I1009 15:33:34.064912 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/95b09495-a75f-42db-ae2c-99ac6e46f039-config\") pod \"prometheus-metric-storage-0\" (UID: \"95b09495-a75f-42db-ae2c-99ac6e46f039\") " pod="openstack/prometheus-metric-storage-0" Oct 09 15:33:34 crc kubenswrapper[4719]: I1009 15:33:34.064979 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/95b09495-a75f-42db-ae2c-99ac6e46f039-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"95b09495-a75f-42db-ae2c-99ac6e46f039\") " pod="openstack/prometheus-metric-storage-0" Oct 09 15:33:34 crc kubenswrapper[4719]: I1009 15:33:34.065003 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tj8j6\" (UniqueName: \"kubernetes.io/projected/95b09495-a75f-42db-ae2c-99ac6e46f039-kube-api-access-tj8j6\") pod \"prometheus-metric-storage-0\" (UID: \"95b09495-a75f-42db-ae2c-99ac6e46f039\") " pod="openstack/prometheus-metric-storage-0" Oct 09 15:33:34 crc kubenswrapper[4719]: I1009 15:33:34.065054 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/95b09495-a75f-42db-ae2c-99ac6e46f039-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"95b09495-a75f-42db-ae2c-99ac6e46f039\") " pod="openstack/prometheus-metric-storage-0" Oct 09 15:33:34 crc kubenswrapper[4719]: I1009 15:33:34.065076 4719 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/95b09495-a75f-42db-ae2c-99ac6e46f039-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"95b09495-a75f-42db-ae2c-99ac6e46f039\") " pod="openstack/prometheus-metric-storage-0" Oct 09 15:33:34 crc kubenswrapper[4719]: I1009 15:33:34.067442 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/95b09495-a75f-42db-ae2c-99ac6e46f039-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"95b09495-a75f-42db-ae2c-99ac6e46f039\") " pod="openstack/prometheus-metric-storage-0" Oct 09 15:33:34 crc kubenswrapper[4719]: I1009 15:33:34.070902 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/95b09495-a75f-42db-ae2c-99ac6e46f039-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"95b09495-a75f-42db-ae2c-99ac6e46f039\") " pod="openstack/prometheus-metric-storage-0" Oct 09 15:33:34 crc kubenswrapper[4719]: I1009 15:33:34.072694 4719 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 09 15:33:34 crc kubenswrapper[4719]: I1009 15:33:34.072737 4719 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1\") pod \"prometheus-metric-storage-0\" (UID: \"95b09495-a75f-42db-ae2c-99ac6e46f039\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/37a51f19e15282ab5032b2bf09c91363092e9b48becd8acf5f5419f3d47a69ff/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 09 15:33:34 crc kubenswrapper[4719]: I1009 15:33:34.074700 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/95b09495-a75f-42db-ae2c-99ac6e46f039-config\") pod \"prometheus-metric-storage-0\" (UID: \"95b09495-a75f-42db-ae2c-99ac6e46f039\") " pod="openstack/prometheus-metric-storage-0" Oct 09 15:33:34 crc kubenswrapper[4719]: I1009 15:33:34.076023 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/95b09495-a75f-42db-ae2c-99ac6e46f039-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"95b09495-a75f-42db-ae2c-99ac6e46f039\") " pod="openstack/prometheus-metric-storage-0" Oct 09 15:33:34 crc kubenswrapper[4719]: I1009 15:33:34.086411 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/95b09495-a75f-42db-ae2c-99ac6e46f039-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"95b09495-a75f-42db-ae2c-99ac6e46f039\") " pod="openstack/prometheus-metric-storage-0" Oct 09 15:33:34 crc kubenswrapper[4719]: I1009 15:33:34.087638 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/95b09495-a75f-42db-ae2c-99ac6e46f039-thanos-prometheus-http-client-file\") pod 
\"prometheus-metric-storage-0\" (UID: \"95b09495-a75f-42db-ae2c-99ac6e46f039\") " pod="openstack/prometheus-metric-storage-0" Oct 09 15:33:34 crc kubenswrapper[4719]: I1009 15:33:34.094787 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tj8j6\" (UniqueName: \"kubernetes.io/projected/95b09495-a75f-42db-ae2c-99ac6e46f039-kube-api-access-tj8j6\") pod \"prometheus-metric-storage-0\" (UID: \"95b09495-a75f-42db-ae2c-99ac6e46f039\") " pod="openstack/prometheus-metric-storage-0" Oct 09 15:33:34 crc kubenswrapper[4719]: I1009 15:33:34.130010 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1\") pod \"prometheus-metric-storage-0\" (UID: \"95b09495-a75f-42db-ae2c-99ac6e46f039\") " pod="openstack/prometheus-metric-storage-0" Oct 09 15:33:34 crc kubenswrapper[4719]: I1009 15:33:34.155990 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.713135 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-p4t6l"] Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.724475 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-p4t6l" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.730242 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-4tvk8" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.730586 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.730779 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.739474 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-p4t6l"] Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.774615 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-hqfvq"] Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.777225 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-hqfvq" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.782202 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-hqfvq"] Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.851291 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcw4k\" (UniqueName: \"kubernetes.io/projected/07b112ef-0e6a-4927-93e4-d5fc023e495f-kube-api-access-tcw4k\") pod \"ovn-controller-ovs-hqfvq\" (UID: \"07b112ef-0e6a-4927-93e4-d5fc023e495f\") " pod="openstack/ovn-controller-ovs-hqfvq" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.851389 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f0151a18-0608-47b9-b58a-7eef9dfaf31b-var-log-ovn\") pod \"ovn-controller-p4t6l\" (UID: \"f0151a18-0608-47b9-b58a-7eef9dfaf31b\") " pod="openstack/ovn-controller-p4t6l" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.851524 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0151a18-0608-47b9-b58a-7eef9dfaf31b-ovn-controller-tls-certs\") pod \"ovn-controller-p4t6l\" (UID: \"f0151a18-0608-47b9-b58a-7eef9dfaf31b\") " pod="openstack/ovn-controller-p4t6l" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.851557 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f0151a18-0608-47b9-b58a-7eef9dfaf31b-var-run\") pod \"ovn-controller-p4t6l\" (UID: \"f0151a18-0608-47b9-b58a-7eef9dfaf31b\") " pod="openstack/ovn-controller-p4t6l" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.851662 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/f0151a18-0608-47b9-b58a-7eef9dfaf31b-scripts\") pod \"ovn-controller-p4t6l\" (UID: \"f0151a18-0608-47b9-b58a-7eef9dfaf31b\") " pod="openstack/ovn-controller-p4t6l" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.851848 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0151a18-0608-47b9-b58a-7eef9dfaf31b-var-run-ovn\") pod \"ovn-controller-p4t6l\" (UID: \"f0151a18-0608-47b9-b58a-7eef9dfaf31b\") " pod="openstack/ovn-controller-p4t6l" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.851936 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07b112ef-0e6a-4927-93e4-d5fc023e495f-scripts\") pod \"ovn-controller-ovs-hqfvq\" (UID: \"07b112ef-0e6a-4927-93e4-d5fc023e495f\") " pod="openstack/ovn-controller-ovs-hqfvq" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.851976 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/07b112ef-0e6a-4927-93e4-d5fc023e495f-var-log\") pod \"ovn-controller-ovs-hqfvq\" (UID: \"07b112ef-0e6a-4927-93e4-d5fc023e495f\") " pod="openstack/ovn-controller-ovs-hqfvq" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.852055 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/07b112ef-0e6a-4927-93e4-d5fc023e495f-var-run\") pod \"ovn-controller-ovs-hqfvq\" (UID: \"07b112ef-0e6a-4927-93e4-d5fc023e495f\") " pod="openstack/ovn-controller-ovs-hqfvq" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.852095 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: 
\"kubernetes.io/host-path/07b112ef-0e6a-4927-93e4-d5fc023e495f-etc-ovs\") pod \"ovn-controller-ovs-hqfvq\" (UID: \"07b112ef-0e6a-4927-93e4-d5fc023e495f\") " pod="openstack/ovn-controller-ovs-hqfvq" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.852124 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0151a18-0608-47b9-b58a-7eef9dfaf31b-combined-ca-bundle\") pod \"ovn-controller-p4t6l\" (UID: \"f0151a18-0608-47b9-b58a-7eef9dfaf31b\") " pod="openstack/ovn-controller-p4t6l" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.852174 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/07b112ef-0e6a-4927-93e4-d5fc023e495f-var-lib\") pod \"ovn-controller-ovs-hqfvq\" (UID: \"07b112ef-0e6a-4927-93e4-d5fc023e495f\") " pod="openstack/ovn-controller-ovs-hqfvq" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.852242 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbxxg\" (UniqueName: \"kubernetes.io/projected/f0151a18-0608-47b9-b58a-7eef9dfaf31b-kube-api-access-nbxxg\") pod \"ovn-controller-p4t6l\" (UID: \"f0151a18-0608-47b9-b58a-7eef9dfaf31b\") " pod="openstack/ovn-controller-p4t6l" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.954028 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbxxg\" (UniqueName: \"kubernetes.io/projected/f0151a18-0608-47b9-b58a-7eef9dfaf31b-kube-api-access-nbxxg\") pod \"ovn-controller-p4t6l\" (UID: \"f0151a18-0608-47b9-b58a-7eef9dfaf31b\") " pod="openstack/ovn-controller-p4t6l" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.954099 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcw4k\" (UniqueName: 
\"kubernetes.io/projected/07b112ef-0e6a-4927-93e4-d5fc023e495f-kube-api-access-tcw4k\") pod \"ovn-controller-ovs-hqfvq\" (UID: \"07b112ef-0e6a-4927-93e4-d5fc023e495f\") " pod="openstack/ovn-controller-ovs-hqfvq" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.954142 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f0151a18-0608-47b9-b58a-7eef9dfaf31b-var-log-ovn\") pod \"ovn-controller-p4t6l\" (UID: \"f0151a18-0608-47b9-b58a-7eef9dfaf31b\") " pod="openstack/ovn-controller-p4t6l" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.954182 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0151a18-0608-47b9-b58a-7eef9dfaf31b-ovn-controller-tls-certs\") pod \"ovn-controller-p4t6l\" (UID: \"f0151a18-0608-47b9-b58a-7eef9dfaf31b\") " pod="openstack/ovn-controller-p4t6l" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.954206 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f0151a18-0608-47b9-b58a-7eef9dfaf31b-var-run\") pod \"ovn-controller-p4t6l\" (UID: \"f0151a18-0608-47b9-b58a-7eef9dfaf31b\") " pod="openstack/ovn-controller-p4t6l" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.954231 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f0151a18-0608-47b9-b58a-7eef9dfaf31b-scripts\") pod \"ovn-controller-p4t6l\" (UID: \"f0151a18-0608-47b9-b58a-7eef9dfaf31b\") " pod="openstack/ovn-controller-p4t6l" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.954273 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0151a18-0608-47b9-b58a-7eef9dfaf31b-var-run-ovn\") pod \"ovn-controller-p4t6l\" (UID: 
\"f0151a18-0608-47b9-b58a-7eef9dfaf31b\") " pod="openstack/ovn-controller-p4t6l" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.954297 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07b112ef-0e6a-4927-93e4-d5fc023e495f-scripts\") pod \"ovn-controller-ovs-hqfvq\" (UID: \"07b112ef-0e6a-4927-93e4-d5fc023e495f\") " pod="openstack/ovn-controller-ovs-hqfvq" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.954316 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/07b112ef-0e6a-4927-93e4-d5fc023e495f-var-log\") pod \"ovn-controller-ovs-hqfvq\" (UID: \"07b112ef-0e6a-4927-93e4-d5fc023e495f\") " pod="openstack/ovn-controller-ovs-hqfvq" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.954344 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/07b112ef-0e6a-4927-93e4-d5fc023e495f-var-run\") pod \"ovn-controller-ovs-hqfvq\" (UID: \"07b112ef-0e6a-4927-93e4-d5fc023e495f\") " pod="openstack/ovn-controller-ovs-hqfvq" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.954380 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/07b112ef-0e6a-4927-93e4-d5fc023e495f-etc-ovs\") pod \"ovn-controller-ovs-hqfvq\" (UID: \"07b112ef-0e6a-4927-93e4-d5fc023e495f\") " pod="openstack/ovn-controller-ovs-hqfvq" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.954394 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0151a18-0608-47b9-b58a-7eef9dfaf31b-combined-ca-bundle\") pod \"ovn-controller-p4t6l\" (UID: \"f0151a18-0608-47b9-b58a-7eef9dfaf31b\") " pod="openstack/ovn-controller-p4t6l" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.954413 4719 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/07b112ef-0e6a-4927-93e4-d5fc023e495f-var-lib\") pod \"ovn-controller-ovs-hqfvq\" (UID: \"07b112ef-0e6a-4927-93e4-d5fc023e495f\") " pod="openstack/ovn-controller-ovs-hqfvq" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.954759 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f0151a18-0608-47b9-b58a-7eef9dfaf31b-var-run\") pod \"ovn-controller-p4t6l\" (UID: \"f0151a18-0608-47b9-b58a-7eef9dfaf31b\") " pod="openstack/ovn-controller-p4t6l" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.954800 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/07b112ef-0e6a-4927-93e4-d5fc023e495f-var-log\") pod \"ovn-controller-ovs-hqfvq\" (UID: \"07b112ef-0e6a-4927-93e4-d5fc023e495f\") " pod="openstack/ovn-controller-ovs-hqfvq" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.954898 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f0151a18-0608-47b9-b58a-7eef9dfaf31b-var-log-ovn\") pod \"ovn-controller-p4t6l\" (UID: \"f0151a18-0608-47b9-b58a-7eef9dfaf31b\") " pod="openstack/ovn-controller-p4t6l" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.954957 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/07b112ef-0e6a-4927-93e4-d5fc023e495f-etc-ovs\") pod \"ovn-controller-ovs-hqfvq\" (UID: \"07b112ef-0e6a-4927-93e4-d5fc023e495f\") " pod="openstack/ovn-controller-ovs-hqfvq" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.954917 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/07b112ef-0e6a-4927-93e4-d5fc023e495f-var-lib\") pod \"ovn-controller-ovs-hqfvq\" 
(UID: \"07b112ef-0e6a-4927-93e4-d5fc023e495f\") " pod="openstack/ovn-controller-ovs-hqfvq" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.954916 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0151a18-0608-47b9-b58a-7eef9dfaf31b-var-run-ovn\") pod \"ovn-controller-p4t6l\" (UID: \"f0151a18-0608-47b9-b58a-7eef9dfaf31b\") " pod="openstack/ovn-controller-p4t6l" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.954979 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/07b112ef-0e6a-4927-93e4-d5fc023e495f-var-run\") pod \"ovn-controller-ovs-hqfvq\" (UID: \"07b112ef-0e6a-4927-93e4-d5fc023e495f\") " pod="openstack/ovn-controller-ovs-hqfvq" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.956632 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f0151a18-0608-47b9-b58a-7eef9dfaf31b-scripts\") pod \"ovn-controller-p4t6l\" (UID: \"f0151a18-0608-47b9-b58a-7eef9dfaf31b\") " pod="openstack/ovn-controller-p4t6l" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.956995 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07b112ef-0e6a-4927-93e4-d5fc023e495f-scripts\") pod \"ovn-controller-ovs-hqfvq\" (UID: \"07b112ef-0e6a-4927-93e4-d5fc023e495f\") " pod="openstack/ovn-controller-ovs-hqfvq" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.961987 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0151a18-0608-47b9-b58a-7eef9dfaf31b-ovn-controller-tls-certs\") pod \"ovn-controller-p4t6l\" (UID: \"f0151a18-0608-47b9-b58a-7eef9dfaf31b\") " pod="openstack/ovn-controller-p4t6l" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.967682 4719 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0151a18-0608-47b9-b58a-7eef9dfaf31b-combined-ca-bundle\") pod \"ovn-controller-p4t6l\" (UID: \"f0151a18-0608-47b9-b58a-7eef9dfaf31b\") " pod="openstack/ovn-controller-p4t6l" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.974859 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbxxg\" (UniqueName: \"kubernetes.io/projected/f0151a18-0608-47b9-b58a-7eef9dfaf31b-kube-api-access-nbxxg\") pod \"ovn-controller-p4t6l\" (UID: \"f0151a18-0608-47b9-b58a-7eef9dfaf31b\") " pod="openstack/ovn-controller-p4t6l" Oct 09 15:33:36 crc kubenswrapper[4719]: I1009 15:33:36.975126 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcw4k\" (UniqueName: \"kubernetes.io/projected/07b112ef-0e6a-4927-93e4-d5fc023e495f-kube-api-access-tcw4k\") pod \"ovn-controller-ovs-hqfvq\" (UID: \"07b112ef-0e6a-4927-93e4-d5fc023e495f\") " pod="openstack/ovn-controller-ovs-hqfvq" Oct 09 15:33:37 crc kubenswrapper[4719]: I1009 15:33:37.066120 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-p4t6l" Oct 09 15:33:37 crc kubenswrapper[4719]: I1009 15:33:37.091940 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-hqfvq" Oct 09 15:33:38 crc kubenswrapper[4719]: I1009 15:33:38.810819 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 09 15:33:38 crc kubenswrapper[4719]: I1009 15:33:38.812630 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 09 15:33:38 crc kubenswrapper[4719]: I1009 15:33:38.815935 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-wn8tq" Oct 09 15:33:38 crc kubenswrapper[4719]: I1009 15:33:38.818884 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 09 15:33:38 crc kubenswrapper[4719]: I1009 15:33:38.818940 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 09 15:33:38 crc kubenswrapper[4719]: I1009 15:33:38.818884 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 09 15:33:38 crc kubenswrapper[4719]: I1009 15:33:38.819076 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 09 15:33:38 crc kubenswrapper[4719]: I1009 15:33:38.827427 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 09 15:33:38 crc kubenswrapper[4719]: I1009 15:33:38.888680 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29d7fec9-be2c-4fa8-9191-5ffaf287f825-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"29d7fec9-be2c-4fa8-9191-5ffaf287f825\") " pod="openstack/ovsdbserver-nb-0" Oct 09 15:33:38 crc kubenswrapper[4719]: I1009 15:33:38.888745 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/29d7fec9-be2c-4fa8-9191-5ffaf287f825-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"29d7fec9-be2c-4fa8-9191-5ffaf287f825\") " pod="openstack/ovsdbserver-nb-0" Oct 09 15:33:38 crc kubenswrapper[4719]: I1009 15:33:38.888911 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"29d7fec9-be2c-4fa8-9191-5ffaf287f825\") " pod="openstack/ovsdbserver-nb-0" Oct 09 15:33:38 crc kubenswrapper[4719]: I1009 15:33:38.888958 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29d7fec9-be2c-4fa8-9191-5ffaf287f825-config\") pod \"ovsdbserver-nb-0\" (UID: \"29d7fec9-be2c-4fa8-9191-5ffaf287f825\") " pod="openstack/ovsdbserver-nb-0" Oct 09 15:33:38 crc kubenswrapper[4719]: I1009 15:33:38.889105 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmkp4\" (UniqueName: \"kubernetes.io/projected/29d7fec9-be2c-4fa8-9191-5ffaf287f825-kube-api-access-mmkp4\") pod \"ovsdbserver-nb-0\" (UID: \"29d7fec9-be2c-4fa8-9191-5ffaf287f825\") " pod="openstack/ovsdbserver-nb-0" Oct 09 15:33:38 crc kubenswrapper[4719]: I1009 15:33:38.889184 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/29d7fec9-be2c-4fa8-9191-5ffaf287f825-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"29d7fec9-be2c-4fa8-9191-5ffaf287f825\") " pod="openstack/ovsdbserver-nb-0" Oct 09 15:33:38 crc kubenswrapper[4719]: I1009 15:33:38.889291 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/29d7fec9-be2c-4fa8-9191-5ffaf287f825-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"29d7fec9-be2c-4fa8-9191-5ffaf287f825\") " pod="openstack/ovsdbserver-nb-0" Oct 09 15:33:38 crc kubenswrapper[4719]: I1009 15:33:38.889379 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/29d7fec9-be2c-4fa8-9191-5ffaf287f825-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"29d7fec9-be2c-4fa8-9191-5ffaf287f825\") " pod="openstack/ovsdbserver-nb-0" Oct 09 15:33:38 crc kubenswrapper[4719]: I1009 15:33:38.893906 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dd68b64f-t69c6"] Oct 09 15:33:38 crc kubenswrapper[4719]: I1009 15:33:38.990776 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/29d7fec9-be2c-4fa8-9191-5ffaf287f825-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"29d7fec9-be2c-4fa8-9191-5ffaf287f825\") " pod="openstack/ovsdbserver-nb-0" Oct 09 15:33:38 crc kubenswrapper[4719]: I1009 15:33:38.990839 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/29d7fec9-be2c-4fa8-9191-5ffaf287f825-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"29d7fec9-be2c-4fa8-9191-5ffaf287f825\") " pod="openstack/ovsdbserver-nb-0" Oct 09 15:33:38 crc kubenswrapper[4719]: I1009 15:33:38.990888 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29d7fec9-be2c-4fa8-9191-5ffaf287f825-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"29d7fec9-be2c-4fa8-9191-5ffaf287f825\") " pod="openstack/ovsdbserver-nb-0" Oct 09 15:33:38 crc kubenswrapper[4719]: I1009 15:33:38.990922 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29d7fec9-be2c-4fa8-9191-5ffaf287f825-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"29d7fec9-be2c-4fa8-9191-5ffaf287f825\") " pod="openstack/ovsdbserver-nb-0" Oct 09 15:33:38 crc kubenswrapper[4719]: I1009 15:33:38.990959 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/29d7fec9-be2c-4fa8-9191-5ffaf287f825-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"29d7fec9-be2c-4fa8-9191-5ffaf287f825\") " pod="openstack/ovsdbserver-nb-0" Oct 09 15:33:38 crc kubenswrapper[4719]: I1009 15:33:38.990990 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"29d7fec9-be2c-4fa8-9191-5ffaf287f825\") " pod="openstack/ovsdbserver-nb-0" Oct 09 15:33:38 crc kubenswrapper[4719]: I1009 15:33:38.991565 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/29d7fec9-be2c-4fa8-9191-5ffaf287f825-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"29d7fec9-be2c-4fa8-9191-5ffaf287f825\") " pod="openstack/ovsdbserver-nb-0" Oct 09 15:33:38 crc kubenswrapper[4719]: I1009 15:33:38.991951 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29d7fec9-be2c-4fa8-9191-5ffaf287f825-config\") pod \"ovsdbserver-nb-0\" (UID: \"29d7fec9-be2c-4fa8-9191-5ffaf287f825\") " pod="openstack/ovsdbserver-nb-0" Oct 09 15:33:38 crc kubenswrapper[4719]: I1009 15:33:38.992021 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmkp4\" (UniqueName: \"kubernetes.io/projected/29d7fec9-be2c-4fa8-9191-5ffaf287f825-kube-api-access-mmkp4\") pod \"ovsdbserver-nb-0\" (UID: \"29d7fec9-be2c-4fa8-9191-5ffaf287f825\") " pod="openstack/ovsdbserver-nb-0" Oct 09 15:33:38 crc kubenswrapper[4719]: I1009 15:33:38.992184 4719 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"29d7fec9-be2c-4fa8-9191-5ffaf287f825\") device mount path \"/mnt/openstack/pv11\"" 
pod="openstack/ovsdbserver-nb-0" Oct 09 15:33:38 crc kubenswrapper[4719]: I1009 15:33:38.992251 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29d7fec9-be2c-4fa8-9191-5ffaf287f825-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"29d7fec9-be2c-4fa8-9191-5ffaf287f825\") " pod="openstack/ovsdbserver-nb-0" Oct 09 15:33:38 crc kubenswrapper[4719]: I1009 15:33:38.992884 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29d7fec9-be2c-4fa8-9191-5ffaf287f825-config\") pod \"ovsdbserver-nb-0\" (UID: \"29d7fec9-be2c-4fa8-9191-5ffaf287f825\") " pod="openstack/ovsdbserver-nb-0" Oct 09 15:33:38 crc kubenswrapper[4719]: I1009 15:33:38.998137 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/29d7fec9-be2c-4fa8-9191-5ffaf287f825-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"29d7fec9-be2c-4fa8-9191-5ffaf287f825\") " pod="openstack/ovsdbserver-nb-0" Oct 09 15:33:38 crc kubenswrapper[4719]: I1009 15:33:38.999526 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29d7fec9-be2c-4fa8-9191-5ffaf287f825-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"29d7fec9-be2c-4fa8-9191-5ffaf287f825\") " pod="openstack/ovsdbserver-nb-0" Oct 09 15:33:38 crc kubenswrapper[4719]: I1009 15:33:38.999648 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/29d7fec9-be2c-4fa8-9191-5ffaf287f825-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"29d7fec9-be2c-4fa8-9191-5ffaf287f825\") " pod="openstack/ovsdbserver-nb-0" Oct 09 15:33:39 crc kubenswrapper[4719]: I1009 15:33:39.010865 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmkp4\" (UniqueName: 
\"kubernetes.io/projected/29d7fec9-be2c-4fa8-9191-5ffaf287f825-kube-api-access-mmkp4\") pod \"ovsdbserver-nb-0\" (UID: \"29d7fec9-be2c-4fa8-9191-5ffaf287f825\") " pod="openstack/ovsdbserver-nb-0" Oct 09 15:33:39 crc kubenswrapper[4719]: I1009 15:33:39.015086 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"29d7fec9-be2c-4fa8-9191-5ffaf287f825\") " pod="openstack/ovsdbserver-nb-0" Oct 09 15:33:39 crc kubenswrapper[4719]: I1009 15:33:39.137701 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 09 15:33:39 crc kubenswrapper[4719]: E1009 15:33:39.448945 4719 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.66:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Oct 09 15:33:39 crc kubenswrapper[4719]: E1009 15:33:39.449142 4719 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.66:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Oct 09 15:33:39 crc kubenswrapper[4719]: E1009 15:33:39.449269 4719 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.66:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-44wsf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-559d4fdc95-rm968_openstack(9457534e-d349-4851-820f-95d1261c44ac): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 09 15:33:39 crc kubenswrapper[4719]: E1009 15:33:39.450776 4719 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-559d4fdc95-rm968" podUID="9457534e-d349-4851-820f-95d1261c44ac" Oct 09 15:33:39 crc kubenswrapper[4719]: E1009 15:33:39.461320 4719 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.66:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Oct 09 15:33:39 crc kubenswrapper[4719]: E1009 15:33:39.461436 4719 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.66:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Oct 09 15:33:39 crc kubenswrapper[4719]: E1009 15:33:39.461757 4719 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.66:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jqq2v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-59964f465-shbzw_openstack(5a1f494c-3a8c-4f29-9fae-abf4598e90ab): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 09 15:33:39 crc kubenswrapper[4719]: E1009 15:33:39.463007 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-59964f465-shbzw" podUID="5a1f494c-3a8c-4f29-9fae-abf4598e90ab" Oct 09 15:33:39 crc kubenswrapper[4719]: I1009 15:33:39.816748 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 09 15:33:39 crc kubenswrapper[4719]: I1009 15:33:39.818524 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 09 15:33:39 crc kubenswrapper[4719]: I1009 15:33:39.820808 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 09 15:33:39 crc kubenswrapper[4719]: I1009 15:33:39.821153 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-p7x9j" Oct 09 15:33:39 crc kubenswrapper[4719]: I1009 15:33:39.821333 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 09 15:33:39 crc kubenswrapper[4719]: I1009 15:33:39.821400 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 09 15:33:39 crc kubenswrapper[4719]: I1009 15:33:39.836180 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 09 15:33:39 crc kubenswrapper[4719]: I1009 15:33:39.927587 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c\") " pod="openstack/ovsdbserver-sb-0" Oct 09 15:33:39 crc kubenswrapper[4719]: I1009 15:33:39.927658 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6bg9\" (UniqueName: \"kubernetes.io/projected/dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c-kube-api-access-k6bg9\") pod \"ovsdbserver-sb-0\" (UID: \"dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c\") " 
pod="openstack/ovsdbserver-sb-0" Oct 09 15:33:39 crc kubenswrapper[4719]: I1009 15:33:39.927688 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c\") " pod="openstack/ovsdbserver-sb-0" Oct 09 15:33:39 crc kubenswrapper[4719]: I1009 15:33:39.927735 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c\") " pod="openstack/ovsdbserver-sb-0" Oct 09 15:33:39 crc kubenswrapper[4719]: I1009 15:33:39.927771 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c\") " pod="openstack/ovsdbserver-sb-0" Oct 09 15:33:39 crc kubenswrapper[4719]: I1009 15:33:39.927787 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c\") " pod="openstack/ovsdbserver-sb-0" Oct 09 15:33:39 crc kubenswrapper[4719]: I1009 15:33:39.927826 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c\") " pod="openstack/ovsdbserver-sb-0" Oct 09 15:33:39 crc 
kubenswrapper[4719]: I1009 15:33:39.927853 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c-config\") pod \"ovsdbserver-sb-0\" (UID: \"dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c\") " pod="openstack/ovsdbserver-sb-0" Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.028761 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c\") " pod="openstack/ovsdbserver-sb-0" Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.028825 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c-config\") pod \"ovsdbserver-sb-0\" (UID: \"dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c\") " pod="openstack/ovsdbserver-sb-0" Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.028855 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c\") " pod="openstack/ovsdbserver-sb-0" Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.028885 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6bg9\" (UniqueName: \"kubernetes.io/projected/dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c-kube-api-access-k6bg9\") pod \"ovsdbserver-sb-0\" (UID: \"dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c\") " pod="openstack/ovsdbserver-sb-0" Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.028910 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c\") " pod="openstack/ovsdbserver-sb-0" Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.028955 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c\") " pod="openstack/ovsdbserver-sb-0" Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.028988 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c\") " pod="openstack/ovsdbserver-sb-0" Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.029003 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c\") " pod="openstack/ovsdbserver-sb-0" Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.029284 4719 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-sb-0" Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.029791 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c\") " 
pod="openstack/ovsdbserver-sb-0" Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.029905 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c-config\") pod \"ovsdbserver-sb-0\" (UID: \"dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c\") " pod="openstack/ovsdbserver-sb-0" Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.030780 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c\") " pod="openstack/ovsdbserver-sb-0" Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.041058 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c\") " pod="openstack/ovsdbserver-sb-0" Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.042956 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c\") " pod="openstack/ovsdbserver-sb-0" Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.043064 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c\") " pod="openstack/ovsdbserver-sb-0" Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.045998 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6bg9\" (UniqueName: 
\"kubernetes.io/projected/dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c-kube-api-access-k6bg9\") pod \"ovsdbserver-sb-0\" (UID: \"dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c\") " pod="openstack/ovsdbserver-sb-0" Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.047982 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c\") " pod="openstack/ovsdbserver-sb-0" Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.080291 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.115810 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d3a820d9-3c13-47ec-a39e-dea4d60b7536","Type":"ContainerStarted","Data":"b0cf669e83b7b9b6df49c16b1c59f0a3249877edc9e4308a1a96a020ff89e25b"} Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.117374 4719 generic.go:334] "Generic (PLEG): container finished" podID="4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6" containerID="8596ece6f6330253f5920d00478b711d559cb6448c8f8d7450dca57530e7836f" exitCode=0 Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.117828 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dd68b64f-t69c6" event={"ID":"4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6","Type":"ContainerDied","Data":"8596ece6f6330253f5920d00478b711d559cb6448c8f8d7450dca57530e7836f"} Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.117876 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dd68b64f-t69c6" event={"ID":"4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6","Type":"ContainerStarted","Data":"b007fe91956c10ad2b6a6388c093a88dd0f1b2fbb896a5e295f6cebe7f64de79"} Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.193842 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.436416 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d6d58d699-6vpbr"] Oct 09 15:33:40 crc kubenswrapper[4719]: W1009 15:33:40.443839 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b13a1a4_8f28_426c_951c_be1bbc5229bf.slice/crio-b89a7d4c0a16893309ac8173603130c7ef6e82f9b98add265df8a5144c378ad9 WatchSource:0}: Error finding container b89a7d4c0a16893309ac8173603130c7ef6e82f9b98add265df8a5144c378ad9: Status 404 returned error can't find the container with id b89a7d4c0a16893309ac8173603130c7ef6e82f9b98add265df8a5144c378ad9 Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.446694 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.471712 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.486059 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-596d6b547-grdml"] Oct 09 15:33:40 crc kubenswrapper[4719]: W1009 15:33:40.487117 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f700697_a8c2_4887_94ce_3c2c2b67efc3.slice/crio-f8154e1aee17b2435b5a9d30907ca52411f6b5aeb082f7dc7b0c4c7052a233c9 WatchSource:0}: Error finding container f8154e1aee17b2435b5a9d30907ca52411f6b5aeb082f7dc7b0c4c7052a233c9: Status 404 returned error can't find the container with id f8154e1aee17b2435b5a9d30907ca52411f6b5aeb082f7dc7b0c4c7052a233c9 Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.494456 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 
15:33:40.595371 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-hqfvq"] Oct 09 15:33:40 crc kubenswrapper[4719]: W1009 15:33:40.608121 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07b112ef_0e6a_4927_93e4_d5fc023e495f.slice/crio-56297ddce8d075af2211cbcf409cdd94c2ac93b8cb718d47cb99d9792984c4b6 WatchSource:0}: Error finding container 56297ddce8d075af2211cbcf409cdd94c2ac93b8cb718d47cb99d9792984c4b6: Status 404 returned error can't find the container with id 56297ddce8d075af2211cbcf409cdd94c2ac93b8cb718d47cb99d9792984c4b6 Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.753431 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-559d4fdc95-rm968" Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.775299 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59964f465-shbzw" Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.846922 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a1f494c-3a8c-4f29-9fae-abf4598e90ab-config\") pod \"5a1f494c-3a8c-4f29-9fae-abf4598e90ab\" (UID: \"5a1f494c-3a8c-4f29-9fae-abf4598e90ab\") " Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.847024 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9457534e-d349-4851-820f-95d1261c44ac-config\") pod \"9457534e-d349-4851-820f-95d1261c44ac\" (UID: \"9457534e-d349-4851-820f-95d1261c44ac\") " Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.847050 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqq2v\" (UniqueName: \"kubernetes.io/projected/5a1f494c-3a8c-4f29-9fae-abf4598e90ab-kube-api-access-jqq2v\") pod 
\"5a1f494c-3a8c-4f29-9fae-abf4598e90ab\" (UID: \"5a1f494c-3a8c-4f29-9fae-abf4598e90ab\") " Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.847115 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44wsf\" (UniqueName: \"kubernetes.io/projected/9457534e-d349-4851-820f-95d1261c44ac-kube-api-access-44wsf\") pod \"9457534e-d349-4851-820f-95d1261c44ac\" (UID: \"9457534e-d349-4851-820f-95d1261c44ac\") " Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.847133 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9457534e-d349-4851-820f-95d1261c44ac-dns-svc\") pod \"9457534e-d349-4851-820f-95d1261c44ac\" (UID: \"9457534e-d349-4851-820f-95d1261c44ac\") " Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.848172 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9457534e-d349-4851-820f-95d1261c44ac-config" (OuterVolumeSpecName: "config") pod "9457534e-d349-4851-820f-95d1261c44ac" (UID: "9457534e-d349-4851-820f-95d1261c44ac"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.848901 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a1f494c-3a8c-4f29-9fae-abf4598e90ab-config" (OuterVolumeSpecName: "config") pod "5a1f494c-3a8c-4f29-9fae-abf4598e90ab" (UID: "5a1f494c-3a8c-4f29-9fae-abf4598e90ab"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.849216 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9457534e-d349-4851-820f-95d1261c44ac-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9457534e-d349-4851-820f-95d1261c44ac" (UID: "9457534e-d349-4851-820f-95d1261c44ac"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.852995 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9457534e-d349-4851-820f-95d1261c44ac-kube-api-access-44wsf" (OuterVolumeSpecName: "kube-api-access-44wsf") pod "9457534e-d349-4851-820f-95d1261c44ac" (UID: "9457534e-d349-4851-820f-95d1261c44ac"). InnerVolumeSpecName "kube-api-access-44wsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.860686 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a1f494c-3a8c-4f29-9fae-abf4598e90ab-kube-api-access-jqq2v" (OuterVolumeSpecName: "kube-api-access-jqq2v") pod "5a1f494c-3a8c-4f29-9fae-abf4598e90ab" (UID: "5a1f494c-3a8c-4f29-9fae-abf4598e90ab"). InnerVolumeSpecName "kube-api-access-jqq2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.905710 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-p4t6l"] Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.913552 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.928410 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.939478 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.947858 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.948582 4719 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a1f494c-3a8c-4f29-9fae-abf4598e90ab-config\") on node \"crc\" 
DevicePath \"\"" Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.948610 4719 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9457534e-d349-4851-820f-95d1261c44ac-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.948621 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqq2v\" (UniqueName: \"kubernetes.io/projected/5a1f494c-3a8c-4f29-9fae-abf4598e90ab-kube-api-access-jqq2v\") on node \"crc\" DevicePath \"\"" Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.948632 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44wsf\" (UniqueName: \"kubernetes.io/projected/9457534e-d349-4851-820f-95d1261c44ac-kube-api-access-44wsf\") on node \"crc\" DevicePath \"\"" Oct 09 15:33:40 crc kubenswrapper[4719]: I1009 15:33:40.948640 4719 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9457534e-d349-4851-820f-95d1261c44ac-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 15:33:40 crc kubenswrapper[4719]: W1009 15:33:40.967544 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0151a18_0608_47b9_b58a_7eef9dfaf31b.slice/crio-f5f5bb8614fb6f8da47f62a3bd1954c3a356d15aa7d8fba45e6cb401af6743c6 WatchSource:0}: Error finding container f5f5bb8614fb6f8da47f62a3bd1954c3a356d15aa7d8fba45e6cb401af6743c6: Status 404 returned error can't find the container with id f5f5bb8614fb6f8da47f62a3bd1954c3a356d15aa7d8fba45e6cb401af6743c6 Oct 09 15:33:40 crc kubenswrapper[4719]: W1009 15:33:40.968591 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95b09495_a75f_42db_ae2c_99ac6e46f039.slice/crio-ac05b64485c378330eb8db2f1a399a43343f9b3c55c94fc16a87bf6370cf8763 WatchSource:0}: Error finding container 
ac05b64485c378330eb8db2f1a399a43343f9b3c55c94fc16a87bf6370cf8763: Status 404 returned error can't find the container with id ac05b64485c378330eb8db2f1a399a43343f9b3c55c94fc16a87bf6370cf8763 Oct 09 15:33:40 crc kubenswrapper[4719]: W1009 15:33:40.972609 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05ff8a95_a910_4095_930b_e42c575bf4b8.slice/crio-597ea9fa4d689d41d7c93845d49c2a2d04aed0f88f3bf4379e29c93159940075 WatchSource:0}: Error finding container 597ea9fa4d689d41d7c93845d49c2a2d04aed0f88f3bf4379e29c93159940075: Status 404 returned error can't find the container with id 597ea9fa4d689d41d7c93845d49c2a2d04aed0f88f3bf4379e29c93159940075 Oct 09 15:33:40 crc kubenswrapper[4719]: W1009 15:33:40.972966 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8f5a6f9_5554_485d_9aee_47449402e37b.slice/crio-09b8626743fb2352967231ddfa8b0c9daa6f27a4a6802bfa250b78eefc657800 WatchSource:0}: Error finding container 09b8626743fb2352967231ddfa8b0c9daa6f27a4a6802bfa250b78eefc657800: Status 404 returned error can't find the container with id 09b8626743fb2352967231ddfa8b0c9daa6f27a4a6802bfa250b78eefc657800 Oct 09 15:33:40 crc kubenswrapper[4719]: W1009 15:33:40.975496 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7495027_5c56_46e2_9947_1ad2d6bcaf28.slice/crio-72108c2d81371576df69f73fe381122d892cb2332b8b83393a07932a27f569fa WatchSource:0}: Error finding container 72108c2d81371576df69f73fe381122d892cb2332b8b83393a07932a27f569fa: Status 404 returned error can't find the container with id 72108c2d81371576df69f73fe381122d892cb2332b8b83393a07932a27f569fa Oct 09 15:33:41 crc kubenswrapper[4719]: I1009 15:33:41.005315 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 09 15:33:41 crc 
kubenswrapper[4719]: I1009 15:33:41.094087 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 09 15:33:41 crc kubenswrapper[4719]: W1009 15:33:41.103422 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbbe3e0c_44f1_4ad5_89a9_70d73acfc81c.slice/crio-9dfcc039b0d840e7130f9b6cd866b5dc5e8adf9a18810203fa119177f26a265c WatchSource:0}: Error finding container 9dfcc039b0d840e7130f9b6cd866b5dc5e8adf9a18810203fa119177f26a265c: Status 404 returned error can't find the container with id 9dfcc039b0d840e7130f9b6cd866b5dc5e8adf9a18810203fa119177f26a265c Oct 09 15:33:41 crc kubenswrapper[4719]: I1009 15:33:41.128827 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hqfvq" event={"ID":"07b112ef-0e6a-4927-93e4-d5fc023e495f","Type":"ContainerStarted","Data":"56297ddce8d075af2211cbcf409cdd94c2ac93b8cb718d47cb99d9792984c4b6"} Oct 09 15:33:41 crc kubenswrapper[4719]: I1009 15:33:41.132659 4719 generic.go:334] "Generic (PLEG): container finished" podID="0b13a1a4-8f28-426c-951c-be1bbc5229bf" containerID="575fdf288adaef5c319c068c41ccd2bc62397031fd81f54262cfaa2e4711a955" exitCode=0 Oct 09 15:33:41 crc kubenswrapper[4719]: I1009 15:33:41.132773 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d6d58d699-6vpbr" event={"ID":"0b13a1a4-8f28-426c-951c-be1bbc5229bf","Type":"ContainerDied","Data":"575fdf288adaef5c319c068c41ccd2bc62397031fd81f54262cfaa2e4711a955"} Oct 09 15:33:41 crc kubenswrapper[4719]: I1009 15:33:41.132837 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d6d58d699-6vpbr" event={"ID":"0b13a1a4-8f28-426c-951c-be1bbc5229bf","Type":"ContainerStarted","Data":"b89a7d4c0a16893309ac8173603130c7ef6e82f9b98add265df8a5144c378ad9"} Oct 09 15:33:41 crc kubenswrapper[4719]: I1009 15:33:41.134775 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovsdbserver-sb-0" event={"ID":"dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c","Type":"ContainerStarted","Data":"9dfcc039b0d840e7130f9b6cd866b5dc5e8adf9a18810203fa119177f26a265c"} Oct 09 15:33:41 crc kubenswrapper[4719]: I1009 15:33:41.140446 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-559d4fdc95-rm968" event={"ID":"9457534e-d349-4851-820f-95d1261c44ac","Type":"ContainerDied","Data":"9818280dc9835c3ceae96b933cb387e7ea706fd96e772c7990330f9a574579b9"} Oct 09 15:33:41 crc kubenswrapper[4719]: I1009 15:33:41.140519 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-559d4fdc95-rm968" Oct 09 15:33:41 crc kubenswrapper[4719]: I1009 15:33:41.144595 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"29d7fec9-be2c-4fa8-9191-5ffaf287f825","Type":"ContainerStarted","Data":"d785ce846aafe13f934741e648e6ed3bbe3795b3bfd25b405de95065e2130076"} Oct 09 15:33:41 crc kubenswrapper[4719]: I1009 15:33:41.150122 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"1df540c9-8b54-44a5-9c5d-03cf736ee67a","Type":"ContainerStarted","Data":"a7ca3c59074ba3f8d11c47bfb2d43cafe3c19f478a98d11d6ab0488ce10e3111"} Oct 09 15:33:41 crc kubenswrapper[4719]: I1009 15:33:41.153322 4719 generic.go:334] "Generic (PLEG): container finished" podID="9f700697-a8c2-4887-94ce-3c2c2b67efc3" containerID="cade3cad137dc2d94dacd8cbfb9c1fc4f5d5c82e6b4a761a836994ffa9c8c3a3" exitCode=0 Oct 09 15:33:41 crc kubenswrapper[4719]: I1009 15:33:41.153410 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-596d6b547-grdml" event={"ID":"9f700697-a8c2-4887-94ce-3c2c2b67efc3","Type":"ContainerDied","Data":"cade3cad137dc2d94dacd8cbfb9c1fc4f5d5c82e6b4a761a836994ffa9c8c3a3"} Oct 09 15:33:41 crc kubenswrapper[4719]: I1009 15:33:41.153464 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-596d6b547-grdml" event={"ID":"9f700697-a8c2-4887-94ce-3c2c2b67efc3","Type":"ContainerStarted","Data":"f8154e1aee17b2435b5a9d30907ca52411f6b5aeb082f7dc7b0c4c7052a233c9"} Oct 09 15:33:41 crc kubenswrapper[4719]: I1009 15:33:41.193468 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"05ff8a95-a910-4095-930b-e42c575bf4b8","Type":"ContainerStarted","Data":"597ea9fa4d689d41d7c93845d49c2a2d04aed0f88f3bf4379e29c93159940075"} Oct 09 15:33:41 crc kubenswrapper[4719]: I1009 15:33:41.200807 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dd68b64f-t69c6" event={"ID":"4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6","Type":"ContainerStarted","Data":"6d0364ba444214c15623aaa0adc9ad8602740d3a67991db42395477b119e3f02"} Oct 09 15:33:41 crc kubenswrapper[4719]: I1009 15:33:41.200883 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6dd68b64f-t69c6" Oct 09 15:33:41 crc kubenswrapper[4719]: I1009 15:33:41.226141 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-p4t6l" event={"ID":"f0151a18-0608-47b9-b58a-7eef9dfaf31b","Type":"ContainerStarted","Data":"f5f5bb8614fb6f8da47f62a3bd1954c3a356d15aa7d8fba45e6cb401af6743c6"} Oct 09 15:33:41 crc kubenswrapper[4719]: I1009 15:33:41.237469 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"95b09495-a75f-42db-ae2c-99ac6e46f039","Type":"ContainerStarted","Data":"ac05b64485c378330eb8db2f1a399a43343f9b3c55c94fc16a87bf6370cf8763"} Oct 09 15:33:41 crc kubenswrapper[4719]: I1009 15:33:41.245631 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59964f465-shbzw" event={"ID":"5a1f494c-3a8c-4f29-9fae-abf4598e90ab","Type":"ContainerDied","Data":"270ff36aeb4f323e309a2268783e13def334d415cff7c86b30cf0ccd40593372"} Oct 09 15:33:41 crc kubenswrapper[4719]: I1009 15:33:41.245748 4719 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59964f465-shbzw" Oct 09 15:33:41 crc kubenswrapper[4719]: I1009 15:33:41.284874 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c7495027-5c56-46e2-9947-1ad2d6bcaf28","Type":"ContainerStarted","Data":"72108c2d81371576df69f73fe381122d892cb2332b8b83393a07932a27f569fa"} Oct 09 15:33:41 crc kubenswrapper[4719]: I1009 15:33:41.293130 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c8f5a6f9-5554-485d-9aee-47449402e37b","Type":"ContainerStarted","Data":"09b8626743fb2352967231ddfa8b0c9daa6f27a4a6802bfa250b78eefc657800"} Oct 09 15:33:41 crc kubenswrapper[4719]: I1009 15:33:41.295056 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-559d4fdc95-rm968"] Oct 09 15:33:41 crc kubenswrapper[4719]: I1009 15:33:41.295670 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6b96d4d6-d2f4-4e82-9a9e-4c95e6f5389a","Type":"ContainerStarted","Data":"54cbd68263ae2fc30f4e03beb2be53a91d482606d5e9f2298c0a7e21e5f0c884"} Oct 09 15:33:41 crc kubenswrapper[4719]: I1009 15:33:41.298166 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d6970b67-4ebd-401d-838b-8be92b8ba72f","Type":"ContainerStarted","Data":"02b82dbf5ffa6173a2a9ee65ca3862e3fd70d6d7cf693d3ccbf95ee05f7bb9a7"} Oct 09 15:33:41 crc kubenswrapper[4719]: I1009 15:33:41.309502 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-559d4fdc95-rm968"] Oct 09 15:33:41 crc kubenswrapper[4719]: I1009 15:33:41.314005 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6dd68b64f-t69c6" podStartSLOduration=15.062708612 podStartE2EDuration="15.313984911s" podCreationTimestamp="2025-10-09 15:33:26 +0000 UTC" 
firstStartedPulling="2025-10-09 15:33:39.45194139 +0000 UTC m=+924.961652675" lastFinishedPulling="2025-10-09 15:33:39.703217689 +0000 UTC m=+925.212928974" observedRunningTime="2025-10-09 15:33:41.275105747 +0000 UTC m=+926.784817042" watchObservedRunningTime="2025-10-09 15:33:41.313984911 +0000 UTC m=+926.823696196" Oct 09 15:33:41 crc kubenswrapper[4719]: I1009 15:33:41.340450 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59964f465-shbzw"] Oct 09 15:33:41 crc kubenswrapper[4719]: I1009 15:33:41.347609 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59964f465-shbzw"] Oct 09 15:33:41 crc kubenswrapper[4719]: I1009 15:33:41.835145 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-596d6b547-grdml" Oct 09 15:33:41 crc kubenswrapper[4719]: I1009 15:33:41.982517 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f700697-a8c2-4887-94ce-3c2c2b67efc3-dns-svc\") pod \"9f700697-a8c2-4887-94ce-3c2c2b67efc3\" (UID: \"9f700697-a8c2-4887-94ce-3c2c2b67efc3\") " Oct 09 15:33:41 crc kubenswrapper[4719]: I1009 15:33:41.982667 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vk8sc\" (UniqueName: \"kubernetes.io/projected/9f700697-a8c2-4887-94ce-3c2c2b67efc3-kube-api-access-vk8sc\") pod \"9f700697-a8c2-4887-94ce-3c2c2b67efc3\" (UID: \"9f700697-a8c2-4887-94ce-3c2c2b67efc3\") " Oct 09 15:33:41 crc kubenswrapper[4719]: I1009 15:33:41.982697 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f700697-a8c2-4887-94ce-3c2c2b67efc3-config\") pod \"9f700697-a8c2-4887-94ce-3c2c2b67efc3\" (UID: \"9f700697-a8c2-4887-94ce-3c2c2b67efc3\") " Oct 09 15:33:41 crc kubenswrapper[4719]: I1009 15:33:41.988068 4719 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/9f700697-a8c2-4887-94ce-3c2c2b67efc3-kube-api-access-vk8sc" (OuterVolumeSpecName: "kube-api-access-vk8sc") pod "9f700697-a8c2-4887-94ce-3c2c2b67efc3" (UID: "9f700697-a8c2-4887-94ce-3c2c2b67efc3"). InnerVolumeSpecName "kube-api-access-vk8sc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:33:42 crc kubenswrapper[4719]: I1009 15:33:42.002480 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f700697-a8c2-4887-94ce-3c2c2b67efc3-config" (OuterVolumeSpecName: "config") pod "9f700697-a8c2-4887-94ce-3c2c2b67efc3" (UID: "9f700697-a8c2-4887-94ce-3c2c2b67efc3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:33:42 crc kubenswrapper[4719]: I1009 15:33:42.011440 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f700697-a8c2-4887-94ce-3c2c2b67efc3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9f700697-a8c2-4887-94ce-3c2c2b67efc3" (UID: "9f700697-a8c2-4887-94ce-3c2c2b67efc3"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 15:33:42 crc kubenswrapper[4719]: I1009 15:33:42.084324 4719 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f700697-a8c2-4887-94ce-3c2c2b67efc3-config\") on node \"crc\" DevicePath \"\""
Oct 09 15:33:42 crc kubenswrapper[4719]: I1009 15:33:42.084372 4719 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f700697-a8c2-4887-94ce-3c2c2b67efc3-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 09 15:33:42 crc kubenswrapper[4719]: I1009 15:33:42.084382 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vk8sc\" (UniqueName: \"kubernetes.io/projected/9f700697-a8c2-4887-94ce-3c2c2b67efc3-kube-api-access-vk8sc\") on node \"crc\" DevicePath \"\""
Oct 09 15:33:42 crc kubenswrapper[4719]: I1009 15:33:42.311152 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d6d58d699-6vpbr" event={"ID":"0b13a1a4-8f28-426c-951c-be1bbc5229bf","Type":"ContainerStarted","Data":"fbc7d7dcfc4182fc0da2748b41f59ef8b49cd8acf2a30b839b8f818095757a88"}
Oct 09 15:33:42 crc kubenswrapper[4719]: I1009 15:33:42.311595 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d6d58d699-6vpbr"
Oct 09 15:33:42 crc kubenswrapper[4719]: I1009 15:33:42.315287 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-596d6b547-grdml"
Oct 09 15:33:42 crc kubenswrapper[4719]: I1009 15:33:42.316074 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-596d6b547-grdml" event={"ID":"9f700697-a8c2-4887-94ce-3c2c2b67efc3","Type":"ContainerDied","Data":"f8154e1aee17b2435b5a9d30907ca52411f6b5aeb082f7dc7b0c4c7052a233c9"}
Oct 09 15:33:42 crc kubenswrapper[4719]: I1009 15:33:42.316122 4719 scope.go:117] "RemoveContainer" containerID="cade3cad137dc2d94dacd8cbfb9c1fc4f5d5c82e6b4a761a836994ffa9c8c3a3"
Oct 09 15:33:42 crc kubenswrapper[4719]: I1009 15:33:42.333059 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d6d58d699-6vpbr" podStartSLOduration=17.333040307 podStartE2EDuration="17.333040307s" podCreationTimestamp="2025-10-09 15:33:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:33:42.330586629 +0000 UTC m=+927.840297924" watchObservedRunningTime="2025-10-09 15:33:42.333040307 +0000 UTC m=+927.842751592"
Oct 09 15:33:42 crc kubenswrapper[4719]: I1009 15:33:42.376695 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-596d6b547-grdml"]
Oct 09 15:33:42 crc kubenswrapper[4719]: I1009 15:33:42.382008 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-596d6b547-grdml"]
Oct 09 15:33:43 crc kubenswrapper[4719]: I1009 15:33:43.174661 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a1f494c-3a8c-4f29-9fae-abf4598e90ab" path="/var/lib/kubelet/pods/5a1f494c-3a8c-4f29-9fae-abf4598e90ab/volumes"
Oct 09 15:33:43 crc kubenswrapper[4719]: I1009 15:33:43.175038 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9457534e-d349-4851-820f-95d1261c44ac" path="/var/lib/kubelet/pods/9457534e-d349-4851-820f-95d1261c44ac/volumes"
Oct 09 15:33:43 crc kubenswrapper[4719]: I1009 15:33:43.175507 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f700697-a8c2-4887-94ce-3c2c2b67efc3" path="/var/lib/kubelet/pods/9f700697-a8c2-4887-94ce-3c2c2b67efc3/volumes"
Oct 09 15:33:46 crc kubenswrapper[4719]: I1009 15:33:46.371578 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d6d58d699-6vpbr"
Oct 09 15:33:46 crc kubenswrapper[4719]: I1009 15:33:46.738711 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6dd68b64f-t69c6"
Oct 09 15:33:46 crc kubenswrapper[4719]: I1009 15:33:46.792921 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d6d58d699-6vpbr"]
Oct 09 15:33:47 crc kubenswrapper[4719]: I1009 15:33:47.351933 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d6d58d699-6vpbr" podUID="0b13a1a4-8f28-426c-951c-be1bbc5229bf" containerName="dnsmasq-dns" containerID="cri-o://fbc7d7dcfc4182fc0da2748b41f59ef8b49cd8acf2a30b839b8f818095757a88" gracePeriod=10
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.395411 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-skb56"]
Oct 09 15:33:48 crc kubenswrapper[4719]: E1009 15:33:48.396215 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f700697-a8c2-4887-94ce-3c2c2b67efc3" containerName="init"
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.396236 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f700697-a8c2-4887-94ce-3c2c2b67efc3" containerName="init"
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.399844 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f700697-a8c2-4887-94ce-3c2c2b67efc3" containerName="init"
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.400697 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-skb56"
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.405135 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.408686 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-skb56"]
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.435086 4719 generic.go:334] "Generic (PLEG): container finished" podID="0b13a1a4-8f28-426c-951c-be1bbc5229bf" containerID="fbc7d7dcfc4182fc0da2748b41f59ef8b49cd8acf2a30b839b8f818095757a88" exitCode=0
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.435125 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d6d58d699-6vpbr" event={"ID":"0b13a1a4-8f28-426c-951c-be1bbc5229bf","Type":"ContainerDied","Data":"fbc7d7dcfc4182fc0da2748b41f59ef8b49cd8acf2a30b839b8f818095757a88"}
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.494561 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6539f12-5508-4c6d-870a-d19815ba3120-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-skb56\" (UID: \"a6539f12-5508-4c6d-870a-d19815ba3120\") " pod="openstack/ovn-controller-metrics-skb56"
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.494680 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpj9g\" (UniqueName: \"kubernetes.io/projected/a6539f12-5508-4c6d-870a-d19815ba3120-kube-api-access-kpj9g\") pod \"ovn-controller-metrics-skb56\" (UID: \"a6539f12-5508-4c6d-870a-d19815ba3120\") " pod="openstack/ovn-controller-metrics-skb56"
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.494708 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a6539f12-5508-4c6d-870a-d19815ba3120-ovn-rundir\") pod \"ovn-controller-metrics-skb56\" (UID: \"a6539f12-5508-4c6d-870a-d19815ba3120\") " pod="openstack/ovn-controller-metrics-skb56"
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.494735 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6539f12-5508-4c6d-870a-d19815ba3120-combined-ca-bundle\") pod \"ovn-controller-metrics-skb56\" (UID: \"a6539f12-5508-4c6d-870a-d19815ba3120\") " pod="openstack/ovn-controller-metrics-skb56"
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.494796 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a6539f12-5508-4c6d-870a-d19815ba3120-ovs-rundir\") pod \"ovn-controller-metrics-skb56\" (UID: \"a6539f12-5508-4c6d-870a-d19815ba3120\") " pod="openstack/ovn-controller-metrics-skb56"
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.494826 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6539f12-5508-4c6d-870a-d19815ba3120-config\") pod \"ovn-controller-metrics-skb56\" (UID: \"a6539f12-5508-4c6d-870a-d19815ba3120\") " pod="openstack/ovn-controller-metrics-skb56"
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.595853 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a6539f12-5508-4c6d-870a-d19815ba3120-ovs-rundir\") pod \"ovn-controller-metrics-skb56\" (UID: \"a6539f12-5508-4c6d-870a-d19815ba3120\") " pod="openstack/ovn-controller-metrics-skb56"
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.595899 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6539f12-5508-4c6d-870a-d19815ba3120-config\") pod \"ovn-controller-metrics-skb56\" (UID: \"a6539f12-5508-4c6d-870a-d19815ba3120\") " pod="openstack/ovn-controller-metrics-skb56"
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.595958 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6539f12-5508-4c6d-870a-d19815ba3120-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-skb56\" (UID: \"a6539f12-5508-4c6d-870a-d19815ba3120\") " pod="openstack/ovn-controller-metrics-skb56"
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.596000 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpj9g\" (UniqueName: \"kubernetes.io/projected/a6539f12-5508-4c6d-870a-d19815ba3120-kube-api-access-kpj9g\") pod \"ovn-controller-metrics-skb56\" (UID: \"a6539f12-5508-4c6d-870a-d19815ba3120\") " pod="openstack/ovn-controller-metrics-skb56"
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.596021 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a6539f12-5508-4c6d-870a-d19815ba3120-ovn-rundir\") pod \"ovn-controller-metrics-skb56\" (UID: \"a6539f12-5508-4c6d-870a-d19815ba3120\") " pod="openstack/ovn-controller-metrics-skb56"
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.596039 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6539f12-5508-4c6d-870a-d19815ba3120-combined-ca-bundle\") pod \"ovn-controller-metrics-skb56\" (UID: \"a6539f12-5508-4c6d-870a-d19815ba3120\") " pod="openstack/ovn-controller-metrics-skb56"
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.597697 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a6539f12-5508-4c6d-870a-d19815ba3120-ovs-rundir\") pod \"ovn-controller-metrics-skb56\" (UID: \"a6539f12-5508-4c6d-870a-d19815ba3120\") " pod="openstack/ovn-controller-metrics-skb56"
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.600409 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6539f12-5508-4c6d-870a-d19815ba3120-config\") pod \"ovn-controller-metrics-skb56\" (UID: \"a6539f12-5508-4c6d-870a-d19815ba3120\") " pod="openstack/ovn-controller-metrics-skb56"
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.600890 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a6539f12-5508-4c6d-870a-d19815ba3120-ovn-rundir\") pod \"ovn-controller-metrics-skb56\" (UID: \"a6539f12-5508-4c6d-870a-d19815ba3120\") " pod="openstack/ovn-controller-metrics-skb56"
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.607246 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6539f12-5508-4c6d-870a-d19815ba3120-combined-ca-bundle\") pod \"ovn-controller-metrics-skb56\" (UID: \"a6539f12-5508-4c6d-870a-d19815ba3120\") " pod="openstack/ovn-controller-metrics-skb56"
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.608538 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6539f12-5508-4c6d-870a-d19815ba3120-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-skb56\" (UID: \"a6539f12-5508-4c6d-870a-d19815ba3120\") " pod="openstack/ovn-controller-metrics-skb56"
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.670685 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpj9g\" (UniqueName: \"kubernetes.io/projected/a6539f12-5508-4c6d-870a-d19815ba3120-kube-api-access-kpj9g\") pod \"ovn-controller-metrics-skb56\" (UID: \"a6539f12-5508-4c6d-870a-d19815ba3120\") " pod="openstack/ovn-controller-metrics-skb56"
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.696528 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9dd8568c7-sfmbx"]
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.698947 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9dd8568c7-sfmbx"
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.703689 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.712846 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9dd8568c7-sfmbx"]
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.734140 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-skb56"
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.800274 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b80fb56-0e00-47cc-85c5-b7118589b2f0-ovsdbserver-nb\") pod \"dnsmasq-dns-9dd8568c7-sfmbx\" (UID: \"1b80fb56-0e00-47cc-85c5-b7118589b2f0\") " pod="openstack/dnsmasq-dns-9dd8568c7-sfmbx"
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.800343 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b80fb56-0e00-47cc-85c5-b7118589b2f0-dns-svc\") pod \"dnsmasq-dns-9dd8568c7-sfmbx\" (UID: \"1b80fb56-0e00-47cc-85c5-b7118589b2f0\") " pod="openstack/dnsmasq-dns-9dd8568c7-sfmbx"
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.800427 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9pk4\" (UniqueName: \"kubernetes.io/projected/1b80fb56-0e00-47cc-85c5-b7118589b2f0-kube-api-access-z9pk4\") pod \"dnsmasq-dns-9dd8568c7-sfmbx\" (UID: \"1b80fb56-0e00-47cc-85c5-b7118589b2f0\") " pod="openstack/dnsmasq-dns-9dd8568c7-sfmbx"
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.800460 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b80fb56-0e00-47cc-85c5-b7118589b2f0-config\") pod \"dnsmasq-dns-9dd8568c7-sfmbx\" (UID: \"1b80fb56-0e00-47cc-85c5-b7118589b2f0\") " pod="openstack/dnsmasq-dns-9dd8568c7-sfmbx"
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.901966 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b80fb56-0e00-47cc-85c5-b7118589b2f0-config\") pod \"dnsmasq-dns-9dd8568c7-sfmbx\" (UID: \"1b80fb56-0e00-47cc-85c5-b7118589b2f0\") " pod="openstack/dnsmasq-dns-9dd8568c7-sfmbx"
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.902040 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b80fb56-0e00-47cc-85c5-b7118589b2f0-ovsdbserver-nb\") pod \"dnsmasq-dns-9dd8568c7-sfmbx\" (UID: \"1b80fb56-0e00-47cc-85c5-b7118589b2f0\") " pod="openstack/dnsmasq-dns-9dd8568c7-sfmbx"
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.902082 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b80fb56-0e00-47cc-85c5-b7118589b2f0-dns-svc\") pod \"dnsmasq-dns-9dd8568c7-sfmbx\" (UID: \"1b80fb56-0e00-47cc-85c5-b7118589b2f0\") " pod="openstack/dnsmasq-dns-9dd8568c7-sfmbx"
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.902146 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9pk4\" (UniqueName: \"kubernetes.io/projected/1b80fb56-0e00-47cc-85c5-b7118589b2f0-kube-api-access-z9pk4\") pod \"dnsmasq-dns-9dd8568c7-sfmbx\" (UID: \"1b80fb56-0e00-47cc-85c5-b7118589b2f0\") " pod="openstack/dnsmasq-dns-9dd8568c7-sfmbx"
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.903121 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b80fb56-0e00-47cc-85c5-b7118589b2f0-config\") pod \"dnsmasq-dns-9dd8568c7-sfmbx\" (UID: \"1b80fb56-0e00-47cc-85c5-b7118589b2f0\") " pod="openstack/dnsmasq-dns-9dd8568c7-sfmbx"
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.903310 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b80fb56-0e00-47cc-85c5-b7118589b2f0-ovsdbserver-nb\") pod \"dnsmasq-dns-9dd8568c7-sfmbx\" (UID: \"1b80fb56-0e00-47cc-85c5-b7118589b2f0\") " pod="openstack/dnsmasq-dns-9dd8568c7-sfmbx"
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.903417 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b80fb56-0e00-47cc-85c5-b7118589b2f0-dns-svc\") pod \"dnsmasq-dns-9dd8568c7-sfmbx\" (UID: \"1b80fb56-0e00-47cc-85c5-b7118589b2f0\") " pod="openstack/dnsmasq-dns-9dd8568c7-sfmbx"
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.937445 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9pk4\" (UniqueName: \"kubernetes.io/projected/1b80fb56-0e00-47cc-85c5-b7118589b2f0-kube-api-access-z9pk4\") pod \"dnsmasq-dns-9dd8568c7-sfmbx\" (UID: \"1b80fb56-0e00-47cc-85c5-b7118589b2f0\") " pod="openstack/dnsmasq-dns-9dd8568c7-sfmbx"
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.967746 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9dd8568c7-sfmbx"]
Oct 09 15:33:48 crc kubenswrapper[4719]: I1009 15:33:48.968487 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9dd8568c7-sfmbx"
Oct 09 15:33:49 crc kubenswrapper[4719]: I1009 15:33:49.009412 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5568d9d8bc-7jxwj"]
Oct 09 15:33:49 crc kubenswrapper[4719]: I1009 15:33:49.010772 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5568d9d8bc-7jxwj"
Oct 09 15:33:49 crc kubenswrapper[4719]: I1009 15:33:49.014103 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Oct 09 15:33:49 crc kubenswrapper[4719]: I1009 15:33:49.038333 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5568d9d8bc-7jxwj"]
Oct 09 15:33:49 crc kubenswrapper[4719]: I1009 15:33:49.042293 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d6d58d699-6vpbr"
Oct 09 15:33:49 crc kubenswrapper[4719]: I1009 15:33:49.105534 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b13a1a4-8f28-426c-951c-be1bbc5229bf-config\") pod \"0b13a1a4-8f28-426c-951c-be1bbc5229bf\" (UID: \"0b13a1a4-8f28-426c-951c-be1bbc5229bf\") "
Oct 09 15:33:49 crc kubenswrapper[4719]: I1009 15:33:49.105645 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48wrp\" (UniqueName: \"kubernetes.io/projected/0b13a1a4-8f28-426c-951c-be1bbc5229bf-kube-api-access-48wrp\") pod \"0b13a1a4-8f28-426c-951c-be1bbc5229bf\" (UID: \"0b13a1a4-8f28-426c-951c-be1bbc5229bf\") "
Oct 09 15:33:49 crc kubenswrapper[4719]: I1009 15:33:49.105721 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b13a1a4-8f28-426c-951c-be1bbc5229bf-dns-svc\") pod \"0b13a1a4-8f28-426c-951c-be1bbc5229bf\" (UID: \"0b13a1a4-8f28-426c-951c-be1bbc5229bf\") "
Oct 09 15:33:49 crc kubenswrapper[4719]: I1009 15:33:49.106029 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8a234cd-5325-4f81-8b64-0af5562bff5e-ovsdbserver-sb\") pod \"dnsmasq-dns-5568d9d8bc-7jxwj\" (UID: \"d8a234cd-5325-4f81-8b64-0af5562bff5e\") " pod="openstack/dnsmasq-dns-5568d9d8bc-7jxwj"
Oct 09 15:33:49 crc kubenswrapper[4719]: I1009 15:33:49.106057 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpn8z\" (UniqueName: \"kubernetes.io/projected/d8a234cd-5325-4f81-8b64-0af5562bff5e-kube-api-access-kpn8z\") pod \"dnsmasq-dns-5568d9d8bc-7jxwj\" (UID: \"d8a234cd-5325-4f81-8b64-0af5562bff5e\") " pod="openstack/dnsmasq-dns-5568d9d8bc-7jxwj"
Oct 09 15:33:49 crc kubenswrapper[4719]: I1009 15:33:49.106384 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8a234cd-5325-4f81-8b64-0af5562bff5e-dns-svc\") pod \"dnsmasq-dns-5568d9d8bc-7jxwj\" (UID: \"d8a234cd-5325-4f81-8b64-0af5562bff5e\") " pod="openstack/dnsmasq-dns-5568d9d8bc-7jxwj"
Oct 09 15:33:49 crc kubenswrapper[4719]: I1009 15:33:49.106804 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8a234cd-5325-4f81-8b64-0af5562bff5e-config\") pod \"dnsmasq-dns-5568d9d8bc-7jxwj\" (UID: \"d8a234cd-5325-4f81-8b64-0af5562bff5e\") " pod="openstack/dnsmasq-dns-5568d9d8bc-7jxwj"
Oct 09 15:33:49 crc kubenswrapper[4719]: I1009 15:33:49.106850 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8a234cd-5325-4f81-8b64-0af5562bff5e-ovsdbserver-nb\") pod \"dnsmasq-dns-5568d9d8bc-7jxwj\" (UID: \"d8a234cd-5325-4f81-8b64-0af5562bff5e\") " pod="openstack/dnsmasq-dns-5568d9d8bc-7jxwj"
Oct 09 15:33:49 crc kubenswrapper[4719]: I1009 15:33:49.111374 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b13a1a4-8f28-426c-951c-be1bbc5229bf-kube-api-access-48wrp" (OuterVolumeSpecName: "kube-api-access-48wrp") pod "0b13a1a4-8f28-426c-951c-be1bbc5229bf" (UID: "0b13a1a4-8f28-426c-951c-be1bbc5229bf"). InnerVolumeSpecName "kube-api-access-48wrp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 15:33:49 crc kubenswrapper[4719]: I1009 15:33:49.143540 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b13a1a4-8f28-426c-951c-be1bbc5229bf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0b13a1a4-8f28-426c-951c-be1bbc5229bf" (UID: "0b13a1a4-8f28-426c-951c-be1bbc5229bf"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 15:33:49 crc kubenswrapper[4719]: I1009 15:33:49.146044 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b13a1a4-8f28-426c-951c-be1bbc5229bf-config" (OuterVolumeSpecName: "config") pod "0b13a1a4-8f28-426c-951c-be1bbc5229bf" (UID: "0b13a1a4-8f28-426c-951c-be1bbc5229bf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 15:33:49 crc kubenswrapper[4719]: I1009 15:33:49.208805 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8a234cd-5325-4f81-8b64-0af5562bff5e-dns-svc\") pod \"dnsmasq-dns-5568d9d8bc-7jxwj\" (UID: \"d8a234cd-5325-4f81-8b64-0af5562bff5e\") " pod="openstack/dnsmasq-dns-5568d9d8bc-7jxwj"
Oct 09 15:33:49 crc kubenswrapper[4719]: I1009 15:33:49.208995 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8a234cd-5325-4f81-8b64-0af5562bff5e-config\") pod \"dnsmasq-dns-5568d9d8bc-7jxwj\" (UID: \"d8a234cd-5325-4f81-8b64-0af5562bff5e\") " pod="openstack/dnsmasq-dns-5568d9d8bc-7jxwj"
Oct 09 15:33:49 crc kubenswrapper[4719]: I1009 15:33:49.209025 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8a234cd-5325-4f81-8b64-0af5562bff5e-ovsdbserver-nb\") pod \"dnsmasq-dns-5568d9d8bc-7jxwj\" (UID: \"d8a234cd-5325-4f81-8b64-0af5562bff5e\") " pod="openstack/dnsmasq-dns-5568d9d8bc-7jxwj"
Oct 09 15:33:49 crc kubenswrapper[4719]: I1009 15:33:49.209065 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8a234cd-5325-4f81-8b64-0af5562bff5e-ovsdbserver-sb\") pod \"dnsmasq-dns-5568d9d8bc-7jxwj\" (UID: \"d8a234cd-5325-4f81-8b64-0af5562bff5e\") " pod="openstack/dnsmasq-dns-5568d9d8bc-7jxwj"
Oct 09 15:33:49 crc kubenswrapper[4719]: I1009 15:33:49.209090 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpn8z\" (UniqueName: \"kubernetes.io/projected/d8a234cd-5325-4f81-8b64-0af5562bff5e-kube-api-access-kpn8z\") pod \"dnsmasq-dns-5568d9d8bc-7jxwj\" (UID: \"d8a234cd-5325-4f81-8b64-0af5562bff5e\") " pod="openstack/dnsmasq-dns-5568d9d8bc-7jxwj"
Oct 09 15:33:49 crc kubenswrapper[4719]: I1009 15:33:49.209148 4719 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b13a1a4-8f28-426c-951c-be1bbc5229bf-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 09 15:33:49 crc kubenswrapper[4719]: I1009 15:33:49.209164 4719 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b13a1a4-8f28-426c-951c-be1bbc5229bf-config\") on node \"crc\" DevicePath \"\""
Oct 09 15:33:49 crc kubenswrapper[4719]: I1009 15:33:49.209177 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48wrp\" (UniqueName: \"kubernetes.io/projected/0b13a1a4-8f28-426c-951c-be1bbc5229bf-kube-api-access-48wrp\") on node \"crc\" DevicePath \"\""
Oct 09 15:33:49 crc kubenswrapper[4719]: I1009 15:33:49.210377 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8a234cd-5325-4f81-8b64-0af5562bff5e-config\") pod \"dnsmasq-dns-5568d9d8bc-7jxwj\" (UID: \"d8a234cd-5325-4f81-8b64-0af5562bff5e\") " pod="openstack/dnsmasq-dns-5568d9d8bc-7jxwj"
Oct 09 15:33:49 crc kubenswrapper[4719]: I1009 15:33:49.211194 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8a234cd-5325-4f81-8b64-0af5562bff5e-ovsdbserver-nb\") pod \"dnsmasq-dns-5568d9d8bc-7jxwj\" (UID: \"d8a234cd-5325-4f81-8b64-0af5562bff5e\") " pod="openstack/dnsmasq-dns-5568d9d8bc-7jxwj"
Oct 09 15:33:49 crc kubenswrapper[4719]: I1009 15:33:49.211399 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8a234cd-5325-4f81-8b64-0af5562bff5e-dns-svc\") pod \"dnsmasq-dns-5568d9d8bc-7jxwj\" (UID: \"d8a234cd-5325-4f81-8b64-0af5562bff5e\") " pod="openstack/dnsmasq-dns-5568d9d8bc-7jxwj"
Oct 09 15:33:49 crc kubenswrapper[4719]: I1009 15:33:49.211916 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8a234cd-5325-4f81-8b64-0af5562bff5e-ovsdbserver-sb\") pod \"dnsmasq-dns-5568d9d8bc-7jxwj\" (UID: \"d8a234cd-5325-4f81-8b64-0af5562bff5e\") " pod="openstack/dnsmasq-dns-5568d9d8bc-7jxwj"
Oct 09 15:33:49 crc kubenswrapper[4719]: I1009 15:33:49.231119 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpn8z\" (UniqueName: \"kubernetes.io/projected/d8a234cd-5325-4f81-8b64-0af5562bff5e-kube-api-access-kpn8z\") pod \"dnsmasq-dns-5568d9d8bc-7jxwj\" (UID: \"d8a234cd-5325-4f81-8b64-0af5562bff5e\") " pod="openstack/dnsmasq-dns-5568d9d8bc-7jxwj"
Oct 09 15:33:49 crc kubenswrapper[4719]: I1009 15:33:49.333227 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5568d9d8bc-7jxwj"
Oct 09 15:33:49 crc kubenswrapper[4719]: I1009 15:33:49.446848 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d6d58d699-6vpbr" event={"ID":"0b13a1a4-8f28-426c-951c-be1bbc5229bf","Type":"ContainerDied","Data":"b89a7d4c0a16893309ac8173603130c7ef6e82f9b98add265df8a5144c378ad9"}
Oct 09 15:33:49 crc kubenswrapper[4719]: I1009 15:33:49.446910 4719 scope.go:117] "RemoveContainer" containerID="fbc7d7dcfc4182fc0da2748b41f59ef8b49cd8acf2a30b839b8f818095757a88"
Oct 09 15:33:49 crc kubenswrapper[4719]: I1009 15:33:49.447061 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d6d58d699-6vpbr"
Oct 09 15:33:49 crc kubenswrapper[4719]: I1009 15:33:49.467601 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d6d58d699-6vpbr"]
Oct 09 15:33:49 crc kubenswrapper[4719]: I1009 15:33:49.474729 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d6d58d699-6vpbr"]
Oct 09 15:33:49 crc kubenswrapper[4719]: I1009 15:33:49.744790 4719 scope.go:117] "RemoveContainer" containerID="575fdf288adaef5c319c068c41ccd2bc62397031fd81f54262cfaa2e4711a955"
Oct 09 15:33:50 crc kubenswrapper[4719]: I1009 15:33:50.282406 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-skb56"]
Oct 09 15:33:50 crc kubenswrapper[4719]: W1009 15:33:50.338670 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6539f12_5508_4c6d_870a_d19815ba3120.slice/crio-caacd82621ed0313f877f6f212ca42ae5a2e801e463df9d29c84db3e8a5f3eac WatchSource:0}: Error finding container caacd82621ed0313f877f6f212ca42ae5a2e801e463df9d29c84db3e8a5f3eac: Status 404 returned error can't find the container with id caacd82621ed0313f877f6f212ca42ae5a2e801e463df9d29c84db3e8a5f3eac
Oct 09 15:33:50 crc kubenswrapper[4719]: I1009 15:33:50.448892 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9dd8568c7-sfmbx"]
Oct 09 15:33:50 crc kubenswrapper[4719]: W1009 15:33:50.474645 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b80fb56_0e00_47cc_85c5_b7118589b2f0.slice/crio-377dc39dd37544fdac7c275364a595df35dafe2b8206c2ed22612e4b8c5d5950 WatchSource:0}: Error finding container 377dc39dd37544fdac7c275364a595df35dafe2b8206c2ed22612e4b8c5d5950: Status 404 returned error can't find the container with id 377dc39dd37544fdac7c275364a595df35dafe2b8206c2ed22612e4b8c5d5950
Oct 09 15:33:50 crc kubenswrapper[4719]: I1009 15:33:50.485402 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-skb56" event={"ID":"a6539f12-5508-4c6d-870a-d19815ba3120","Type":"ContainerStarted","Data":"caacd82621ed0313f877f6f212ca42ae5a2e801e463df9d29c84db3e8a5f3eac"}
Oct 09 15:33:50 crc kubenswrapper[4719]: I1009 15:33:50.487378 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c7495027-5c56-46e2-9947-1ad2d6bcaf28","Type":"ContainerStarted","Data":"b9eb2355879ad8c7f4909517971444255d20435ee54b10757dff4d78eeb5fac1"}
Oct 09 15:33:50 crc kubenswrapper[4719]: I1009 15:33:50.488628 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Oct 09 15:33:50 crc kubenswrapper[4719]: I1009 15:33:50.507400 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"05ff8a95-a910-4095-930b-e42c575bf4b8","Type":"ContainerStarted","Data":"6d9aceec8c2ca7be7a970c0607762a12034a2ef43e5821c07b98ad71429699a2"}
Oct 09 15:33:50 crc kubenswrapper[4719]: I1009 15:33:50.526770 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"29d7fec9-be2c-4fa8-9191-5ffaf287f825","Type":"ContainerStarted","Data":"ceb7dc7f8eb37692c3d769996a7a2cc2f3133684f29b635028475c9b20efc357"}
Oct 09 15:33:50 crc kubenswrapper[4719]: I1009 15:33:50.534614 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5568d9d8bc-7jxwj"]
Oct 09 15:33:50 crc kubenswrapper[4719]: I1009 15:33:50.541251 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=13.347198208 podStartE2EDuration="20.541220323s" podCreationTimestamp="2025-10-09 15:33:30 +0000 UTC" firstStartedPulling="2025-10-09 15:33:40.976852008 +0000 UTC m=+926.486563293" lastFinishedPulling="2025-10-09 15:33:48.170874123 +0000 UTC m=+933.680585408" observedRunningTime="2025-10-09 15:33:50.523200642 +0000 UTC m=+936.032911927" watchObservedRunningTime="2025-10-09 15:33:50.541220323 +0000 UTC m=+936.050931608"
Oct 09 15:33:50 crc kubenswrapper[4719]: I1009 15:33:50.556514 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c","Type":"ContainerStarted","Data":"f66c35192a5c4996ffd6710cc4339fabfd2f3c8f226a0ae407d4691f7df29c30"}
Oct 09 15:33:51 crc kubenswrapper[4719]: I1009 15:33:51.183963 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b13a1a4-8f28-426c-951c-be1bbc5229bf" path="/var/lib/kubelet/pods/0b13a1a4-8f28-426c-951c-be1bbc5229bf/volumes"
Oct 09 15:33:51 crc kubenswrapper[4719]: I1009 15:33:51.572208 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6b96d4d6-d2f4-4e82-9a9e-4c95e6f5389a","Type":"ContainerStarted","Data":"ef9b73cf72dbe624f8649238f78b7a283967c35b1b38f59eb3ceb64324f2b069"}
Oct 09 15:33:51 crc kubenswrapper[4719]: I1009 15:33:51.572304 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Oct 09 15:33:51 crc kubenswrapper[4719]: I1009 15:33:51.578007 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-p4t6l" event={"ID":"f0151a18-0608-47b9-b58a-7eef9dfaf31b","Type":"ContainerStarted","Data":"6de2eece1d4c24221b19bc47b38f728be546c612bac0e6564a751e25c63a93bd"}
Oct 09 15:33:51 crc kubenswrapper[4719]: I1009 15:33:51.578163 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-p4t6l"
Oct 09 15:33:51 crc kubenswrapper[4719]: I1009 15:33:51.585713 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d6970b67-4ebd-401d-838b-8be92b8ba72f","Type":"ContainerStarted","Data":"1836628de450835f8d59572f50da26d2b05d8d8ef76dc945b91fdaa1d19e3504"}
Oct 09 15:33:51 crc kubenswrapper[4719]: I1009 15:33:51.590652 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=10.060214565 podStartE2EDuration="19.590628403s" podCreationTimestamp="2025-10-09 15:33:32 +0000 UTC" firstStartedPulling="2025-10-09 15:33:40.485104024 +0000 UTC m=+925.994815319" lastFinishedPulling="2025-10-09 15:33:50.015517872 +0000 UTC m=+935.525229157" observedRunningTime="2025-10-09 15:33:51.589760336 +0000 UTC m=+937.099471631" watchObservedRunningTime="2025-10-09 15:33:51.590628403 +0000 UTC m=+937.100339688"
Oct 09 15:33:51 crc kubenswrapper[4719]: I1009 15:33:51.595699 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hqfvq" event={"ID":"07b112ef-0e6a-4927-93e4-d5fc023e495f","Type":"ContainerStarted","Data":"348c945edbbaf3565b1ae90c1a0afc306f40ff11edac9c40ec49d75c06a445b7"}
Oct 09 15:33:51 crc kubenswrapper[4719]: I1009 15:33:51.605185 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5568d9d8bc-7jxwj" event={"ID":"d8a234cd-5325-4f81-8b64-0af5562bff5e","Type":"ContainerStarted","Data":"561f2cf8bcc34410f6a846f00eae6b5d221da76120eb127cb85836cc02a4df5c"}
Oct 09 15:33:51 crc kubenswrapper[4719]: I1009 15:33:51.611385 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d3a820d9-3c13-47ec-a39e-dea4d60b7536","Type":"ContainerStarted","Data":"72455dcfecc11206bc40520b8d088cd8e9d4106ebbf33631ff25928e9b1dc487"}
Oct 09 15:33:51 crc kubenswrapper[4719]: I1009 15:33:51.625147 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9dd8568c7-sfmbx" event={"ID":"1b80fb56-0e00-47cc-85c5-b7118589b2f0","Type":"ContainerStarted","Data":"377dc39dd37544fdac7c275364a595df35dafe2b8206c2ed22612e4b8c5d5950"}
Oct 09 15:33:51 crc kubenswrapper[4719]: I1009 15:33:51.630585 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c8f5a6f9-5554-485d-9aee-47449402e37b","Type":"ContainerStarted","Data":"df5f9951dcb79d436b3ba8bc9106853f92f66edecb4cb1855c4a6cc3746f4ea4"}
Oct 09 15:33:51 crc kubenswrapper[4719]: I1009 15:33:51.645051 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-p4t6l" podStartSLOduration=6.871917606 podStartE2EDuration="15.64502337s" podCreationTimestamp="2025-10-09 15:33:36 +0000 UTC" firstStartedPulling="2025-10-09 15:33:40.971878429 +0000 UTC m=+926.481589714" lastFinishedPulling="2025-10-09 15:33:49.744984193 +0000 UTC m=+935.254695478" observedRunningTime="2025-10-09 15:33:51.641931702 +0000 UTC m=+937.151643017" watchObservedRunningTime="2025-10-09 15:33:51.64502337 +0000 UTC m=+937.154734655"
Oct 09 15:33:52 crc kubenswrapper[4719]: I1009 15:33:52.640538 4719 generic.go:334] "Generic (PLEG): container finished" podID="d8a234cd-5325-4f81-8b64-0af5562bff5e" containerID="f169d03e68245ead0b553a51022662d277e3ef4f7ae3562060873febcde75294" exitCode=0
Oct 09 15:33:52 crc kubenswrapper[4719]: I1009 15:33:52.640817 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5568d9d8bc-7jxwj" event={"ID":"d8a234cd-5325-4f81-8b64-0af5562bff5e","Type":"ContainerDied","Data":"f169d03e68245ead0b553a51022662d277e3ef4f7ae3562060873febcde75294"}
Oct 09 15:33:52 crc kubenswrapper[4719]: I1009 15:33:52.644388 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"1df540c9-8b54-44a5-9c5d-03cf736ee67a","Type":"ContainerStarted","Data":"38009f663fbfc9c961727713c72827abf1225d73ea39016f1ab401dd44130afd"}
Oct 09 15:33:52 crc kubenswrapper[4719]: I1009 15:33:52.645771 4719 generic.go:334] "Generic (PLEG): container finished" podID="1b80fb56-0e00-47cc-85c5-b7118589b2f0" containerID="7bf658ee6ec5200cdfdcac31f2371fd90e5a70d6ede62e8b9a2e7360c340d160" exitCode=0
Oct 09 15:33:52 crc kubenswrapper[4719]: I1009 15:33:52.645842 4719 kubelet.go:2453]
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9dd8568c7-sfmbx" event={"ID":"1b80fb56-0e00-47cc-85c5-b7118589b2f0","Type":"ContainerDied","Data":"7bf658ee6ec5200cdfdcac31f2371fd90e5a70d6ede62e8b9a2e7360c340d160"} Oct 09 15:33:52 crc kubenswrapper[4719]: I1009 15:33:52.649778 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"95b09495-a75f-42db-ae2c-99ac6e46f039","Type":"ContainerStarted","Data":"c71605342389cbf78dd4673d88ba0c916c140e7ccdb2f4ba39746c61fdc45be9"} Oct 09 15:33:52 crc kubenswrapper[4719]: I1009 15:33:52.652894 4719 generic.go:334] "Generic (PLEG): container finished" podID="07b112ef-0e6a-4927-93e4-d5fc023e495f" containerID="348c945edbbaf3565b1ae90c1a0afc306f40ff11edac9c40ec49d75c06a445b7" exitCode=0 Oct 09 15:33:52 crc kubenswrapper[4719]: I1009 15:33:52.654296 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hqfvq" event={"ID":"07b112ef-0e6a-4927-93e4-d5fc023e495f","Type":"ContainerDied","Data":"348c945edbbaf3565b1ae90c1a0afc306f40ff11edac9c40ec49d75c06a445b7"} Oct 09 15:33:56 crc kubenswrapper[4719]: I1009 15:33:56.114305 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 09 15:34:00 crc kubenswrapper[4719]: I1009 15:34:00.477453 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9dd8568c7-sfmbx" Oct 09 15:34:00 crc kubenswrapper[4719]: I1009 15:34:00.541240 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b80fb56-0e00-47cc-85c5-b7118589b2f0-ovsdbserver-nb\") pod \"1b80fb56-0e00-47cc-85c5-b7118589b2f0\" (UID: \"1b80fb56-0e00-47cc-85c5-b7118589b2f0\") " Oct 09 15:34:00 crc kubenswrapper[4719]: I1009 15:34:00.541298 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9pk4\" (UniqueName: \"kubernetes.io/projected/1b80fb56-0e00-47cc-85c5-b7118589b2f0-kube-api-access-z9pk4\") pod \"1b80fb56-0e00-47cc-85c5-b7118589b2f0\" (UID: \"1b80fb56-0e00-47cc-85c5-b7118589b2f0\") " Oct 09 15:34:00 crc kubenswrapper[4719]: I1009 15:34:00.541338 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b80fb56-0e00-47cc-85c5-b7118589b2f0-dns-svc\") pod \"1b80fb56-0e00-47cc-85c5-b7118589b2f0\" (UID: \"1b80fb56-0e00-47cc-85c5-b7118589b2f0\") " Oct 09 15:34:00 crc kubenswrapper[4719]: I1009 15:34:00.541495 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b80fb56-0e00-47cc-85c5-b7118589b2f0-config\") pod \"1b80fb56-0e00-47cc-85c5-b7118589b2f0\" (UID: \"1b80fb56-0e00-47cc-85c5-b7118589b2f0\") " Oct 09 15:34:00 crc kubenswrapper[4719]: I1009 15:34:00.549100 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b80fb56-0e00-47cc-85c5-b7118589b2f0-kube-api-access-z9pk4" (OuterVolumeSpecName: "kube-api-access-z9pk4") pod "1b80fb56-0e00-47cc-85c5-b7118589b2f0" (UID: "1b80fb56-0e00-47cc-85c5-b7118589b2f0"). InnerVolumeSpecName "kube-api-access-z9pk4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:34:00 crc kubenswrapper[4719]: I1009 15:34:00.574567 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b80fb56-0e00-47cc-85c5-b7118589b2f0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1b80fb56-0e00-47cc-85c5-b7118589b2f0" (UID: "1b80fb56-0e00-47cc-85c5-b7118589b2f0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:34:00 crc kubenswrapper[4719]: I1009 15:34:00.575991 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b80fb56-0e00-47cc-85c5-b7118589b2f0-config" (OuterVolumeSpecName: "config") pod "1b80fb56-0e00-47cc-85c5-b7118589b2f0" (UID: "1b80fb56-0e00-47cc-85c5-b7118589b2f0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:34:00 crc kubenswrapper[4719]: I1009 15:34:00.587042 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b80fb56-0e00-47cc-85c5-b7118589b2f0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1b80fb56-0e00-47cc-85c5-b7118589b2f0" (UID: "1b80fb56-0e00-47cc-85c5-b7118589b2f0"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:34:00 crc kubenswrapper[4719]: I1009 15:34:00.643491 4719 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b80fb56-0e00-47cc-85c5-b7118589b2f0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 15:34:00 crc kubenswrapper[4719]: I1009 15:34:00.643529 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9pk4\" (UniqueName: \"kubernetes.io/projected/1b80fb56-0e00-47cc-85c5-b7118589b2f0-kube-api-access-z9pk4\") on node \"crc\" DevicePath \"\"" Oct 09 15:34:00 crc kubenswrapper[4719]: I1009 15:34:00.643544 4719 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b80fb56-0e00-47cc-85c5-b7118589b2f0-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 15:34:00 crc kubenswrapper[4719]: I1009 15:34:00.643556 4719 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b80fb56-0e00-47cc-85c5-b7118589b2f0-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:34:00 crc kubenswrapper[4719]: I1009 15:34:00.715995 4719 generic.go:334] "Generic (PLEG): container finished" podID="95b09495-a75f-42db-ae2c-99ac6e46f039" containerID="c71605342389cbf78dd4673d88ba0c916c140e7ccdb2f4ba39746c61fdc45be9" exitCode=0 Oct 09 15:34:00 crc kubenswrapper[4719]: I1009 15:34:00.716065 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"95b09495-a75f-42db-ae2c-99ac6e46f039","Type":"ContainerDied","Data":"c71605342389cbf78dd4673d88ba0c916c140e7ccdb2f4ba39746c61fdc45be9"} Oct 09 15:34:00 crc kubenswrapper[4719]: I1009 15:34:00.729482 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9dd8568c7-sfmbx" event={"ID":"1b80fb56-0e00-47cc-85c5-b7118589b2f0","Type":"ContainerDied","Data":"377dc39dd37544fdac7c275364a595df35dafe2b8206c2ed22612e4b8c5d5950"} Oct 09 
15:34:00 crc kubenswrapper[4719]: I1009 15:34:00.729558 4719 scope.go:117] "RemoveContainer" containerID="7bf658ee6ec5200cdfdcac31f2371fd90e5a70d6ede62e8b9a2e7360c340d160" Oct 09 15:34:00 crc kubenswrapper[4719]: I1009 15:34:00.729572 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9dd8568c7-sfmbx" Oct 09 15:34:00 crc kubenswrapper[4719]: I1009 15:34:00.809561 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9dd8568c7-sfmbx"] Oct 09 15:34:00 crc kubenswrapper[4719]: I1009 15:34:00.821559 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9dd8568c7-sfmbx"] Oct 09 15:34:01 crc kubenswrapper[4719]: I1009 15:34:01.174322 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b80fb56-0e00-47cc-85c5-b7118589b2f0" path="/var/lib/kubelet/pods/1b80fb56-0e00-47cc-85c5-b7118589b2f0/volumes" Oct 09 15:34:01 crc kubenswrapper[4719]: I1009 15:34:01.738880 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-skb56" event={"ID":"a6539f12-5508-4c6d-870a-d19815ba3120","Type":"ContainerStarted","Data":"baddbae9d310824dc60498f46c395026089aec36b886a125a0e817bbf98b5e07"} Oct 09 15:34:01 crc kubenswrapper[4719]: I1009 15:34:01.743077 4719 generic.go:334] "Generic (PLEG): container finished" podID="05ff8a95-a910-4095-930b-e42c575bf4b8" containerID="6d9aceec8c2ca7be7a970c0607762a12034a2ef43e5821c07b98ad71429699a2" exitCode=0 Oct 09 15:34:01 crc kubenswrapper[4719]: I1009 15:34:01.743170 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"05ff8a95-a910-4095-930b-e42c575bf4b8","Type":"ContainerDied","Data":"6d9aceec8c2ca7be7a970c0607762a12034a2ef43e5821c07b98ad71429699a2"} Oct 09 15:34:01 crc kubenswrapper[4719]: I1009 15:34:01.745508 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"29d7fec9-be2c-4fa8-9191-5ffaf287f825","Type":"ContainerStarted","Data":"9196d9ed66fc5c458eddddbbb296d8c53737396104bd43d59b093b749c040f4d"} Oct 09 15:34:01 crc kubenswrapper[4719]: I1009 15:34:01.747248 4719 generic.go:334] "Generic (PLEG): container finished" podID="d6970b67-4ebd-401d-838b-8be92b8ba72f" containerID="1836628de450835f8d59572f50da26d2b05d8d8ef76dc945b91fdaa1d19e3504" exitCode=0 Oct 09 15:34:01 crc kubenswrapper[4719]: I1009 15:34:01.747451 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d6970b67-4ebd-401d-838b-8be92b8ba72f","Type":"ContainerDied","Data":"1836628de450835f8d59572f50da26d2b05d8d8ef76dc945b91fdaa1d19e3504"} Oct 09 15:34:01 crc kubenswrapper[4719]: I1009 15:34:01.760224 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hqfvq" event={"ID":"07b112ef-0e6a-4927-93e4-d5fc023e495f","Type":"ContainerStarted","Data":"16c9418c62aef40f937250258b7c7b55f48b0698448dcce40d2e9b940fb42f02"} Oct 09 15:34:01 crc kubenswrapper[4719]: I1009 15:34:01.760273 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hqfvq" event={"ID":"07b112ef-0e6a-4927-93e4-d5fc023e495f","Type":"ContainerStarted","Data":"66d7279b5a890d54bce3a90f8e483bb751329c40eba4f6259cd0146e8ebf9a73"} Oct 09 15:34:01 crc kubenswrapper[4719]: I1009 15:34:01.760938 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-hqfvq" Oct 09 15:34:01 crc kubenswrapper[4719]: I1009 15:34:01.760972 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-hqfvq" Oct 09 15:34:01 crc kubenswrapper[4719]: I1009 15:34:01.772787 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-skb56" podStartSLOduration=3.056365579 podStartE2EDuration="13.772768424s" podCreationTimestamp="2025-10-09 15:33:48 +0000 UTC" 
firstStartedPulling="2025-10-09 15:33:50.34112032 +0000 UTC m=+935.850831605" lastFinishedPulling="2025-10-09 15:34:01.057523165 +0000 UTC m=+946.567234450" observedRunningTime="2025-10-09 15:34:01.772521047 +0000 UTC m=+947.282232332" watchObservedRunningTime="2025-10-09 15:34:01.772768424 +0000 UTC m=+947.282479709" Oct 09 15:34:01 crc kubenswrapper[4719]: I1009 15:34:01.773179 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c","Type":"ContainerStarted","Data":"2fe66684759514a32ca808a0d30322c846cb79a9ed1da45aebb0fa59250a800c"} Oct 09 15:34:01 crc kubenswrapper[4719]: I1009 15:34:01.779650 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5568d9d8bc-7jxwj" event={"ID":"d8a234cd-5325-4f81-8b64-0af5562bff5e","Type":"ContainerStarted","Data":"1647d952d1603085c0c951167634914b9c9b3f1b41faf5e2f18836a5cc0ebaaf"} Oct 09 15:34:01 crc kubenswrapper[4719]: I1009 15:34:01.780553 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5568d9d8bc-7jxwj" Oct 09 15:34:01 crc kubenswrapper[4719]: I1009 15:34:01.862767 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=4.888474442 podStartE2EDuration="24.862674068s" podCreationTimestamp="2025-10-09 15:33:37 +0000 UTC" firstStartedPulling="2025-10-09 15:33:41.012560761 +0000 UTC m=+926.522272046" lastFinishedPulling="2025-10-09 15:34:00.986760387 +0000 UTC m=+946.496471672" observedRunningTime="2025-10-09 15:34:01.825155108 +0000 UTC m=+947.334866413" watchObservedRunningTime="2025-10-09 15:34:01.862674068 +0000 UTC m=+947.372385373" Oct 09 15:34:01 crc kubenswrapper[4719]: I1009 15:34:01.928469 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-hqfvq" podStartSLOduration=18.275465409 podStartE2EDuration="25.928446837s" 
podCreationTimestamp="2025-10-09 15:33:36 +0000 UTC" firstStartedPulling="2025-10-09 15:33:40.611019192 +0000 UTC m=+926.120730477" lastFinishedPulling="2025-10-09 15:33:48.26400061 +0000 UTC m=+933.773711905" observedRunningTime="2025-10-09 15:34:01.892702683 +0000 UTC m=+947.402413968" watchObservedRunningTime="2025-10-09 15:34:01.928446837 +0000 UTC m=+947.438158122" Oct 09 15:34:01 crc kubenswrapper[4719]: I1009 15:34:01.957755 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5568d9d8bc-7jxwj" podStartSLOduration=13.957737187 podStartE2EDuration="13.957737187s" podCreationTimestamp="2025-10-09 15:33:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:34:01.94903799 +0000 UTC m=+947.458749285" watchObservedRunningTime="2025-10-09 15:34:01.957737187 +0000 UTC m=+947.467448472" Oct 09 15:34:02 crc kubenswrapper[4719]: I1009 15:34:02.794173 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"05ff8a95-a910-4095-930b-e42c575bf4b8","Type":"ContainerStarted","Data":"a530acda5e96810126bdc7eccdfc258ffe81d47f4c94dec43ae9bbcf874b63d2"} Oct 09 15:34:02 crc kubenswrapper[4719]: I1009 15:34:02.828670 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 09 15:34:02 crc kubenswrapper[4719]: I1009 15:34:02.831587 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d6970b67-4ebd-401d-838b-8be92b8ba72f","Type":"ContainerStarted","Data":"4538879b7a38c43a0078274ccf0892056e0e075d2fe86a241cad8799300f8dff"} Oct 09 15:34:02 crc kubenswrapper[4719]: I1009 15:34:02.835927 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.911565967 podStartE2EDuration="24.83590329s" 
podCreationTimestamp="2025-10-09 15:33:38 +0000 UTC" firstStartedPulling="2025-10-09 15:33:41.106972289 +0000 UTC m=+926.616683574" lastFinishedPulling="2025-10-09 15:34:01.031309622 +0000 UTC m=+946.541020897" observedRunningTime="2025-10-09 15:34:01.974786298 +0000 UTC m=+947.484497593" watchObservedRunningTime="2025-10-09 15:34:02.83590329 +0000 UTC m=+948.345614575" Oct 09 15:34:02 crc kubenswrapper[4719]: I1009 15:34:02.836491 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=25.848726401 podStartE2EDuration="33.836484377s" podCreationTimestamp="2025-10-09 15:33:29 +0000 UTC" firstStartedPulling="2025-10-09 15:33:40.974230115 +0000 UTC m=+926.483941400" lastFinishedPulling="2025-10-09 15:33:48.961988091 +0000 UTC m=+934.471699376" observedRunningTime="2025-10-09 15:34:02.828667739 +0000 UTC m=+948.338379024" watchObservedRunningTime="2025-10-09 15:34:02.836484377 +0000 UTC m=+948.346195682" Oct 09 15:34:02 crc kubenswrapper[4719]: I1009 15:34:02.887306 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=25.81466754 podStartE2EDuration="33.887285211s" podCreationTimestamp="2025-10-09 15:33:29 +0000 UTC" firstStartedPulling="2025-10-09 15:33:40.484558467 +0000 UTC m=+925.994269752" lastFinishedPulling="2025-10-09 15:33:48.557176138 +0000 UTC m=+934.066887423" observedRunningTime="2025-10-09 15:34:02.885610158 +0000 UTC m=+948.395321473" watchObservedRunningTime="2025-10-09 15:34:02.887285211 +0000 UTC m=+948.396996506" Oct 09 15:34:02 crc kubenswrapper[4719]: I1009 15:34:02.958163 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5568d9d8bc-7jxwj"] Oct 09 15:34:02 crc kubenswrapper[4719]: I1009 15:34:02.988114 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58866ff6f5-kd5d5"] Oct 09 15:34:02 crc kubenswrapper[4719]: E1009 15:34:02.988536 4719 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b13a1a4-8f28-426c-951c-be1bbc5229bf" containerName="init" Oct 09 15:34:02 crc kubenswrapper[4719]: I1009 15:34:02.988555 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b13a1a4-8f28-426c-951c-be1bbc5229bf" containerName="init" Oct 09 15:34:02 crc kubenswrapper[4719]: E1009 15:34:02.988571 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b13a1a4-8f28-426c-951c-be1bbc5229bf" containerName="dnsmasq-dns" Oct 09 15:34:02 crc kubenswrapper[4719]: I1009 15:34:02.988578 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b13a1a4-8f28-426c-951c-be1bbc5229bf" containerName="dnsmasq-dns" Oct 09 15:34:02 crc kubenswrapper[4719]: E1009 15:34:02.988586 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b80fb56-0e00-47cc-85c5-b7118589b2f0" containerName="init" Oct 09 15:34:02 crc kubenswrapper[4719]: I1009 15:34:02.988592 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b80fb56-0e00-47cc-85c5-b7118589b2f0" containerName="init" Oct 09 15:34:02 crc kubenswrapper[4719]: I1009 15:34:02.988780 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b80fb56-0e00-47cc-85c5-b7118589b2f0" containerName="init" Oct 09 15:34:02 crc kubenswrapper[4719]: I1009 15:34:02.988819 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b13a1a4-8f28-426c-951c-be1bbc5229bf" containerName="dnsmasq-dns" Oct 09 15:34:02 crc kubenswrapper[4719]: I1009 15:34:02.989785 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58866ff6f5-kd5d5" Oct 09 15:34:03 crc kubenswrapper[4719]: I1009 15:34:03.030214 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58866ff6f5-kd5d5"] Oct 09 15:34:03 crc kubenswrapper[4719]: I1009 15:34:03.082396 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0ebf29d-d040-4829-912f-e8bd998cf0da-ovsdbserver-nb\") pod \"dnsmasq-dns-58866ff6f5-kd5d5\" (UID: \"b0ebf29d-d040-4829-912f-e8bd998cf0da\") " pod="openstack/dnsmasq-dns-58866ff6f5-kd5d5" Oct 09 15:34:03 crc kubenswrapper[4719]: I1009 15:34:03.082450 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0ebf29d-d040-4829-912f-e8bd998cf0da-dns-svc\") pod \"dnsmasq-dns-58866ff6f5-kd5d5\" (UID: \"b0ebf29d-d040-4829-912f-e8bd998cf0da\") " pod="openstack/dnsmasq-dns-58866ff6f5-kd5d5" Oct 09 15:34:03 crc kubenswrapper[4719]: I1009 15:34:03.082481 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcv59\" (UniqueName: \"kubernetes.io/projected/b0ebf29d-d040-4829-912f-e8bd998cf0da-kube-api-access-pcv59\") pod \"dnsmasq-dns-58866ff6f5-kd5d5\" (UID: \"b0ebf29d-d040-4829-912f-e8bd998cf0da\") " pod="openstack/dnsmasq-dns-58866ff6f5-kd5d5" Oct 09 15:34:03 crc kubenswrapper[4719]: I1009 15:34:03.082513 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0ebf29d-d040-4829-912f-e8bd998cf0da-ovsdbserver-sb\") pod \"dnsmasq-dns-58866ff6f5-kd5d5\" (UID: \"b0ebf29d-d040-4829-912f-e8bd998cf0da\") " pod="openstack/dnsmasq-dns-58866ff6f5-kd5d5" Oct 09 15:34:03 crc kubenswrapper[4719]: I1009 15:34:03.082582 4719 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0ebf29d-d040-4829-912f-e8bd998cf0da-config\") pod \"dnsmasq-dns-58866ff6f5-kd5d5\" (UID: \"b0ebf29d-d040-4829-912f-e8bd998cf0da\") " pod="openstack/dnsmasq-dns-58866ff6f5-kd5d5" Oct 09 15:34:03 crc kubenswrapper[4719]: I1009 15:34:03.138765 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 09 15:34:03 crc kubenswrapper[4719]: I1009 15:34:03.183464 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0ebf29d-d040-4829-912f-e8bd998cf0da-config\") pod \"dnsmasq-dns-58866ff6f5-kd5d5\" (UID: \"b0ebf29d-d040-4829-912f-e8bd998cf0da\") " pod="openstack/dnsmasq-dns-58866ff6f5-kd5d5" Oct 09 15:34:03 crc kubenswrapper[4719]: I1009 15:34:03.183545 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0ebf29d-d040-4829-912f-e8bd998cf0da-ovsdbserver-nb\") pod \"dnsmasq-dns-58866ff6f5-kd5d5\" (UID: \"b0ebf29d-d040-4829-912f-e8bd998cf0da\") " pod="openstack/dnsmasq-dns-58866ff6f5-kd5d5" Oct 09 15:34:03 crc kubenswrapper[4719]: I1009 15:34:03.183565 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0ebf29d-d040-4829-912f-e8bd998cf0da-dns-svc\") pod \"dnsmasq-dns-58866ff6f5-kd5d5\" (UID: \"b0ebf29d-d040-4829-912f-e8bd998cf0da\") " pod="openstack/dnsmasq-dns-58866ff6f5-kd5d5" Oct 09 15:34:03 crc kubenswrapper[4719]: I1009 15:34:03.183585 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcv59\" (UniqueName: \"kubernetes.io/projected/b0ebf29d-d040-4829-912f-e8bd998cf0da-kube-api-access-pcv59\") pod \"dnsmasq-dns-58866ff6f5-kd5d5\" (UID: \"b0ebf29d-d040-4829-912f-e8bd998cf0da\") " 
pod="openstack/dnsmasq-dns-58866ff6f5-kd5d5" Oct 09 15:34:03 crc kubenswrapper[4719]: I1009 15:34:03.183612 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0ebf29d-d040-4829-912f-e8bd998cf0da-ovsdbserver-sb\") pod \"dnsmasq-dns-58866ff6f5-kd5d5\" (UID: \"b0ebf29d-d040-4829-912f-e8bd998cf0da\") " pod="openstack/dnsmasq-dns-58866ff6f5-kd5d5" Oct 09 15:34:03 crc kubenswrapper[4719]: I1009 15:34:03.184887 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0ebf29d-d040-4829-912f-e8bd998cf0da-dns-svc\") pod \"dnsmasq-dns-58866ff6f5-kd5d5\" (UID: \"b0ebf29d-d040-4829-912f-e8bd998cf0da\") " pod="openstack/dnsmasq-dns-58866ff6f5-kd5d5" Oct 09 15:34:03 crc kubenswrapper[4719]: I1009 15:34:03.184979 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0ebf29d-d040-4829-912f-e8bd998cf0da-ovsdbserver-sb\") pod \"dnsmasq-dns-58866ff6f5-kd5d5\" (UID: \"b0ebf29d-d040-4829-912f-e8bd998cf0da\") " pod="openstack/dnsmasq-dns-58866ff6f5-kd5d5" Oct 09 15:34:03 crc kubenswrapper[4719]: I1009 15:34:03.185029 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0ebf29d-d040-4829-912f-e8bd998cf0da-ovsdbserver-nb\") pod \"dnsmasq-dns-58866ff6f5-kd5d5\" (UID: \"b0ebf29d-d040-4829-912f-e8bd998cf0da\") " pod="openstack/dnsmasq-dns-58866ff6f5-kd5d5" Oct 09 15:34:03 crc kubenswrapper[4719]: I1009 15:34:03.185069 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0ebf29d-d040-4829-912f-e8bd998cf0da-config\") pod \"dnsmasq-dns-58866ff6f5-kd5d5\" (UID: \"b0ebf29d-d040-4829-912f-e8bd998cf0da\") " pod="openstack/dnsmasq-dns-58866ff6f5-kd5d5" Oct 09 15:34:03 crc kubenswrapper[4719]: I1009 15:34:03.189070 4719 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 09 15:34:03 crc kubenswrapper[4719]: I1009 15:34:03.204786 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcv59\" (UniqueName: \"kubernetes.io/projected/b0ebf29d-d040-4829-912f-e8bd998cf0da-kube-api-access-pcv59\") pod \"dnsmasq-dns-58866ff6f5-kd5d5\" (UID: \"b0ebf29d-d040-4829-912f-e8bd998cf0da\") " pod="openstack/dnsmasq-dns-58866ff6f5-kd5d5" Oct 09 15:34:03 crc kubenswrapper[4719]: I1009 15:34:03.317413 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58866ff6f5-kd5d5" Oct 09 15:34:03 crc kubenswrapper[4719]: I1009 15:34:03.820238 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58866ff6f5-kd5d5"] Oct 09 15:34:03 crc kubenswrapper[4719]: W1009 15:34:03.832763 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0ebf29d_d040_4829_912f_e8bd998cf0da.slice/crio-57d522a6800e2de7394201b9e5041ca3f1d6e6e5ff0cba5d678bf00bc4216545 WatchSource:0}: Error finding container 57d522a6800e2de7394201b9e5041ca3f1d6e6e5ff0cba5d678bf00bc4216545: Status 404 returned error can't find the container with id 57d522a6800e2de7394201b9e5041ca3f1d6e6e5ff0cba5d678bf00bc4216545 Oct 09 15:34:03 crc kubenswrapper[4719]: I1009 15:34:03.844341 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 09 15:34:03 crc kubenswrapper[4719]: I1009 15:34:03.887763 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.017558 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.041665 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.045730 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.045984 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.046088 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.046205 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-2zzrv" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.083356 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.104851 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0358ac72-8c58-4e63-843e-b9eaa35aefdf-etc-swift\") pod \"swift-storage-0\" (UID: \"0358ac72-8c58-4e63-843e-b9eaa35aefdf\") " pod="openstack/swift-storage-0" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.104987 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m59xm\" (UniqueName: \"kubernetes.io/projected/0358ac72-8c58-4e63-843e-b9eaa35aefdf-kube-api-access-m59xm\") pod \"swift-storage-0\" (UID: \"0358ac72-8c58-4e63-843e-b9eaa35aefdf\") " pod="openstack/swift-storage-0" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.105056 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"0358ac72-8c58-4e63-843e-b9eaa35aefdf\") " pod="openstack/swift-storage-0" 
Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.105083 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/0358ac72-8c58-4e63-843e-b9eaa35aefdf-cache\") pod \"swift-storage-0\" (UID: \"0358ac72-8c58-4e63-843e-b9eaa35aefdf\") " pod="openstack/swift-storage-0" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.105110 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/0358ac72-8c58-4e63-843e-b9eaa35aefdf-lock\") pod \"swift-storage-0\" (UID: \"0358ac72-8c58-4e63-843e-b9eaa35aefdf\") " pod="openstack/swift-storage-0" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.200593 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.206320 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/0358ac72-8c58-4e63-843e-b9eaa35aefdf-lock\") pod \"swift-storage-0\" (UID: \"0358ac72-8c58-4e63-843e-b9eaa35aefdf\") " pod="openstack/swift-storage-0" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.206406 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0358ac72-8c58-4e63-843e-b9eaa35aefdf-etc-swift\") pod \"swift-storage-0\" (UID: \"0358ac72-8c58-4e63-843e-b9eaa35aefdf\") " pod="openstack/swift-storage-0" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.206479 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m59xm\" (UniqueName: \"kubernetes.io/projected/0358ac72-8c58-4e63-843e-b9eaa35aefdf-kube-api-access-m59xm\") pod \"swift-storage-0\" (UID: \"0358ac72-8c58-4e63-843e-b9eaa35aefdf\") " pod="openstack/swift-storage-0" Oct 09 15:34:04 crc 
kubenswrapper[4719]: I1009 15:34:04.206551 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"0358ac72-8c58-4e63-843e-b9eaa35aefdf\") " pod="openstack/swift-storage-0" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.206578 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/0358ac72-8c58-4e63-843e-b9eaa35aefdf-cache\") pod \"swift-storage-0\" (UID: \"0358ac72-8c58-4e63-843e-b9eaa35aefdf\") " pod="openstack/swift-storage-0" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.206951 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/0358ac72-8c58-4e63-843e-b9eaa35aefdf-lock\") pod \"swift-storage-0\" (UID: \"0358ac72-8c58-4e63-843e-b9eaa35aefdf\") " pod="openstack/swift-storage-0" Oct 09 15:34:04 crc kubenswrapper[4719]: E1009 15:34:04.208269 4719 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 09 15:34:04 crc kubenswrapper[4719]: E1009 15:34:04.208288 4719 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 09 15:34:04 crc kubenswrapper[4719]: E1009 15:34:04.208321 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0358ac72-8c58-4e63-843e-b9eaa35aefdf-etc-swift podName:0358ac72-8c58-4e63-843e-b9eaa35aefdf nodeName:}" failed. No retries permitted until 2025-10-09 15:34:04.708307024 +0000 UTC m=+950.218018309 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/0358ac72-8c58-4e63-843e-b9eaa35aefdf-etc-swift") pod "swift-storage-0" (UID: "0358ac72-8c58-4e63-843e-b9eaa35aefdf") : configmap "swift-ring-files" not found Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.208745 4719 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"0358ac72-8c58-4e63-843e-b9eaa35aefdf\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/swift-storage-0" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.211569 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/0358ac72-8c58-4e63-843e-b9eaa35aefdf-cache\") pod \"swift-storage-0\" (UID: \"0358ac72-8c58-4e63-843e-b9eaa35aefdf\") " pod="openstack/swift-storage-0" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.272988 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.276859 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"0358ac72-8c58-4e63-843e-b9eaa35aefdf\") " pod="openstack/swift-storage-0" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.288497 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m59xm\" (UniqueName: \"kubernetes.io/projected/0358ac72-8c58-4e63-843e-b9eaa35aefdf-kube-api-access-m59xm\") pod \"swift-storage-0\" (UID: \"0358ac72-8c58-4e63-843e-b9eaa35aefdf\") " pod="openstack/swift-storage-0" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.562461 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-dd7hf"] Oct 
09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.563591 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-dd7hf" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.566229 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.566282 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.570461 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.576739 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-dd7hf"] Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.718183 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5685c463-d342-436a-a619-f809a2559691-swiftconf\") pod \"swift-ring-rebalance-dd7hf\" (UID: \"5685c463-d342-436a-a619-f809a2559691\") " pod="openstack/swift-ring-rebalance-dd7hf" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.718229 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5685c463-d342-436a-a619-f809a2559691-ring-data-devices\") pod \"swift-ring-rebalance-dd7hf\" (UID: \"5685c463-d342-436a-a619-f809a2559691\") " pod="openstack/swift-ring-rebalance-dd7hf" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.718281 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-577jl\" (UniqueName: \"kubernetes.io/projected/5685c463-d342-436a-a619-f809a2559691-kube-api-access-577jl\") pod \"swift-ring-rebalance-dd7hf\" (UID: 
\"5685c463-d342-436a-a619-f809a2559691\") " pod="openstack/swift-ring-rebalance-dd7hf" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.718359 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5685c463-d342-436a-a619-f809a2559691-combined-ca-bundle\") pod \"swift-ring-rebalance-dd7hf\" (UID: \"5685c463-d342-436a-a619-f809a2559691\") " pod="openstack/swift-ring-rebalance-dd7hf" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.718595 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0358ac72-8c58-4e63-843e-b9eaa35aefdf-etc-swift\") pod \"swift-storage-0\" (UID: \"0358ac72-8c58-4e63-843e-b9eaa35aefdf\") " pod="openstack/swift-storage-0" Oct 09 15:34:04 crc kubenswrapper[4719]: E1009 15:34:04.718875 4719 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 09 15:34:04 crc kubenswrapper[4719]: E1009 15:34:04.718910 4719 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 09 15:34:04 crc kubenswrapper[4719]: E1009 15:34:04.718975 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0358ac72-8c58-4e63-843e-b9eaa35aefdf-etc-swift podName:0358ac72-8c58-4e63-843e-b9eaa35aefdf nodeName:}" failed. No retries permitted until 2025-10-09 15:34:05.718953238 +0000 UTC m=+951.228664523 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/0358ac72-8c58-4e63-843e-b9eaa35aefdf-etc-swift") pod "swift-storage-0" (UID: "0358ac72-8c58-4e63-843e-b9eaa35aefdf") : configmap "swift-ring-files" not found Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.718882 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5685c463-d342-436a-a619-f809a2559691-etc-swift\") pod \"swift-ring-rebalance-dd7hf\" (UID: \"5685c463-d342-436a-a619-f809a2559691\") " pod="openstack/swift-ring-rebalance-dd7hf" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.719071 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5685c463-d342-436a-a619-f809a2559691-scripts\") pod \"swift-ring-rebalance-dd7hf\" (UID: \"5685c463-d342-436a-a619-f809a2559691\") " pod="openstack/swift-ring-rebalance-dd7hf" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.719177 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5685c463-d342-436a-a619-f809a2559691-dispersionconf\") pod \"swift-ring-rebalance-dd7hf\" (UID: \"5685c463-d342-436a-a619-f809a2559691\") " pod="openstack/swift-ring-rebalance-dd7hf" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.821271 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-577jl\" (UniqueName: \"kubernetes.io/projected/5685c463-d342-436a-a619-f809a2559691-kube-api-access-577jl\") pod \"swift-ring-rebalance-dd7hf\" (UID: \"5685c463-d342-436a-a619-f809a2559691\") " pod="openstack/swift-ring-rebalance-dd7hf" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.821390 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5685c463-d342-436a-a619-f809a2559691-combined-ca-bundle\") pod \"swift-ring-rebalance-dd7hf\" (UID: \"5685c463-d342-436a-a619-f809a2559691\") " pod="openstack/swift-ring-rebalance-dd7hf" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.821485 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5685c463-d342-436a-a619-f809a2559691-etc-swift\") pod \"swift-ring-rebalance-dd7hf\" (UID: \"5685c463-d342-436a-a619-f809a2559691\") " pod="openstack/swift-ring-rebalance-dd7hf" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.821509 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5685c463-d342-436a-a619-f809a2559691-scripts\") pod \"swift-ring-rebalance-dd7hf\" (UID: \"5685c463-d342-436a-a619-f809a2559691\") " pod="openstack/swift-ring-rebalance-dd7hf" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.821540 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5685c463-d342-436a-a619-f809a2559691-dispersionconf\") pod \"swift-ring-rebalance-dd7hf\" (UID: \"5685c463-d342-436a-a619-f809a2559691\") " pod="openstack/swift-ring-rebalance-dd7hf" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.821597 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5685c463-d342-436a-a619-f809a2559691-swiftconf\") pod \"swift-ring-rebalance-dd7hf\" (UID: \"5685c463-d342-436a-a619-f809a2559691\") " pod="openstack/swift-ring-rebalance-dd7hf" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.821616 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/5685c463-d342-436a-a619-f809a2559691-ring-data-devices\") pod \"swift-ring-rebalance-dd7hf\" (UID: \"5685c463-d342-436a-a619-f809a2559691\") " pod="openstack/swift-ring-rebalance-dd7hf" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.822349 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5685c463-d342-436a-a619-f809a2559691-etc-swift\") pod \"swift-ring-rebalance-dd7hf\" (UID: \"5685c463-d342-436a-a619-f809a2559691\") " pod="openstack/swift-ring-rebalance-dd7hf" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.822679 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5685c463-d342-436a-a619-f809a2559691-scripts\") pod \"swift-ring-rebalance-dd7hf\" (UID: \"5685c463-d342-436a-a619-f809a2559691\") " pod="openstack/swift-ring-rebalance-dd7hf" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.822770 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5685c463-d342-436a-a619-f809a2559691-ring-data-devices\") pod \"swift-ring-rebalance-dd7hf\" (UID: \"5685c463-d342-436a-a619-f809a2559691\") " pod="openstack/swift-ring-rebalance-dd7hf" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.828365 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5685c463-d342-436a-a619-f809a2559691-dispersionconf\") pod \"swift-ring-rebalance-dd7hf\" (UID: \"5685c463-d342-436a-a619-f809a2559691\") " pod="openstack/swift-ring-rebalance-dd7hf" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.828437 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5685c463-d342-436a-a619-f809a2559691-combined-ca-bundle\") pod \"swift-ring-rebalance-dd7hf\" (UID: 
\"5685c463-d342-436a-a619-f809a2559691\") " pod="openstack/swift-ring-rebalance-dd7hf" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.839890 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5685c463-d342-436a-a619-f809a2559691-swiftconf\") pod \"swift-ring-rebalance-dd7hf\" (UID: \"5685c463-d342-436a-a619-f809a2559691\") " pod="openstack/swift-ring-rebalance-dd7hf" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.852491 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-577jl\" (UniqueName: \"kubernetes.io/projected/5685c463-d342-436a-a619-f809a2559691-kube-api-access-577jl\") pod \"swift-ring-rebalance-dd7hf\" (UID: \"5685c463-d342-436a-a619-f809a2559691\") " pod="openstack/swift-ring-rebalance-dd7hf" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.859585 4719 generic.go:334] "Generic (PLEG): container finished" podID="b0ebf29d-d040-4829-912f-e8bd998cf0da" containerID="988845eef7edf5550c80887bed9783830e1acd5e471e50189a272b90f65a33a4" exitCode=0 Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.859705 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58866ff6f5-kd5d5" event={"ID":"b0ebf29d-d040-4829-912f-e8bd998cf0da","Type":"ContainerDied","Data":"988845eef7edf5550c80887bed9783830e1acd5e471e50189a272b90f65a33a4"} Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.859747 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58866ff6f5-kd5d5" event={"ID":"b0ebf29d-d040-4829-912f-e8bd998cf0da","Type":"ContainerStarted","Data":"57d522a6800e2de7394201b9e5041ca3f1d6e6e5ff0cba5d678bf00bc4216545"} Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.861006 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.861263 4719 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/dnsmasq-dns-5568d9d8bc-7jxwj" podUID="d8a234cd-5325-4f81-8b64-0af5562bff5e" containerName="dnsmasq-dns" containerID="cri-o://1647d952d1603085c0c951167634914b9c9b3f1b41faf5e2f18836a5cc0ebaaf" gracePeriod=10 Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.881463 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-dd7hf" Oct 09 15:34:04 crc kubenswrapper[4719]: I1009 15:34:04.932618 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.160022 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.167421 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.174579 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.174785 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.175501 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.176238 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-j4t2h" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.191847 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.238515 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4276fa06-e8dc-40e0-8276-eaf58420e0ca-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"4276fa06-e8dc-40e0-8276-eaf58420e0ca\") " pod="openstack/ovn-northd-0" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.238599 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4276fa06-e8dc-40e0-8276-eaf58420e0ca-config\") pod \"ovn-northd-0\" (UID: \"4276fa06-e8dc-40e0-8276-eaf58420e0ca\") " pod="openstack/ovn-northd-0" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.238632 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6pwd\" (UniqueName: \"kubernetes.io/projected/4276fa06-e8dc-40e0-8276-eaf58420e0ca-kube-api-access-c6pwd\") pod \"ovn-northd-0\" (UID: \"4276fa06-e8dc-40e0-8276-eaf58420e0ca\") " pod="openstack/ovn-northd-0" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.238675 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4276fa06-e8dc-40e0-8276-eaf58420e0ca-scripts\") pod \"ovn-northd-0\" (UID: \"4276fa06-e8dc-40e0-8276-eaf58420e0ca\") " pod="openstack/ovn-northd-0" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.238709 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4276fa06-e8dc-40e0-8276-eaf58420e0ca-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"4276fa06-e8dc-40e0-8276-eaf58420e0ca\") " pod="openstack/ovn-northd-0" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.238792 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4276fa06-e8dc-40e0-8276-eaf58420e0ca-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"4276fa06-e8dc-40e0-8276-eaf58420e0ca\") " pod="openstack/ovn-northd-0" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.238831 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4276fa06-e8dc-40e0-8276-eaf58420e0ca-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"4276fa06-e8dc-40e0-8276-eaf58420e0ca\") " pod="openstack/ovn-northd-0" Oct 09 15:34:05 crc kubenswrapper[4719]: E1009 15:34:05.332897 4719 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.59:53928->38.102.83.59:45053: read tcp 38.102.83.59:53928->38.102.83.59:45053: read: connection reset by peer Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.340315 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/4276fa06-e8dc-40e0-8276-eaf58420e0ca-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"4276fa06-e8dc-40e0-8276-eaf58420e0ca\") " pod="openstack/ovn-northd-0" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.340406 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4276fa06-e8dc-40e0-8276-eaf58420e0ca-config\") pod \"ovn-northd-0\" (UID: \"4276fa06-e8dc-40e0-8276-eaf58420e0ca\") " pod="openstack/ovn-northd-0" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.340439 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6pwd\" (UniqueName: \"kubernetes.io/projected/4276fa06-e8dc-40e0-8276-eaf58420e0ca-kube-api-access-c6pwd\") pod \"ovn-northd-0\" (UID: \"4276fa06-e8dc-40e0-8276-eaf58420e0ca\") " pod="openstack/ovn-northd-0" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.341819 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/4276fa06-e8dc-40e0-8276-eaf58420e0ca-scripts\") pod \"ovn-northd-0\" (UID: \"4276fa06-e8dc-40e0-8276-eaf58420e0ca\") " pod="openstack/ovn-northd-0" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.341874 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4276fa06-e8dc-40e0-8276-eaf58420e0ca-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"4276fa06-e8dc-40e0-8276-eaf58420e0ca\") " pod="openstack/ovn-northd-0" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.341964 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4276fa06-e8dc-40e0-8276-eaf58420e0ca-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"4276fa06-e8dc-40e0-8276-eaf58420e0ca\") " pod="openstack/ovn-northd-0" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.341988 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4276fa06-e8dc-40e0-8276-eaf58420e0ca-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"4276fa06-e8dc-40e0-8276-eaf58420e0ca\") " pod="openstack/ovn-northd-0" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.342564 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4276fa06-e8dc-40e0-8276-eaf58420e0ca-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"4276fa06-e8dc-40e0-8276-eaf58420e0ca\") " pod="openstack/ovn-northd-0" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.343887 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4276fa06-e8dc-40e0-8276-eaf58420e0ca-scripts\") pod \"ovn-northd-0\" (UID: \"4276fa06-e8dc-40e0-8276-eaf58420e0ca\") " pod="openstack/ovn-northd-0" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.344512 
4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4276fa06-e8dc-40e0-8276-eaf58420e0ca-config\") pod \"ovn-northd-0\" (UID: \"4276fa06-e8dc-40e0-8276-eaf58420e0ca\") " pod="openstack/ovn-northd-0" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.350397 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4276fa06-e8dc-40e0-8276-eaf58420e0ca-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"4276fa06-e8dc-40e0-8276-eaf58420e0ca\") " pod="openstack/ovn-northd-0" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.350691 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/4276fa06-e8dc-40e0-8276-eaf58420e0ca-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"4276fa06-e8dc-40e0-8276-eaf58420e0ca\") " pod="openstack/ovn-northd-0" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.352996 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4276fa06-e8dc-40e0-8276-eaf58420e0ca-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"4276fa06-e8dc-40e0-8276-eaf58420e0ca\") " pod="openstack/ovn-northd-0" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.388659 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6pwd\" (UniqueName: \"kubernetes.io/projected/4276fa06-e8dc-40e0-8276-eaf58420e0ca-kube-api-access-c6pwd\") pod \"ovn-northd-0\" (UID: \"4276fa06-e8dc-40e0-8276-eaf58420e0ca\") " pod="openstack/ovn-northd-0" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.461763 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5568d9d8bc-7jxwj" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.492165 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.545210 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8a234cd-5325-4f81-8b64-0af5562bff5e-ovsdbserver-sb\") pod \"d8a234cd-5325-4f81-8b64-0af5562bff5e\" (UID: \"d8a234cd-5325-4f81-8b64-0af5562bff5e\") " Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.545256 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8a234cd-5325-4f81-8b64-0af5562bff5e-ovsdbserver-nb\") pod \"d8a234cd-5325-4f81-8b64-0af5562bff5e\" (UID: \"d8a234cd-5325-4f81-8b64-0af5562bff5e\") " Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.545321 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8a234cd-5325-4f81-8b64-0af5562bff5e-dns-svc\") pod \"d8a234cd-5325-4f81-8b64-0af5562bff5e\" (UID: \"d8a234cd-5325-4f81-8b64-0af5562bff5e\") " Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.545409 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpn8z\" (UniqueName: \"kubernetes.io/projected/d8a234cd-5325-4f81-8b64-0af5562bff5e-kube-api-access-kpn8z\") pod \"d8a234cd-5325-4f81-8b64-0af5562bff5e\" (UID: \"d8a234cd-5325-4f81-8b64-0af5562bff5e\") " Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.546426 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8a234cd-5325-4f81-8b64-0af5562bff5e-config\") pod \"d8a234cd-5325-4f81-8b64-0af5562bff5e\" (UID: \"d8a234cd-5325-4f81-8b64-0af5562bff5e\") " Oct 09 
15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.555531 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8a234cd-5325-4f81-8b64-0af5562bff5e-kube-api-access-kpn8z" (OuterVolumeSpecName: "kube-api-access-kpn8z") pod "d8a234cd-5325-4f81-8b64-0af5562bff5e" (UID: "d8a234cd-5325-4f81-8b64-0af5562bff5e"). InnerVolumeSpecName "kube-api-access-kpn8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.609508 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8a234cd-5325-4f81-8b64-0af5562bff5e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d8a234cd-5325-4f81-8b64-0af5562bff5e" (UID: "d8a234cd-5325-4f81-8b64-0af5562bff5e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.612788 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8a234cd-5325-4f81-8b64-0af5562bff5e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d8a234cd-5325-4f81-8b64-0af5562bff5e" (UID: "d8a234cd-5325-4f81-8b64-0af5562bff5e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.639182 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8a234cd-5325-4f81-8b64-0af5562bff5e-config" (OuterVolumeSpecName: "config") pod "d8a234cd-5325-4f81-8b64-0af5562bff5e" (UID: "d8a234cd-5325-4f81-8b64-0af5562bff5e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.648125 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8a234cd-5325-4f81-8b64-0af5562bff5e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d8a234cd-5325-4f81-8b64-0af5562bff5e" (UID: "d8a234cd-5325-4f81-8b64-0af5562bff5e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.648430 4719 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8a234cd-5325-4f81-8b64-0af5562bff5e-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.648459 4719 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8a234cd-5325-4f81-8b64-0af5562bff5e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.648470 4719 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8a234cd-5325-4f81-8b64-0af5562bff5e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.648478 4719 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8a234cd-5325-4f81-8b64-0af5562bff5e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.648488 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpn8z\" (UniqueName: \"kubernetes.io/projected/d8a234cd-5325-4f81-8b64-0af5562bff5e-kube-api-access-kpn8z\") on node \"crc\" DevicePath \"\"" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.680947 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-dd7hf"] Oct 09 15:34:05 
crc kubenswrapper[4719]: I1009 15:34:05.750090 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0358ac72-8c58-4e63-843e-b9eaa35aefdf-etc-swift\") pod \"swift-storage-0\" (UID: \"0358ac72-8c58-4e63-843e-b9eaa35aefdf\") " pod="openstack/swift-storage-0" Oct 09 15:34:05 crc kubenswrapper[4719]: E1009 15:34:05.750302 4719 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 09 15:34:05 crc kubenswrapper[4719]: E1009 15:34:05.750319 4719 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 09 15:34:05 crc kubenswrapper[4719]: E1009 15:34:05.750406 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0358ac72-8c58-4e63-843e-b9eaa35aefdf-etc-swift podName:0358ac72-8c58-4e63-843e-b9eaa35aefdf nodeName:}" failed. No retries permitted until 2025-10-09 15:34:07.750355746 +0000 UTC m=+953.260067031 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/0358ac72-8c58-4e63-843e-b9eaa35aefdf-etc-swift") pod "swift-storage-0" (UID: "0358ac72-8c58-4e63-843e-b9eaa35aefdf") : configmap "swift-ring-files" not found Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.910774 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58866ff6f5-kd5d5" event={"ID":"b0ebf29d-d040-4829-912f-e8bd998cf0da","Type":"ContainerStarted","Data":"a71c7efe995b43ccf47c9b1b6657313bac942a4bd854c2ad46fca8f6d4d03052"} Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.911275 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58866ff6f5-kd5d5" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.914708 4719 generic.go:334] "Generic (PLEG): container finished" podID="d8a234cd-5325-4f81-8b64-0af5562bff5e" containerID="1647d952d1603085c0c951167634914b9c9b3f1b41faf5e2f18836a5cc0ebaaf" exitCode=0 Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.914768 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5568d9d8bc-7jxwj" event={"ID":"d8a234cd-5325-4f81-8b64-0af5562bff5e","Type":"ContainerDied","Data":"1647d952d1603085c0c951167634914b9c9b3f1b41faf5e2f18836a5cc0ebaaf"} Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.914790 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5568d9d8bc-7jxwj" event={"ID":"d8a234cd-5325-4f81-8b64-0af5562bff5e","Type":"ContainerDied","Data":"561f2cf8bcc34410f6a846f00eae6b5d221da76120eb127cb85836cc02a4df5c"} Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.914810 4719 scope.go:117] "RemoveContainer" containerID="1647d952d1603085c0c951167634914b9c9b3f1b41faf5e2f18836a5cc0ebaaf" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.914943 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5568d9d8bc-7jxwj" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.929423 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58866ff6f5-kd5d5" podStartSLOduration=3.929398011 podStartE2EDuration="3.929398011s" podCreationTimestamp="2025-10-09 15:34:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:34:05.929389151 +0000 UTC m=+951.439100446" watchObservedRunningTime="2025-10-09 15:34:05.929398011 +0000 UTC m=+951.439109316" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.932102 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-dd7hf" event={"ID":"5685c463-d342-436a-a619-f809a2559691","Type":"ContainerStarted","Data":"97f334f18e2b6182f768b52e9b82cc2737d175188c8dc8361df039d57222653a"} Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.960659 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5568d9d8bc-7jxwj"] Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.972141 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5568d9d8bc-7jxwj"] Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.974630 4719 scope.go:117] "RemoveContainer" containerID="f169d03e68245ead0b553a51022662d277e3ef4f7ae3562060873febcde75294" Oct 09 15:34:05 crc kubenswrapper[4719]: I1009 15:34:05.984253 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 09 15:34:06 crc kubenswrapper[4719]: I1009 15:34:06.031104 4719 scope.go:117] "RemoveContainer" containerID="1647d952d1603085c0c951167634914b9c9b3f1b41faf5e2f18836a5cc0ebaaf" Oct 09 15:34:06 crc kubenswrapper[4719]: E1009 15:34:06.032064 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1647d952d1603085c0c951167634914b9c9b3f1b41faf5e2f18836a5cc0ebaaf\": container with ID starting with 1647d952d1603085c0c951167634914b9c9b3f1b41faf5e2f18836a5cc0ebaaf not found: ID does not exist" containerID="1647d952d1603085c0c951167634914b9c9b3f1b41faf5e2f18836a5cc0ebaaf" Oct 09 15:34:06 crc kubenswrapper[4719]: I1009 15:34:06.032146 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1647d952d1603085c0c951167634914b9c9b3f1b41faf5e2f18836a5cc0ebaaf"} err="failed to get container status \"1647d952d1603085c0c951167634914b9c9b3f1b41faf5e2f18836a5cc0ebaaf\": rpc error: code = NotFound desc = could not find container \"1647d952d1603085c0c951167634914b9c9b3f1b41faf5e2f18836a5cc0ebaaf\": container with ID starting with 1647d952d1603085c0c951167634914b9c9b3f1b41faf5e2f18836a5cc0ebaaf not found: ID does not exist" Oct 09 15:34:06 crc kubenswrapper[4719]: I1009 15:34:06.032175 4719 scope.go:117] "RemoveContainer" containerID="f169d03e68245ead0b553a51022662d277e3ef4f7ae3562060873febcde75294" Oct 09 15:34:06 crc kubenswrapper[4719]: E1009 15:34:06.032705 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f169d03e68245ead0b553a51022662d277e3ef4f7ae3562060873febcde75294\": container with ID starting with f169d03e68245ead0b553a51022662d277e3ef4f7ae3562060873febcde75294 not found: ID does not exist" containerID="f169d03e68245ead0b553a51022662d277e3ef4f7ae3562060873febcde75294" Oct 09 15:34:06 crc kubenswrapper[4719]: I1009 15:34:06.032742 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f169d03e68245ead0b553a51022662d277e3ef4f7ae3562060873febcde75294"} err="failed to get container status \"f169d03e68245ead0b553a51022662d277e3ef4f7ae3562060873febcde75294\": rpc error: code = NotFound desc = could not find container \"f169d03e68245ead0b553a51022662d277e3ef4f7ae3562060873febcde75294\": container with ID 
starting with f169d03e68245ead0b553a51022662d277e3ef4f7ae3562060873febcde75294 not found: ID does not exist" Oct 09 15:34:06 crc kubenswrapper[4719]: I1009 15:34:06.946825 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"4276fa06-e8dc-40e0-8276-eaf58420e0ca","Type":"ContainerStarted","Data":"b316e0336fee92e8d391d3b6efaaf8e187228d384901d656b8f813e864b4d05a"} Oct 09 15:34:07 crc kubenswrapper[4719]: I1009 15:34:07.179161 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8a234cd-5325-4f81-8b64-0af5562bff5e" path="/var/lib/kubelet/pods/d8a234cd-5325-4f81-8b64-0af5562bff5e/volumes" Oct 09 15:34:07 crc kubenswrapper[4719]: I1009 15:34:07.792932 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0358ac72-8c58-4e63-843e-b9eaa35aefdf-etc-swift\") pod \"swift-storage-0\" (UID: \"0358ac72-8c58-4e63-843e-b9eaa35aefdf\") " pod="openstack/swift-storage-0" Oct 09 15:34:07 crc kubenswrapper[4719]: E1009 15:34:07.793177 4719 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 09 15:34:07 crc kubenswrapper[4719]: E1009 15:34:07.793220 4719 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 09 15:34:07 crc kubenswrapper[4719]: E1009 15:34:07.793351 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0358ac72-8c58-4e63-843e-b9eaa35aefdf-etc-swift podName:0358ac72-8c58-4e63-843e-b9eaa35aefdf nodeName:}" failed. No retries permitted until 2025-10-09 15:34:11.793325012 +0000 UTC m=+957.303036307 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/0358ac72-8c58-4e63-843e-b9eaa35aefdf-etc-swift") pod "swift-storage-0" (UID: "0358ac72-8c58-4e63-843e-b9eaa35aefdf") : configmap "swift-ring-files" not found Oct 09 15:34:10 crc kubenswrapper[4719]: I1009 15:34:10.717923 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 09 15:34:10 crc kubenswrapper[4719]: I1009 15:34:10.718488 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 09 15:34:10 crc kubenswrapper[4719]: I1009 15:34:10.778447 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 09 15:34:10 crc kubenswrapper[4719]: I1009 15:34:10.799552 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 09 15:34:10 crc kubenswrapper[4719]: I1009 15:34:10.799605 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 09 15:34:10 crc kubenswrapper[4719]: I1009 15:34:10.874318 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 09 15:34:10 crc kubenswrapper[4719]: I1009 15:34:10.985973 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-dd7hf" event={"ID":"5685c463-d342-436a-a619-f809a2559691","Type":"ContainerStarted","Data":"028d8f90693f50efa8b56d568210bc2bdb2c112dc9c6b4a933addb04b48018f4"} Oct 09 15:34:10 crc kubenswrapper[4719]: I1009 15:34:10.988628 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"95b09495-a75f-42db-ae2c-99ac6e46f039","Type":"ContainerStarted","Data":"f24e7defb05b0fe57c59432a6505ceac98d73ecf638246550f93a6336334e527"} Oct 09 15:34:10 crc kubenswrapper[4719]: I1009 15:34:10.990857 4719 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"4276fa06-e8dc-40e0-8276-eaf58420e0ca","Type":"ContainerStarted","Data":"2a008648c2273736aef867088d6a9d5f6775bae5f9515774397231561229a0d1"} Oct 09 15:34:10 crc kubenswrapper[4719]: I1009 15:34:10.990882 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"4276fa06-e8dc-40e0-8276-eaf58420e0ca","Type":"ContainerStarted","Data":"96399cf38b7f318fab646967a005099d2e8e72ccf293859061007a61146a638d"} Oct 09 15:34:11 crc kubenswrapper[4719]: I1009 15:34:11.005072 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-dd7hf" podStartSLOduration=2.530881289 podStartE2EDuration="7.005044787s" podCreationTimestamp="2025-10-09 15:34:04 +0000 UTC" firstStartedPulling="2025-10-09 15:34:05.720007072 +0000 UTC m=+951.229718357" lastFinishedPulling="2025-10-09 15:34:10.19417056 +0000 UTC m=+955.703881855" observedRunningTime="2025-10-09 15:34:10.999848202 +0000 UTC m=+956.509559497" watchObservedRunningTime="2025-10-09 15:34:11.005044787 +0000 UTC m=+956.514756062" Oct 09 15:34:11 crc kubenswrapper[4719]: I1009 15:34:11.024075 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.872863487 podStartE2EDuration="6.02404912s" podCreationTimestamp="2025-10-09 15:34:05 +0000 UTC" firstStartedPulling="2025-10-09 15:34:06.032951669 +0000 UTC m=+951.542662954" lastFinishedPulling="2025-10-09 15:34:10.184137302 +0000 UTC m=+955.693848587" observedRunningTime="2025-10-09 15:34:11.016901402 +0000 UTC m=+956.526612717" watchObservedRunningTime="2025-10-09 15:34:11.02404912 +0000 UTC m=+956.533760405" Oct 09 15:34:11 crc kubenswrapper[4719]: I1009 15:34:11.053026 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 09 15:34:11 crc kubenswrapper[4719]: I1009 15:34:11.066982 4719 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 09 15:34:11 crc kubenswrapper[4719]: I1009 15:34:11.866257 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0358ac72-8c58-4e63-843e-b9eaa35aefdf-etc-swift\") pod \"swift-storage-0\" (UID: \"0358ac72-8c58-4e63-843e-b9eaa35aefdf\") " pod="openstack/swift-storage-0" Oct 09 15:34:11 crc kubenswrapper[4719]: E1009 15:34:11.866530 4719 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 09 15:34:11 crc kubenswrapper[4719]: E1009 15:34:11.866747 4719 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 09 15:34:11 crc kubenswrapper[4719]: E1009 15:34:11.866818 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0358ac72-8c58-4e63-843e-b9eaa35aefdf-etc-swift podName:0358ac72-8c58-4e63-843e-b9eaa35aefdf nodeName:}" failed. No retries permitted until 2025-10-09 15:34:19.866797138 +0000 UTC m=+965.376508423 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/0358ac72-8c58-4e63-843e-b9eaa35aefdf-etc-swift") pod "swift-storage-0" (UID: "0358ac72-8c58-4e63-843e-b9eaa35aefdf") : configmap "swift-ring-files" not found Oct 09 15:34:12 crc kubenswrapper[4719]: I1009 15:34:12.010304 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 09 15:34:12 crc kubenswrapper[4719]: I1009 15:34:12.979963 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-create-gcfdh"] Oct 09 15:34:12 crc kubenswrapper[4719]: E1009 15:34:12.981586 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8a234cd-5325-4f81-8b64-0af5562bff5e" containerName="dnsmasq-dns" Oct 09 15:34:12 crc kubenswrapper[4719]: I1009 15:34:12.981665 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8a234cd-5325-4f81-8b64-0af5562bff5e" containerName="dnsmasq-dns" Oct 09 15:34:12 crc kubenswrapper[4719]: E1009 15:34:12.981736 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8a234cd-5325-4f81-8b64-0af5562bff5e" containerName="init" Oct 09 15:34:12 crc kubenswrapper[4719]: I1009 15:34:12.981792 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8a234cd-5325-4f81-8b64-0af5562bff5e" containerName="init" Oct 09 15:34:12 crc kubenswrapper[4719]: I1009 15:34:12.982043 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8a234cd-5325-4f81-8b64-0af5562bff5e" containerName="dnsmasq-dns" Oct 09 15:34:12 crc kubenswrapper[4719]: I1009 15:34:12.982684 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-gcfdh" Oct 09 15:34:12 crc kubenswrapper[4719]: I1009 15:34:12.986422 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-gcfdh"] Oct 09 15:34:13 crc kubenswrapper[4719]: I1009 15:34:13.019274 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"95b09495-a75f-42db-ae2c-99ac6e46f039","Type":"ContainerStarted","Data":"ffc4804d631e3f3bd2201f6e4ce6aa2536661ce40e7a85af4bcfffd7ee457f1d"} Oct 09 15:34:13 crc kubenswrapper[4719]: I1009 15:34:13.085696 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqsrx\" (UniqueName: \"kubernetes.io/projected/f8ea2169-9310-49ca-ac1f-f11eeb9b8e26-kube-api-access-bqsrx\") pod \"watcher-db-create-gcfdh\" (UID: \"f8ea2169-9310-49ca-ac1f-f11eeb9b8e26\") " pod="openstack/watcher-db-create-gcfdh" Oct 09 15:34:13 crc kubenswrapper[4719]: I1009 15:34:13.187887 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqsrx\" (UniqueName: \"kubernetes.io/projected/f8ea2169-9310-49ca-ac1f-f11eeb9b8e26-kube-api-access-bqsrx\") pod \"watcher-db-create-gcfdh\" (UID: \"f8ea2169-9310-49ca-ac1f-f11eeb9b8e26\") " pod="openstack/watcher-db-create-gcfdh" Oct 09 15:34:13 crc kubenswrapper[4719]: I1009 15:34:13.207480 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqsrx\" (UniqueName: \"kubernetes.io/projected/f8ea2169-9310-49ca-ac1f-f11eeb9b8e26-kube-api-access-bqsrx\") pod \"watcher-db-create-gcfdh\" (UID: \"f8ea2169-9310-49ca-ac1f-f11eeb9b8e26\") " pod="openstack/watcher-db-create-gcfdh" Oct 09 15:34:13 crc kubenswrapper[4719]: I1009 15:34:13.298699 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-gcfdh" Oct 09 15:34:13 crc kubenswrapper[4719]: I1009 15:34:13.326505 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58866ff6f5-kd5d5" Oct 09 15:34:13 crc kubenswrapper[4719]: I1009 15:34:13.400655 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dd68b64f-t69c6"] Oct 09 15:34:13 crc kubenswrapper[4719]: I1009 15:34:13.405906 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6dd68b64f-t69c6" podUID="4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6" containerName="dnsmasq-dns" containerID="cri-o://6d0364ba444214c15623aaa0adc9ad8602740d3a67991db42395477b119e3f02" gracePeriod=10 Oct 09 15:34:13 crc kubenswrapper[4719]: I1009 15:34:13.865286 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-gcfdh"] Oct 09 15:34:13 crc kubenswrapper[4719]: I1009 15:34:13.953013 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dd68b64f-t69c6" Oct 09 15:34:14 crc kubenswrapper[4719]: I1009 15:34:14.009972 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6-dns-svc\") pod \"4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6\" (UID: \"4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6\") " Oct 09 15:34:14 crc kubenswrapper[4719]: I1009 15:34:14.010147 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvhcc\" (UniqueName: \"kubernetes.io/projected/4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6-kube-api-access-gvhcc\") pod \"4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6\" (UID: \"4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6\") " Oct 09 15:34:14 crc kubenswrapper[4719]: I1009 15:34:14.010214 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6-config\") pod \"4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6\" (UID: \"4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6\") " Oct 09 15:34:14 crc kubenswrapper[4719]: I1009 15:34:14.015621 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6-kube-api-access-gvhcc" (OuterVolumeSpecName: "kube-api-access-gvhcc") pod "4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6" (UID: "4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6"). InnerVolumeSpecName "kube-api-access-gvhcc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:34:14 crc kubenswrapper[4719]: I1009 15:34:14.030442 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-gcfdh" event={"ID":"f8ea2169-9310-49ca-ac1f-f11eeb9b8e26","Type":"ContainerStarted","Data":"e631c472b961ca8b25e75e1dbbb45738f9771e87c0c550bd213d7799cee0180e"} Oct 09 15:34:14 crc kubenswrapper[4719]: I1009 15:34:14.030488 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-gcfdh" event={"ID":"f8ea2169-9310-49ca-ac1f-f11eeb9b8e26","Type":"ContainerStarted","Data":"6edba65f4dc09c2471132bba5203654ef3a653796e23c25899fb5f269f69de9a"} Oct 09 15:34:14 crc kubenswrapper[4719]: I1009 15:34:14.033060 4719 generic.go:334] "Generic (PLEG): container finished" podID="4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6" containerID="6d0364ba444214c15623aaa0adc9ad8602740d3a67991db42395477b119e3f02" exitCode=0 Oct 09 15:34:14 crc kubenswrapper[4719]: I1009 15:34:14.033097 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dd68b64f-t69c6" event={"ID":"4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6","Type":"ContainerDied","Data":"6d0364ba444214c15623aaa0adc9ad8602740d3a67991db42395477b119e3f02"} Oct 09 15:34:14 crc kubenswrapper[4719]: I1009 15:34:14.033118 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dd68b64f-t69c6" event={"ID":"4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6","Type":"ContainerDied","Data":"b007fe91956c10ad2b6a6388c093a88dd0f1b2fbb896a5e295f6cebe7f64de79"} Oct 09 15:34:14 crc kubenswrapper[4719]: I1009 15:34:14.033137 4719 scope.go:117] "RemoveContainer" containerID="6d0364ba444214c15623aaa0adc9ad8602740d3a67991db42395477b119e3f02" Oct 09 15:34:14 crc kubenswrapper[4719]: I1009 15:34:14.033246 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dd68b64f-t69c6" Oct 09 15:34:14 crc kubenswrapper[4719]: I1009 15:34:14.052556 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-create-gcfdh" podStartSLOduration=2.052534536 podStartE2EDuration="2.052534536s" podCreationTimestamp="2025-10-09 15:34:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:34:14.043008024 +0000 UTC m=+959.552719329" watchObservedRunningTime="2025-10-09 15:34:14.052534536 +0000 UTC m=+959.562245821" Oct 09 15:34:14 crc kubenswrapper[4719]: I1009 15:34:14.055858 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6-config" (OuterVolumeSpecName: "config") pod "4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6" (UID: "4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:34:14 crc kubenswrapper[4719]: I1009 15:34:14.060808 4719 scope.go:117] "RemoveContainer" containerID="8596ece6f6330253f5920d00478b711d559cb6448c8f8d7450dca57530e7836f" Oct 09 15:34:14 crc kubenswrapper[4719]: I1009 15:34:14.065271 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6" (UID: "4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:34:14 crc kubenswrapper[4719]: I1009 15:34:14.104497 4719 scope.go:117] "RemoveContainer" containerID="6d0364ba444214c15623aaa0adc9ad8602740d3a67991db42395477b119e3f02" Oct 09 15:34:14 crc kubenswrapper[4719]: E1009 15:34:14.105091 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d0364ba444214c15623aaa0adc9ad8602740d3a67991db42395477b119e3f02\": container with ID starting with 6d0364ba444214c15623aaa0adc9ad8602740d3a67991db42395477b119e3f02 not found: ID does not exist" containerID="6d0364ba444214c15623aaa0adc9ad8602740d3a67991db42395477b119e3f02" Oct 09 15:34:14 crc kubenswrapper[4719]: I1009 15:34:14.105136 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d0364ba444214c15623aaa0adc9ad8602740d3a67991db42395477b119e3f02"} err="failed to get container status \"6d0364ba444214c15623aaa0adc9ad8602740d3a67991db42395477b119e3f02\": rpc error: code = NotFound desc = could not find container \"6d0364ba444214c15623aaa0adc9ad8602740d3a67991db42395477b119e3f02\": container with ID starting with 6d0364ba444214c15623aaa0adc9ad8602740d3a67991db42395477b119e3f02 not found: ID does not exist" Oct 09 15:34:14 crc kubenswrapper[4719]: I1009 15:34:14.105160 4719 scope.go:117] "RemoveContainer" containerID="8596ece6f6330253f5920d00478b711d559cb6448c8f8d7450dca57530e7836f" Oct 09 15:34:14 crc kubenswrapper[4719]: E1009 15:34:14.105463 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8596ece6f6330253f5920d00478b711d559cb6448c8f8d7450dca57530e7836f\": container with ID starting with 8596ece6f6330253f5920d00478b711d559cb6448c8f8d7450dca57530e7836f not found: ID does not exist" containerID="8596ece6f6330253f5920d00478b711d559cb6448c8f8d7450dca57530e7836f" Oct 09 15:34:14 crc kubenswrapper[4719]: I1009 15:34:14.105490 
4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8596ece6f6330253f5920d00478b711d559cb6448c8f8d7450dca57530e7836f"} err="failed to get container status \"8596ece6f6330253f5920d00478b711d559cb6448c8f8d7450dca57530e7836f\": rpc error: code = NotFound desc = could not find container \"8596ece6f6330253f5920d00478b711d559cb6448c8f8d7450dca57530e7836f\": container with ID starting with 8596ece6f6330253f5920d00478b711d559cb6448c8f8d7450dca57530e7836f not found: ID does not exist" Oct 09 15:34:14 crc kubenswrapper[4719]: I1009 15:34:14.112605 4719 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 15:34:14 crc kubenswrapper[4719]: I1009 15:34:14.112707 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvhcc\" (UniqueName: \"kubernetes.io/projected/4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6-kube-api-access-gvhcc\") on node \"crc\" DevicePath \"\"" Oct 09 15:34:14 crc kubenswrapper[4719]: I1009 15:34:14.112769 4719 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:34:14 crc kubenswrapper[4719]: I1009 15:34:14.368875 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dd68b64f-t69c6"] Oct 09 15:34:14 crc kubenswrapper[4719]: I1009 15:34:14.373832 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6dd68b64f-t69c6"] Oct 09 15:34:15 crc kubenswrapper[4719]: I1009 15:34:15.042121 4719 generic.go:334] "Generic (PLEG): container finished" podID="f8ea2169-9310-49ca-ac1f-f11eeb9b8e26" containerID="e631c472b961ca8b25e75e1dbbb45738f9771e87c0c550bd213d7799cee0180e" exitCode=0 Oct 09 15:34:15 crc kubenswrapper[4719]: I1009 15:34:15.042229 4719 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/watcher-db-create-gcfdh" event={"ID":"f8ea2169-9310-49ca-ac1f-f11eeb9b8e26","Type":"ContainerDied","Data":"e631c472b961ca8b25e75e1dbbb45738f9771e87c0c550bd213d7799cee0180e"} Oct 09 15:34:15 crc kubenswrapper[4719]: I1009 15:34:15.175870 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6" path="/var/lib/kubelet/pods/4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6/volumes" Oct 09 15:34:16 crc kubenswrapper[4719]: I1009 15:34:16.412241 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-gcfdh" Oct 09 15:34:16 crc kubenswrapper[4719]: I1009 15:34:16.462716 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqsrx\" (UniqueName: \"kubernetes.io/projected/f8ea2169-9310-49ca-ac1f-f11eeb9b8e26-kube-api-access-bqsrx\") pod \"f8ea2169-9310-49ca-ac1f-f11eeb9b8e26\" (UID: \"f8ea2169-9310-49ca-ac1f-f11eeb9b8e26\") " Oct 09 15:34:16 crc kubenswrapper[4719]: I1009 15:34:16.470431 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8ea2169-9310-49ca-ac1f-f11eeb9b8e26-kube-api-access-bqsrx" (OuterVolumeSpecName: "kube-api-access-bqsrx") pod "f8ea2169-9310-49ca-ac1f-f11eeb9b8e26" (UID: "f8ea2169-9310-49ca-ac1f-f11eeb9b8e26"). InnerVolumeSpecName "kube-api-access-bqsrx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:34:16 crc kubenswrapper[4719]: I1009 15:34:16.565491 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqsrx\" (UniqueName: \"kubernetes.io/projected/f8ea2169-9310-49ca-ac1f-f11eeb9b8e26-kube-api-access-bqsrx\") on node \"crc\" DevicePath \"\"" Oct 09 15:34:17 crc kubenswrapper[4719]: I1009 15:34:17.063490 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"95b09495-a75f-42db-ae2c-99ac6e46f039","Type":"ContainerStarted","Data":"1faf5c2993ec241e120ed50cd7601b60c3b312339069c70768f76853f949a86a"} Oct 09 15:34:17 crc kubenswrapper[4719]: I1009 15:34:17.065052 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-gcfdh" event={"ID":"f8ea2169-9310-49ca-ac1f-f11eeb9b8e26","Type":"ContainerDied","Data":"6edba65f4dc09c2471132bba5203654ef3a653796e23c25899fb5f269f69de9a"} Oct 09 15:34:17 crc kubenswrapper[4719]: I1009 15:34:17.065108 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6edba65f4dc09c2471132bba5203654ef3a653796e23c25899fb5f269f69de9a" Oct 09 15:34:17 crc kubenswrapper[4719]: I1009 15:34:17.065131 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-gcfdh" Oct 09 15:34:17 crc kubenswrapper[4719]: I1009 15:34:17.094693 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=10.06446352 podStartE2EDuration="45.094670416s" podCreationTimestamp="2025-10-09 15:33:32 +0000 UTC" firstStartedPulling="2025-10-09 15:33:40.971370514 +0000 UTC m=+926.481081799" lastFinishedPulling="2025-10-09 15:34:16.00157741 +0000 UTC m=+961.511288695" observedRunningTime="2025-10-09 15:34:17.08945095 +0000 UTC m=+962.599162265" watchObservedRunningTime="2025-10-09 15:34:17.094670416 +0000 UTC m=+962.604381711" Oct 09 15:34:18 crc kubenswrapper[4719]: I1009 15:34:18.074788 4719 generic.go:334] "Generic (PLEG): container finished" podID="5685c463-d342-436a-a619-f809a2559691" containerID="028d8f90693f50efa8b56d568210bc2bdb2c112dc9c6b4a933addb04b48018f4" exitCode=0 Oct 09 15:34:18 crc kubenswrapper[4719]: I1009 15:34:18.074885 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-dd7hf" event={"ID":"5685c463-d342-436a-a619-f809a2559691","Type":"ContainerDied","Data":"028d8f90693f50efa8b56d568210bc2bdb2c112dc9c6b4a933addb04b48018f4"} Oct 09 15:34:19 crc kubenswrapper[4719]: I1009 15:34:19.156579 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 09 15:34:19 crc kubenswrapper[4719]: I1009 15:34:19.156696 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 09 15:34:19 crc kubenswrapper[4719]: I1009 15:34:19.158711 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 09 15:34:19 crc kubenswrapper[4719]: I1009 15:34:19.425159 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-dd7hf" Oct 09 15:34:19 crc kubenswrapper[4719]: I1009 15:34:19.518045 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5685c463-d342-436a-a619-f809a2559691-dispersionconf\") pod \"5685c463-d342-436a-a619-f809a2559691\" (UID: \"5685c463-d342-436a-a619-f809a2559691\") " Oct 09 15:34:19 crc kubenswrapper[4719]: I1009 15:34:19.518100 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5685c463-d342-436a-a619-f809a2559691-swiftconf\") pod \"5685c463-d342-436a-a619-f809a2559691\" (UID: \"5685c463-d342-436a-a619-f809a2559691\") " Oct 09 15:34:19 crc kubenswrapper[4719]: I1009 15:34:19.518378 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-577jl\" (UniqueName: \"kubernetes.io/projected/5685c463-d342-436a-a619-f809a2559691-kube-api-access-577jl\") pod \"5685c463-d342-436a-a619-f809a2559691\" (UID: \"5685c463-d342-436a-a619-f809a2559691\") " Oct 09 15:34:19 crc kubenswrapper[4719]: I1009 15:34:19.518415 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5685c463-d342-436a-a619-f809a2559691-scripts\") pod \"5685c463-d342-436a-a619-f809a2559691\" (UID: \"5685c463-d342-436a-a619-f809a2559691\") " Oct 09 15:34:19 crc kubenswrapper[4719]: I1009 15:34:19.518453 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5685c463-d342-436a-a619-f809a2559691-etc-swift\") pod \"5685c463-d342-436a-a619-f809a2559691\" (UID: \"5685c463-d342-436a-a619-f809a2559691\") " Oct 09 15:34:19 crc kubenswrapper[4719]: I1009 15:34:19.518494 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/5685c463-d342-436a-a619-f809a2559691-ring-data-devices\") pod \"5685c463-d342-436a-a619-f809a2559691\" (UID: \"5685c463-d342-436a-a619-f809a2559691\") " Oct 09 15:34:19 crc kubenswrapper[4719]: I1009 15:34:19.518511 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5685c463-d342-436a-a619-f809a2559691-combined-ca-bundle\") pod \"5685c463-d342-436a-a619-f809a2559691\" (UID: \"5685c463-d342-436a-a619-f809a2559691\") " Oct 09 15:34:19 crc kubenswrapper[4719]: I1009 15:34:19.520076 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5685c463-d342-436a-a619-f809a2559691-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "5685c463-d342-436a-a619-f809a2559691" (UID: "5685c463-d342-436a-a619-f809a2559691"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:34:19 crc kubenswrapper[4719]: I1009 15:34:19.520141 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5685c463-d342-436a-a619-f809a2559691-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "5685c463-d342-436a-a619-f809a2559691" (UID: "5685c463-d342-436a-a619-f809a2559691"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:34:19 crc kubenswrapper[4719]: I1009 15:34:19.529723 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5685c463-d342-436a-a619-f809a2559691-kube-api-access-577jl" (OuterVolumeSpecName: "kube-api-access-577jl") pod "5685c463-d342-436a-a619-f809a2559691" (UID: "5685c463-d342-436a-a619-f809a2559691"). InnerVolumeSpecName "kube-api-access-577jl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:34:19 crc kubenswrapper[4719]: I1009 15:34:19.532840 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5685c463-d342-436a-a619-f809a2559691-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "5685c463-d342-436a-a619-f809a2559691" (UID: "5685c463-d342-436a-a619-f809a2559691"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:34:19 crc kubenswrapper[4719]: I1009 15:34:19.546225 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5685c463-d342-436a-a619-f809a2559691-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "5685c463-d342-436a-a619-f809a2559691" (UID: "5685c463-d342-436a-a619-f809a2559691"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:34:19 crc kubenswrapper[4719]: I1009 15:34:19.548455 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5685c463-d342-436a-a619-f809a2559691-scripts" (OuterVolumeSpecName: "scripts") pod "5685c463-d342-436a-a619-f809a2559691" (UID: "5685c463-d342-436a-a619-f809a2559691"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:34:19 crc kubenswrapper[4719]: I1009 15:34:19.552219 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5685c463-d342-436a-a619-f809a2559691-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5685c463-d342-436a-a619-f809a2559691" (UID: "5685c463-d342-436a-a619-f809a2559691"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:34:19 crc kubenswrapper[4719]: I1009 15:34:19.620839 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-577jl\" (UniqueName: \"kubernetes.io/projected/5685c463-d342-436a-a619-f809a2559691-kube-api-access-577jl\") on node \"crc\" DevicePath \"\"" Oct 09 15:34:19 crc kubenswrapper[4719]: I1009 15:34:19.620881 4719 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5685c463-d342-436a-a619-f809a2559691-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 15:34:19 crc kubenswrapper[4719]: I1009 15:34:19.620894 4719 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5685c463-d342-436a-a619-f809a2559691-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 09 15:34:19 crc kubenswrapper[4719]: I1009 15:34:19.620907 4719 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5685c463-d342-436a-a619-f809a2559691-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 09 15:34:19 crc kubenswrapper[4719]: I1009 15:34:19.620920 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5685c463-d342-436a-a619-f809a2559691-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:34:19 crc kubenswrapper[4719]: I1009 15:34:19.620933 4719 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5685c463-d342-436a-a619-f809a2559691-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 09 15:34:19 crc kubenswrapper[4719]: I1009 15:34:19.620944 4719 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5685c463-d342-436a-a619-f809a2559691-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 09 15:34:19 crc kubenswrapper[4719]: I1009 15:34:19.924865 4719 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0358ac72-8c58-4e63-843e-b9eaa35aefdf-etc-swift\") pod \"swift-storage-0\" (UID: \"0358ac72-8c58-4e63-843e-b9eaa35aefdf\") " pod="openstack/swift-storage-0" Oct 09 15:34:19 crc kubenswrapper[4719]: I1009 15:34:19.931939 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0358ac72-8c58-4e63-843e-b9eaa35aefdf-etc-swift\") pod \"swift-storage-0\" (UID: \"0358ac72-8c58-4e63-843e-b9eaa35aefdf\") " pod="openstack/swift-storage-0" Oct 09 15:34:19 crc kubenswrapper[4719]: I1009 15:34:19.998341 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 09 15:34:20 crc kubenswrapper[4719]: I1009 15:34:20.092466 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-dd7hf" Oct 09 15:34:20 crc kubenswrapper[4719]: I1009 15:34:20.092557 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-dd7hf" event={"ID":"5685c463-d342-436a-a619-f809a2559691","Type":"ContainerDied","Data":"97f334f18e2b6182f768b52e9b82cc2737d175188c8dc8361df039d57222653a"} Oct 09 15:34:20 crc kubenswrapper[4719]: I1009 15:34:20.093427 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97f334f18e2b6182f768b52e9b82cc2737d175188c8dc8361df039d57222653a" Oct 09 15:34:20 crc kubenswrapper[4719]: I1009 15:34:20.093503 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 09 15:34:20 crc kubenswrapper[4719]: I1009 15:34:20.561611 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 09 15:34:20 crc kubenswrapper[4719]: I1009 15:34:20.594113 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/swift-storage-0"] Oct 09 15:34:20 crc kubenswrapper[4719]: W1009 15:34:20.597863 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0358ac72_8c58_4e63_843e_b9eaa35aefdf.slice/crio-d1714c0d23f0e1c4d9951ab8794ec01e37ac1f3578eb7f50bf5dc4e166147298 WatchSource:0}: Error finding container d1714c0d23f0e1c4d9951ab8794ec01e37ac1f3578eb7f50bf5dc4e166147298: Status 404 returned error can't find the container with id d1714c0d23f0e1c4d9951ab8794ec01e37ac1f3578eb7f50bf5dc4e166147298 Oct 09 15:34:20 crc kubenswrapper[4719]: I1009 15:34:20.725055 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-8nm4m"] Oct 09 15:34:20 crc kubenswrapper[4719]: E1009 15:34:20.725385 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6" containerName="dnsmasq-dns" Oct 09 15:34:20 crc kubenswrapper[4719]: I1009 15:34:20.725400 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6" containerName="dnsmasq-dns" Oct 09 15:34:20 crc kubenswrapper[4719]: E1009 15:34:20.725417 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6" containerName="init" Oct 09 15:34:20 crc kubenswrapper[4719]: I1009 15:34:20.725423 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6" containerName="init" Oct 09 15:34:20 crc kubenswrapper[4719]: E1009 15:34:20.725437 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5685c463-d342-436a-a619-f809a2559691" containerName="swift-ring-rebalance" Oct 09 15:34:20 crc kubenswrapper[4719]: I1009 15:34:20.725444 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="5685c463-d342-436a-a619-f809a2559691" containerName="swift-ring-rebalance" Oct 09 15:34:20 crc kubenswrapper[4719]: E1009 15:34:20.725462 4719 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f8ea2169-9310-49ca-ac1f-f11eeb9b8e26" containerName="mariadb-database-create" Oct 09 15:34:20 crc kubenswrapper[4719]: I1009 15:34:20.725468 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8ea2169-9310-49ca-ac1f-f11eeb9b8e26" containerName="mariadb-database-create" Oct 09 15:34:20 crc kubenswrapper[4719]: I1009 15:34:20.725617 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="5685c463-d342-436a-a619-f809a2559691" containerName="swift-ring-rebalance" Oct 09 15:34:20 crc kubenswrapper[4719]: I1009 15:34:20.725631 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c0b6a31-2ddf-4ce4-a444-ec6cfe5963f6" containerName="dnsmasq-dns" Oct 09 15:34:20 crc kubenswrapper[4719]: I1009 15:34:20.725645 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8ea2169-9310-49ca-ac1f-f11eeb9b8e26" containerName="mariadb-database-create" Oct 09 15:34:20 crc kubenswrapper[4719]: I1009 15:34:20.726198 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-8nm4m" Oct 09 15:34:20 crc kubenswrapper[4719]: I1009 15:34:20.734619 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-8nm4m"] Oct 09 15:34:20 crc kubenswrapper[4719]: I1009 15:34:20.849549 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5f8w\" (UniqueName: \"kubernetes.io/projected/82195049-d846-4c69-a778-b85451e8d485-kube-api-access-z5f8w\") pod \"keystone-db-create-8nm4m\" (UID: \"82195049-d846-4c69-a778-b85451e8d485\") " pod="openstack/keystone-db-create-8nm4m" Oct 09 15:34:20 crc kubenswrapper[4719]: I1009 15:34:20.952563 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5f8w\" (UniqueName: \"kubernetes.io/projected/82195049-d846-4c69-a778-b85451e8d485-kube-api-access-z5f8w\") pod \"keystone-db-create-8nm4m\" (UID: \"82195049-d846-4c69-a778-b85451e8d485\") " pod="openstack/keystone-db-create-8nm4m" Oct 09 15:34:20 crc kubenswrapper[4719]: I1009 15:34:20.980410 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5f8w\" (UniqueName: \"kubernetes.io/projected/82195049-d846-4c69-a778-b85451e8d485-kube-api-access-z5f8w\") pod \"keystone-db-create-8nm4m\" (UID: \"82195049-d846-4c69-a778-b85451e8d485\") " pod="openstack/keystone-db-create-8nm4m" Oct 09 15:34:21 crc kubenswrapper[4719]: I1009 15:34:20.999038 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-p6rrg"] Oct 09 15:34:21 crc kubenswrapper[4719]: I1009 15:34:21.000203 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-p6rrg" Oct 09 15:34:21 crc kubenswrapper[4719]: I1009 15:34:21.007015 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-p6rrg"] Oct 09 15:34:21 crc kubenswrapper[4719]: I1009 15:34:21.048745 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-8nm4m" Oct 09 15:34:21 crc kubenswrapper[4719]: I1009 15:34:21.054395 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm5ml\" (UniqueName: \"kubernetes.io/projected/726baf17-62cc-40f1-bb7c-204e357465d1-kube-api-access-hm5ml\") pod \"placement-db-create-p6rrg\" (UID: \"726baf17-62cc-40f1-bb7c-204e357465d1\") " pod="openstack/placement-db-create-p6rrg" Oct 09 15:34:21 crc kubenswrapper[4719]: I1009 15:34:21.108030 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0358ac72-8c58-4e63-843e-b9eaa35aefdf","Type":"ContainerStarted","Data":"d1714c0d23f0e1c4d9951ab8794ec01e37ac1f3578eb7f50bf5dc4e166147298"} Oct 09 15:34:21 crc kubenswrapper[4719]: I1009 15:34:21.156282 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hm5ml\" (UniqueName: \"kubernetes.io/projected/726baf17-62cc-40f1-bb7c-204e357465d1-kube-api-access-hm5ml\") pod \"placement-db-create-p6rrg\" (UID: \"726baf17-62cc-40f1-bb7c-204e357465d1\") " pod="openstack/placement-db-create-p6rrg" Oct 09 15:34:21 crc kubenswrapper[4719]: I1009 15:34:21.182941 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm5ml\" (UniqueName: \"kubernetes.io/projected/726baf17-62cc-40f1-bb7c-204e357465d1-kube-api-access-hm5ml\") pod \"placement-db-create-p6rrg\" (UID: \"726baf17-62cc-40f1-bb7c-204e357465d1\") " pod="openstack/placement-db-create-p6rrg" Oct 09 15:34:21 crc kubenswrapper[4719]: I1009 15:34:21.337238 4719 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-p6rrg" Oct 09 15:34:21 crc kubenswrapper[4719]: I1009 15:34:21.862681 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-8nm4m"] Oct 09 15:34:21 crc kubenswrapper[4719]: W1009 15:34:21.871282 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82195049_d846_4c69_a778_b85451e8d485.slice/crio-b7d81dcb7160e5b582c4eb7f27ebb6ac562096aed7f81fb92656deaa9eaed772 WatchSource:0}: Error finding container b7d81dcb7160e5b582c4eb7f27ebb6ac562096aed7f81fb92656deaa9eaed772: Status 404 returned error can't find the container with id b7d81dcb7160e5b582c4eb7f27ebb6ac562096aed7f81fb92656deaa9eaed772 Oct 09 15:34:21 crc kubenswrapper[4719]: I1009 15:34:21.932436 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-p6rrg"] Oct 09 15:34:22 crc kubenswrapper[4719]: I1009 15:34:22.119624 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-p6rrg" event={"ID":"726baf17-62cc-40f1-bb7c-204e357465d1","Type":"ContainerStarted","Data":"1ab2eb78eec8afb4581e4f0eb6e75ecfc0cc95a33e1b6cb06591b10088735311"} Oct 09 15:34:22 crc kubenswrapper[4719]: I1009 15:34:22.129751 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0358ac72-8c58-4e63-843e-b9eaa35aefdf","Type":"ContainerStarted","Data":"fcb2bde1880af9b19ddb1c48f9091711c80c4000f6f530b74a87d1177c389ed8"} Oct 09 15:34:22 crc kubenswrapper[4719]: I1009 15:34:22.129810 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0358ac72-8c58-4e63-843e-b9eaa35aefdf","Type":"ContainerStarted","Data":"7a5cd358d593550f88947222e8ee884ac858b66a0c872fe130c15a9b38d5f536"} Oct 09 15:34:22 crc kubenswrapper[4719]: I1009 15:34:22.135020 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-db-create-8nm4m" event={"ID":"82195049-d846-4c69-a778-b85451e8d485","Type":"ContainerStarted","Data":"4dc445a265df781fb85e53d3b213cdcc351e2ae28b87a95a76e264ff7ab3032f"} Oct 09 15:34:22 crc kubenswrapper[4719]: I1009 15:34:22.135069 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-8nm4m" event={"ID":"82195049-d846-4c69-a778-b85451e8d485","Type":"ContainerStarted","Data":"b7d81dcb7160e5b582c4eb7f27ebb6ac562096aed7f81fb92656deaa9eaed772"} Oct 09 15:34:22 crc kubenswrapper[4719]: I1009 15:34:22.159327 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-8nm4m" podStartSLOduration=2.159309362 podStartE2EDuration="2.159309362s" podCreationTimestamp="2025-10-09 15:34:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:34:22.154623924 +0000 UTC m=+967.664335219" watchObservedRunningTime="2025-10-09 15:34:22.159309362 +0000 UTC m=+967.669020647" Oct 09 15:34:22 crc kubenswrapper[4719]: I1009 15:34:22.186120 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-p4t6l" podUID="f0151a18-0608-47b9-b58a-7eef9dfaf31b" containerName="ovn-controller" probeResult="failure" output=< Oct 09 15:34:22 crc kubenswrapper[4719]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 09 15:34:22 crc kubenswrapper[4719]: > Oct 09 15:34:22 crc kubenswrapper[4719]: I1009 15:34:22.230744 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 09 15:34:23 crc kubenswrapper[4719]: I1009 15:34:23.002636 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-f584-account-create-h95kn"] Oct 09 15:34:23 crc kubenswrapper[4719]: I1009 15:34:23.003671 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-f584-account-create-h95kn" Oct 09 15:34:23 crc kubenswrapper[4719]: I1009 15:34:23.005539 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-db-secret" Oct 09 15:34:23 crc kubenswrapper[4719]: I1009 15:34:23.014612 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-f584-account-create-h95kn"] Oct 09 15:34:23 crc kubenswrapper[4719]: I1009 15:34:23.147629 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0358ac72-8c58-4e63-843e-b9eaa35aefdf","Type":"ContainerStarted","Data":"13ee063d070ffe3fbf63028e5cc72668e3cac491ca806847ff84f6b3a509a304"} Oct 09 15:34:23 crc kubenswrapper[4719]: I1009 15:34:23.147755 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0358ac72-8c58-4e63-843e-b9eaa35aefdf","Type":"ContainerStarted","Data":"b4de04f4a5d55e8b77fa54b11dd8effd4017136fa0ae4760205744a9f00000a3"} Oct 09 15:34:23 crc kubenswrapper[4719]: I1009 15:34:23.147766 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0358ac72-8c58-4e63-843e-b9eaa35aefdf","Type":"ContainerStarted","Data":"ac13f70f4a2d7ea95f5c40828d273807f78477a3f841f9d4a2bc9bd122af44e4"} Oct 09 15:34:23 crc kubenswrapper[4719]: I1009 15:34:23.149699 4719 generic.go:334] "Generic (PLEG): container finished" podID="82195049-d846-4c69-a778-b85451e8d485" containerID="4dc445a265df781fb85e53d3b213cdcc351e2ae28b87a95a76e264ff7ab3032f" exitCode=0 Oct 09 15:34:23 crc kubenswrapper[4719]: I1009 15:34:23.149766 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-8nm4m" event={"ID":"82195049-d846-4c69-a778-b85451e8d485","Type":"ContainerDied","Data":"4dc445a265df781fb85e53d3b213cdcc351e2ae28b87a95a76e264ff7ab3032f"} Oct 09 15:34:23 crc kubenswrapper[4719]: I1009 15:34:23.151859 4719 generic.go:334] "Generic (PLEG): container finished" 
podID="726baf17-62cc-40f1-bb7c-204e357465d1" containerID="e7bc012572a6a34b51dd2c192460f991ac66243f04a379944c9b7111d4823f07" exitCode=0 Oct 09 15:34:23 crc kubenswrapper[4719]: I1009 15:34:23.152095 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="95b09495-a75f-42db-ae2c-99ac6e46f039" containerName="prometheus" containerID="cri-o://f24e7defb05b0fe57c59432a6505ceac98d73ecf638246550f93a6336334e527" gracePeriod=600 Oct 09 15:34:23 crc kubenswrapper[4719]: I1009 15:34:23.152207 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-p6rrg" event={"ID":"726baf17-62cc-40f1-bb7c-204e357465d1","Type":"ContainerDied","Data":"e7bc012572a6a34b51dd2c192460f991ac66243f04a379944c9b7111d4823f07"} Oct 09 15:34:23 crc kubenswrapper[4719]: I1009 15:34:23.152255 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="95b09495-a75f-42db-ae2c-99ac6e46f039" containerName="thanos-sidecar" containerID="cri-o://1faf5c2993ec241e120ed50cd7601b60c3b312339069c70768f76853f949a86a" gracePeriod=600 Oct 09 15:34:23 crc kubenswrapper[4719]: I1009 15:34:23.152295 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="95b09495-a75f-42db-ae2c-99ac6e46f039" containerName="config-reloader" containerID="cri-o://ffc4804d631e3f3bd2201f6e4ce6aa2536661ce40e7a85af4bcfffd7ee457f1d" gracePeriod=600 Oct 09 15:34:23 crc kubenswrapper[4719]: I1009 15:34:23.196518 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdrts\" (UniqueName: \"kubernetes.io/projected/8898aee0-de9a-4957-a478-0f322abd395b-kube-api-access-vdrts\") pod \"watcher-f584-account-create-h95kn\" (UID: \"8898aee0-de9a-4957-a478-0f322abd395b\") " pod="openstack/watcher-f584-account-create-h95kn" Oct 09 15:34:23 crc kubenswrapper[4719]: 
I1009 15:34:23.298715 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdrts\" (UniqueName: \"kubernetes.io/projected/8898aee0-de9a-4957-a478-0f322abd395b-kube-api-access-vdrts\") pod \"watcher-f584-account-create-h95kn\" (UID: \"8898aee0-de9a-4957-a478-0f322abd395b\") " pod="openstack/watcher-f584-account-create-h95kn" Oct 09 15:34:23 crc kubenswrapper[4719]: I1009 15:34:23.324939 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdrts\" (UniqueName: \"kubernetes.io/projected/8898aee0-de9a-4957-a478-0f322abd395b-kube-api-access-vdrts\") pod \"watcher-f584-account-create-h95kn\" (UID: \"8898aee0-de9a-4957-a478-0f322abd395b\") " pod="openstack/watcher-f584-account-create-h95kn" Oct 09 15:34:23 crc kubenswrapper[4719]: I1009 15:34:23.333703 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-f584-account-create-h95kn" Oct 09 15:34:23 crc kubenswrapper[4719]: I1009 15:34:23.784952 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-f584-account-create-h95kn"] Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.158938 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.161340 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-f584-account-create-h95kn" event={"ID":"8898aee0-de9a-4957-a478-0f322abd395b","Type":"ContainerStarted","Data":"2e401662b2558ba1d3b213c8bb02e12db250b2df26c89d7630030687fbbfcfdc"} Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.165597 4719 generic.go:334] "Generic (PLEG): container finished" podID="d3a820d9-3c13-47ec-a39e-dea4d60b7536" containerID="72455dcfecc11206bc40520b8d088cd8e9d4106ebbf33631ff25928e9b1dc487" exitCode=0 Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.165731 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d3a820d9-3c13-47ec-a39e-dea4d60b7536","Type":"ContainerDied","Data":"72455dcfecc11206bc40520b8d088cd8e9d4106ebbf33631ff25928e9b1dc487"} Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.170876 4719 generic.go:334] "Generic (PLEG): container finished" podID="1df540c9-8b54-44a5-9c5d-03cf736ee67a" containerID="38009f663fbfc9c961727713c72827abf1225d73ea39016f1ab401dd44130afd" exitCode=0 Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.171069 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"1df540c9-8b54-44a5-9c5d-03cf736ee67a","Type":"ContainerDied","Data":"38009f663fbfc9c961727713c72827abf1225d73ea39016f1ab401dd44130afd"} Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.173059 4719 generic.go:334] "Generic (PLEG): container finished" podID="c8f5a6f9-5554-485d-9aee-47449402e37b" containerID="df5f9951dcb79d436b3ba8bc9106853f92f66edecb4cb1855c4a6cc3746f4ea4" exitCode=0 Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.173096 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"c8f5a6f9-5554-485d-9aee-47449402e37b","Type":"ContainerDied","Data":"df5f9951dcb79d436b3ba8bc9106853f92f66edecb4cb1855c4a6cc3746f4ea4"} Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.178282 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0358ac72-8c58-4e63-843e-b9eaa35aefdf","Type":"ContainerStarted","Data":"d25346f199abd09302897d85d13cc54741b4a7ce71b2554ae9fac9b5a62f13e8"} Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.178327 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0358ac72-8c58-4e63-843e-b9eaa35aefdf","Type":"ContainerStarted","Data":"2acb103640a8b98f40516514162d95b3c14b1efb1a816c26e51d7a10d63c33d8"} Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.178336 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0358ac72-8c58-4e63-843e-b9eaa35aefdf","Type":"ContainerStarted","Data":"a76f7f8a78c3d40c6523d861b8e450a19c41b74daa46be261bbb67c80cf4f4ed"} Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.196464 4719 generic.go:334] "Generic (PLEG): container finished" podID="95b09495-a75f-42db-ae2c-99ac6e46f039" containerID="1faf5c2993ec241e120ed50cd7601b60c3b312339069c70768f76853f949a86a" exitCode=0 Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.196504 4719 generic.go:334] "Generic (PLEG): container finished" podID="95b09495-a75f-42db-ae2c-99ac6e46f039" containerID="ffc4804d631e3f3bd2201f6e4ce6aa2536661ce40e7a85af4bcfffd7ee457f1d" exitCode=0 Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.196516 4719 generic.go:334] "Generic (PLEG): container finished" podID="95b09495-a75f-42db-ae2c-99ac6e46f039" containerID="f24e7defb05b0fe57c59432a6505ceac98d73ecf638246550f93a6336334e527" exitCode=0 Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.196527 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.196603 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"95b09495-a75f-42db-ae2c-99ac6e46f039","Type":"ContainerDied","Data":"1faf5c2993ec241e120ed50cd7601b60c3b312339069c70768f76853f949a86a"} Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.196631 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"95b09495-a75f-42db-ae2c-99ac6e46f039","Type":"ContainerDied","Data":"ffc4804d631e3f3bd2201f6e4ce6aa2536661ce40e7a85af4bcfffd7ee457f1d"} Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.196644 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"95b09495-a75f-42db-ae2c-99ac6e46f039","Type":"ContainerDied","Data":"f24e7defb05b0fe57c59432a6505ceac98d73ecf638246550f93a6336334e527"} Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.196652 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"95b09495-a75f-42db-ae2c-99ac6e46f039","Type":"ContainerDied","Data":"ac05b64485c378330eb8db2f1a399a43343f9b3c55c94fc16a87bf6370cf8763"} Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.196667 4719 scope.go:117] "RemoveContainer" containerID="1faf5c2993ec241e120ed50cd7601b60c3b312339069c70768f76853f949a86a" Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.218761 4719 scope.go:117] "RemoveContainer" containerID="ffc4804d631e3f3bd2201f6e4ce6aa2536661ce40e7a85af4bcfffd7ee457f1d" Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.345944 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/95b09495-a75f-42db-ae2c-99ac6e46f039-tls-assets\") pod \"95b09495-a75f-42db-ae2c-99ac6e46f039\" (UID: 
\"95b09495-a75f-42db-ae2c-99ac6e46f039\") " Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.346012 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/95b09495-a75f-42db-ae2c-99ac6e46f039-web-config\") pod \"95b09495-a75f-42db-ae2c-99ac6e46f039\" (UID: \"95b09495-a75f-42db-ae2c-99ac6e46f039\") " Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.346073 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/95b09495-a75f-42db-ae2c-99ac6e46f039-config\") pod \"95b09495-a75f-42db-ae2c-99ac6e46f039\" (UID: \"95b09495-a75f-42db-ae2c-99ac6e46f039\") " Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.346421 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/95b09495-a75f-42db-ae2c-99ac6e46f039-config-out\") pod \"95b09495-a75f-42db-ae2c-99ac6e46f039\" (UID: \"95b09495-a75f-42db-ae2c-99ac6e46f039\") " Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.346610 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/95b09495-a75f-42db-ae2c-99ac6e46f039-prometheus-metric-storage-rulefiles-0\") pod \"95b09495-a75f-42db-ae2c-99ac6e46f039\" (UID: \"95b09495-a75f-42db-ae2c-99ac6e46f039\") " Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.346758 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tj8j6\" (UniqueName: \"kubernetes.io/projected/95b09495-a75f-42db-ae2c-99ac6e46f039-kube-api-access-tj8j6\") pod \"95b09495-a75f-42db-ae2c-99ac6e46f039\" (UID: \"95b09495-a75f-42db-ae2c-99ac6e46f039\") " Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.347011 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1\") pod \"95b09495-a75f-42db-ae2c-99ac6e46f039\" (UID: \"95b09495-a75f-42db-ae2c-99ac6e46f039\") "
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.347071 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/95b09495-a75f-42db-ae2c-99ac6e46f039-thanos-prometheus-http-client-file\") pod \"95b09495-a75f-42db-ae2c-99ac6e46f039\" (UID: \"95b09495-a75f-42db-ae2c-99ac6e46f039\") "
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.349020 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95b09495-a75f-42db-ae2c-99ac6e46f039-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "95b09495-a75f-42db-ae2c-99ac6e46f039" (UID: "95b09495-a75f-42db-ae2c-99ac6e46f039"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.355040 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95b09495-a75f-42db-ae2c-99ac6e46f039-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "95b09495-a75f-42db-ae2c-99ac6e46f039" (UID: "95b09495-a75f-42db-ae2c-99ac6e46f039"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.360065 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95b09495-a75f-42db-ae2c-99ac6e46f039-kube-api-access-tj8j6" (OuterVolumeSpecName: "kube-api-access-tj8j6") pod "95b09495-a75f-42db-ae2c-99ac6e46f039" (UID: "95b09495-a75f-42db-ae2c-99ac6e46f039"). InnerVolumeSpecName "kube-api-access-tj8j6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.361071 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95b09495-a75f-42db-ae2c-99ac6e46f039-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "95b09495-a75f-42db-ae2c-99ac6e46f039" (UID: "95b09495-a75f-42db-ae2c-99ac6e46f039"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.361435 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95b09495-a75f-42db-ae2c-99ac6e46f039-config" (OuterVolumeSpecName: "config") pod "95b09495-a75f-42db-ae2c-99ac6e46f039" (UID: "95b09495-a75f-42db-ae2c-99ac6e46f039"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.369366 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95b09495-a75f-42db-ae2c-99ac6e46f039-config-out" (OuterVolumeSpecName: "config-out") pod "95b09495-a75f-42db-ae2c-99ac6e46f039" (UID: "95b09495-a75f-42db-ae2c-99ac6e46f039"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.397846 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "95b09495-a75f-42db-ae2c-99ac6e46f039" (UID: "95b09495-a75f-42db-ae2c-99ac6e46f039"). InnerVolumeSpecName "pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1". PluginName "kubernetes.io/csi", VolumeGidValue ""
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.424752 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95b09495-a75f-42db-ae2c-99ac6e46f039-web-config" (OuterVolumeSpecName: "web-config") pod "95b09495-a75f-42db-ae2c-99ac6e46f039" (UID: "95b09495-a75f-42db-ae2c-99ac6e46f039"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.450010 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tj8j6\" (UniqueName: \"kubernetes.io/projected/95b09495-a75f-42db-ae2c-99ac6e46f039-kube-api-access-tj8j6\") on node \"crc\" DevicePath \"\""
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.450089 4719 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1\") on node \"crc\" "
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.450106 4719 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/95b09495-a75f-42db-ae2c-99ac6e46f039-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\""
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.450119 4719 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/95b09495-a75f-42db-ae2c-99ac6e46f039-tls-assets\") on node \"crc\" DevicePath \"\""
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.450131 4719 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/95b09495-a75f-42db-ae2c-99ac6e46f039-web-config\") on node \"crc\" DevicePath \"\""
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.450139 4719 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/95b09495-a75f-42db-ae2c-99ac6e46f039-config\") on node \"crc\" DevicePath \"\""
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.450146 4719 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/95b09495-a75f-42db-ae2c-99ac6e46f039-config-out\") on node \"crc\" DevicePath \"\""
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.450154 4719 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/95b09495-a75f-42db-ae2c-99ac6e46f039-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\""
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.495918 4719 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.496068 4719 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1") on node "crc"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.552029 4719 reconciler_common.go:293] "Volume detached for volume \"pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1\") on node \"crc\" DevicePath \"\""
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.557732 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.569268 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.597975 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"]
Oct 09 15:34:24 crc kubenswrapper[4719]: E1009 15:34:24.599071 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95b09495-a75f-42db-ae2c-99ac6e46f039" containerName="init-config-reloader"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.599094 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="95b09495-a75f-42db-ae2c-99ac6e46f039" containerName="init-config-reloader"
Oct 09 15:34:24 crc kubenswrapper[4719]: E1009 15:34:24.599108 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95b09495-a75f-42db-ae2c-99ac6e46f039" containerName="thanos-sidecar"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.599115 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="95b09495-a75f-42db-ae2c-99ac6e46f039" containerName="thanos-sidecar"
Oct 09 15:34:24 crc kubenswrapper[4719]: E1009 15:34:24.599724 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95b09495-a75f-42db-ae2c-99ac6e46f039" containerName="config-reloader"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.599745 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="95b09495-a75f-42db-ae2c-99ac6e46f039" containerName="config-reloader"
Oct 09 15:34:24 crc kubenswrapper[4719]: E1009 15:34:24.599762 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95b09495-a75f-42db-ae2c-99ac6e46f039" containerName="prometheus"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.599770 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="95b09495-a75f-42db-ae2c-99ac6e46f039" containerName="prometheus"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.600135 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="95b09495-a75f-42db-ae2c-99ac6e46f039" containerName="prometheus"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.606403 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="95b09495-a75f-42db-ae2c-99ac6e46f039" containerName="thanos-sidecar"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.606440 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="95b09495-a75f-42db-ae2c-99ac6e46f039" containerName="config-reloader"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.608386 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.610828 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.611032 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.612174 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.612334 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.617463 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.623193 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-4npj7"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.626173 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.626391 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.761203 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " pod="openstack/prometheus-metric-storage-0"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.761297 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " pod="openstack/prometheus-metric-storage-0"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.761321 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " pod="openstack/prometheus-metric-storage-0"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.761342 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " pod="openstack/prometheus-metric-storage-0"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.761384 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " pod="openstack/prometheus-metric-storage-0"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.761410 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkpd8\" (UniqueName: \"kubernetes.io/projected/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-kube-api-access-fkpd8\") pod \"prometheus-metric-storage-0\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " pod="openstack/prometheus-metric-storage-0"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.761458 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " pod="openstack/prometheus-metric-storage-0"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.761482 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1\") pod \"prometheus-metric-storage-0\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " pod="openstack/prometheus-metric-storage-0"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.761508 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " pod="openstack/prometheus-metric-storage-0"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.761537 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " pod="openstack/prometheus-metric-storage-0"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.761563 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-config\") pod \"prometheus-metric-storage-0\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " pod="openstack/prometheus-metric-storage-0"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.862779 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " pod="openstack/prometheus-metric-storage-0"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.862828 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1\") pod \"prometheus-metric-storage-0\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " pod="openstack/prometheus-metric-storage-0"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.862857 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " pod="openstack/prometheus-metric-storage-0"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.862889 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " pod="openstack/prometheus-metric-storage-0"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.862916 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-config\") pod \"prometheus-metric-storage-0\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " pod="openstack/prometheus-metric-storage-0"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.862938 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " pod="openstack/prometheus-metric-storage-0"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.862976 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " pod="openstack/prometheus-metric-storage-0"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.862996 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " pod="openstack/prometheus-metric-storage-0"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.863020 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " pod="openstack/prometheus-metric-storage-0"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.863040 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " pod="openstack/prometheus-metric-storage-0"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.863070 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkpd8\" (UniqueName: \"kubernetes.io/projected/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-kube-api-access-fkpd8\") pod \"prometheus-metric-storage-0\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " pod="openstack/prometheus-metric-storage-0"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.864077 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " pod="openstack/prometheus-metric-storage-0"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.889975 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " pod="openstack/prometheus-metric-storage-0"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.890071 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " pod="openstack/prometheus-metric-storage-0"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.890071 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " pod="openstack/prometheus-metric-storage-0"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.890132 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " pod="openstack/prometheus-metric-storage-0"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.890194 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-config\") pod \"prometheus-metric-storage-0\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " pod="openstack/prometheus-metric-storage-0"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.890340 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " pod="openstack/prometheus-metric-storage-0"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.890586 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " pod="openstack/prometheus-metric-storage-0"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.890613 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " pod="openstack/prometheus-metric-storage-0"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.891585 4719 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.891636 4719 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1\") pod \"prometheus-metric-storage-0\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/37a51f19e15282ab5032b2bf09c91363092e9b48becd8acf5f5419f3d47a69ff/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.895249 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkpd8\" (UniqueName: \"kubernetes.io/projected/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-kube-api-access-fkpd8\") pod \"prometheus-metric-storage-0\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " pod="openstack/prometheus-metric-storage-0"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.932821 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1\") pod \"prometheus-metric-storage-0\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " pod="openstack/prometheus-metric-storage-0"
Oct 09 15:34:24 crc kubenswrapper[4719]: I1009 15:34:24.953873 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.022224 4719 scope.go:117] "RemoveContainer" containerID="f24e7defb05b0fe57c59432a6505ceac98d73ecf638246550f93a6336334e527"
Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.098063 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-p6rrg"
Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.100469 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-8nm4m"
Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.167241 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hm5ml\" (UniqueName: \"kubernetes.io/projected/726baf17-62cc-40f1-bb7c-204e357465d1-kube-api-access-hm5ml\") pod \"726baf17-62cc-40f1-bb7c-204e357465d1\" (UID: \"726baf17-62cc-40f1-bb7c-204e357465d1\") "
Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.172496 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/726baf17-62cc-40f1-bb7c-204e357465d1-kube-api-access-hm5ml" (OuterVolumeSpecName: "kube-api-access-hm5ml") pod "726baf17-62cc-40f1-bb7c-204e357465d1" (UID: "726baf17-62cc-40f1-bb7c-204e357465d1"). InnerVolumeSpecName "kube-api-access-hm5ml". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.183952 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95b09495-a75f-42db-ae2c-99ac6e46f039" path="/var/lib/kubelet/pods/95b09495-a75f-42db-ae2c-99ac6e46f039/volumes"
Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.191955 4719 scope.go:117] "RemoveContainer" containerID="c71605342389cbf78dd4673d88ba0c916c140e7ccdb2f4ba39746c61fdc45be9"
Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.224812 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-p6rrg" event={"ID":"726baf17-62cc-40f1-bb7c-204e357465d1","Type":"ContainerDied","Data":"1ab2eb78eec8afb4581e4f0eb6e75ecfc0cc95a33e1b6cb06591b10088735311"}
Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.224858 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ab2eb78eec8afb4581e4f0eb6e75ecfc0cc95a33e1b6cb06591b10088735311"
Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.224940 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-p6rrg"
Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.231603 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-f584-account-create-h95kn" event={"ID":"8898aee0-de9a-4957-a478-0f322abd395b","Type":"ContainerStarted","Data":"8adc252ae0feb6c2483c2829e2f3067b3cb04b2cbe66ec68f79ee7d13e96a21c"}
Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.239990 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-8nm4m" event={"ID":"82195049-d846-4c69-a778-b85451e8d485","Type":"ContainerDied","Data":"b7d81dcb7160e5b582c4eb7f27ebb6ac562096aed7f81fb92656deaa9eaed772"}
Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.240037 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7d81dcb7160e5b582c4eb7f27ebb6ac562096aed7f81fb92656deaa9eaed772"
Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.240106 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-8nm4m"
Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.251646 4719 scope.go:117] "RemoveContainer" containerID="1faf5c2993ec241e120ed50cd7601b60c3b312339069c70768f76853f949a86a"
Oct 09 15:34:25 crc kubenswrapper[4719]: E1009 15:34:25.252297 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1faf5c2993ec241e120ed50cd7601b60c3b312339069c70768f76853f949a86a\": container with ID starting with 1faf5c2993ec241e120ed50cd7601b60c3b312339069c70768f76853f949a86a not found: ID does not exist" containerID="1faf5c2993ec241e120ed50cd7601b60c3b312339069c70768f76853f949a86a"
Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.252326 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1faf5c2993ec241e120ed50cd7601b60c3b312339069c70768f76853f949a86a"} err="failed to get container status \"1faf5c2993ec241e120ed50cd7601b60c3b312339069c70768f76853f949a86a\": rpc error: code = NotFound desc = could not find container \"1faf5c2993ec241e120ed50cd7601b60c3b312339069c70768f76853f949a86a\": container with ID starting with 1faf5c2993ec241e120ed50cd7601b60c3b312339069c70768f76853f949a86a not found: ID does not exist"
Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.252369 4719 scope.go:117] "RemoveContainer" containerID="ffc4804d631e3f3bd2201f6e4ce6aa2536661ce40e7a85af4bcfffd7ee457f1d"
Oct 09 15:34:25 crc kubenswrapper[4719]: E1009 15:34:25.252846 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffc4804d631e3f3bd2201f6e4ce6aa2536661ce40e7a85af4bcfffd7ee457f1d\": container with ID starting with ffc4804d631e3f3bd2201f6e4ce6aa2536661ce40e7a85af4bcfffd7ee457f1d not found: ID does not exist" containerID="ffc4804d631e3f3bd2201f6e4ce6aa2536661ce40e7a85af4bcfffd7ee457f1d"
Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.252864 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffc4804d631e3f3bd2201f6e4ce6aa2536661ce40e7a85af4bcfffd7ee457f1d"} err="failed to get container status \"ffc4804d631e3f3bd2201f6e4ce6aa2536661ce40e7a85af4bcfffd7ee457f1d\": rpc error: code = NotFound desc = could not find container \"ffc4804d631e3f3bd2201f6e4ce6aa2536661ce40e7a85af4bcfffd7ee457f1d\": container with ID starting with ffc4804d631e3f3bd2201f6e4ce6aa2536661ce40e7a85af4bcfffd7ee457f1d not found: ID does not exist"
Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.252875 4719 scope.go:117] "RemoveContainer" containerID="f24e7defb05b0fe57c59432a6505ceac98d73ecf638246550f93a6336334e527"
Oct 09 15:34:25 crc kubenswrapper[4719]: E1009 15:34:25.253701 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f24e7defb05b0fe57c59432a6505ceac98d73ecf638246550f93a6336334e527\": container with ID starting with f24e7defb05b0fe57c59432a6505ceac98d73ecf638246550f93a6336334e527 not found: ID does not exist" containerID="f24e7defb05b0fe57c59432a6505ceac98d73ecf638246550f93a6336334e527"
Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.253720 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f24e7defb05b0fe57c59432a6505ceac98d73ecf638246550f93a6336334e527"} err="failed to get container status \"f24e7defb05b0fe57c59432a6505ceac98d73ecf638246550f93a6336334e527\": rpc error: code = NotFound desc = could not find container \"f24e7defb05b0fe57c59432a6505ceac98d73ecf638246550f93a6336334e527\": container with ID starting with f24e7defb05b0fe57c59432a6505ceac98d73ecf638246550f93a6336334e527 not found: ID does not exist"
Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.253736 4719 scope.go:117] "RemoveContainer" containerID="c71605342389cbf78dd4673d88ba0c916c140e7ccdb2f4ba39746c61fdc45be9"
Oct 09 15:34:25 crc kubenswrapper[4719]: E1009 15:34:25.254099 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c71605342389cbf78dd4673d88ba0c916c140e7ccdb2f4ba39746c61fdc45be9\": container with ID starting with c71605342389cbf78dd4673d88ba0c916c140e7ccdb2f4ba39746c61fdc45be9 not found: ID does not exist" containerID="c71605342389cbf78dd4673d88ba0c916c140e7ccdb2f4ba39746c61fdc45be9"
Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.254147 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c71605342389cbf78dd4673d88ba0c916c140e7ccdb2f4ba39746c61fdc45be9"} err="failed to get container status \"c71605342389cbf78dd4673d88ba0c916c140e7ccdb2f4ba39746c61fdc45be9\": rpc error: code = NotFound desc = could not find container \"c71605342389cbf78dd4673d88ba0c916c140e7ccdb2f4ba39746c61fdc45be9\": container with ID starting with c71605342389cbf78dd4673d88ba0c916c140e7ccdb2f4ba39746c61fdc45be9 not found: ID does not exist"
Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.254180 4719 scope.go:117] "RemoveContainer" containerID="1faf5c2993ec241e120ed50cd7601b60c3b312339069c70768f76853f949a86a"
Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.255197 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1faf5c2993ec241e120ed50cd7601b60c3b312339069c70768f76853f949a86a"} err="failed to get container status \"1faf5c2993ec241e120ed50cd7601b60c3b312339069c70768f76853f949a86a\": rpc error: code = NotFound desc = could not find container \"1faf5c2993ec241e120ed50cd7601b60c3b312339069c70768f76853f949a86a\": container with ID starting with 1faf5c2993ec241e120ed50cd7601b60c3b312339069c70768f76853f949a86a not found: ID does not exist"
Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.255710 4719 scope.go:117] "RemoveContainer" containerID="ffc4804d631e3f3bd2201f6e4ce6aa2536661ce40e7a85af4bcfffd7ee457f1d"
Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.256269 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffc4804d631e3f3bd2201f6e4ce6aa2536661ce40e7a85af4bcfffd7ee457f1d"} err="failed to get container status \"ffc4804d631e3f3bd2201f6e4ce6aa2536661ce40e7a85af4bcfffd7ee457f1d\": rpc error: code = NotFound desc = could not find container \"ffc4804d631e3f3bd2201f6e4ce6aa2536661ce40e7a85af4bcfffd7ee457f1d\": container with ID starting with ffc4804d631e3f3bd2201f6e4ce6aa2536661ce40e7a85af4bcfffd7ee457f1d not found: ID does not exist"
Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.256287 4719 scope.go:117] "RemoveContainer" containerID="f24e7defb05b0fe57c59432a6505ceac98d73ecf638246550f93a6336334e527"
Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.256531 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f24e7defb05b0fe57c59432a6505ceac98d73ecf638246550f93a6336334e527"} err="failed to get container status \"f24e7defb05b0fe57c59432a6505ceac98d73ecf638246550f93a6336334e527\": rpc error: code = NotFound desc = could not find container \"f24e7defb05b0fe57c59432a6505ceac98d73ecf638246550f93a6336334e527\": container with ID starting with f24e7defb05b0fe57c59432a6505ceac98d73ecf638246550f93a6336334e527 not found: ID does not exist"
Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.256546 4719 scope.go:117] "RemoveContainer" containerID="c71605342389cbf78dd4673d88ba0c916c140e7ccdb2f4ba39746c61fdc45be9"
Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.256748 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c71605342389cbf78dd4673d88ba0c916c140e7ccdb2f4ba39746c61fdc45be9"} err="failed to get container status \"c71605342389cbf78dd4673d88ba0c916c140e7ccdb2f4ba39746c61fdc45be9\": rpc error: code = NotFound desc = could not find container \"c71605342389cbf78dd4673d88ba0c916c140e7ccdb2f4ba39746c61fdc45be9\": container with ID starting with c71605342389cbf78dd4673d88ba0c916c140e7ccdb2f4ba39746c61fdc45be9 not found: ID does not exist"
Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.256764 4719 scope.go:117] "RemoveContainer" containerID="1faf5c2993ec241e120ed50cd7601b60c3b312339069c70768f76853f949a86a"
Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.258294 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1faf5c2993ec241e120ed50cd7601b60c3b312339069c70768f76853f949a86a"} err="failed to get container status \"1faf5c2993ec241e120ed50cd7601b60c3b312339069c70768f76853f949a86a\": rpc error: code = NotFound desc = could not find container \"1faf5c2993ec241e120ed50cd7601b60c3b312339069c70768f76853f949a86a\": container with ID starting with 1faf5c2993ec241e120ed50cd7601b60c3b312339069c70768f76853f949a86a not found: ID does not exist"
Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.258318 4719 scope.go:117] "RemoveContainer" containerID="ffc4804d631e3f3bd2201f6e4ce6aa2536661ce40e7a85af4bcfffd7ee457f1d"
Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.258670 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffc4804d631e3f3bd2201f6e4ce6aa2536661ce40e7a85af4bcfffd7ee457f1d"} err="failed to get container status \"ffc4804d631e3f3bd2201f6e4ce6aa2536661ce40e7a85af4bcfffd7ee457f1d\": rpc error: code = NotFound desc = could not find container \"ffc4804d631e3f3bd2201f6e4ce6aa2536661ce40e7a85af4bcfffd7ee457f1d\": container with ID starting with ffc4804d631e3f3bd2201f6e4ce6aa2536661ce40e7a85af4bcfffd7ee457f1d not found: ID does not exist"
Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.258714 4719 scope.go:117] "RemoveContainer" containerID="f24e7defb05b0fe57c59432a6505ceac98d73ecf638246550f93a6336334e527"
Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.258988 4719 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f24e7defb05b0fe57c59432a6505ceac98d73ecf638246550f93a6336334e527"} err="failed to get container status \"f24e7defb05b0fe57c59432a6505ceac98d73ecf638246550f93a6336334e527\": rpc error: code = NotFound desc = could not find container \"f24e7defb05b0fe57c59432a6505ceac98d73ecf638246550f93a6336334e527\": container with ID starting with f24e7defb05b0fe57c59432a6505ceac98d73ecf638246550f93a6336334e527 not found: ID does not exist" Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.259014 4719 scope.go:117] "RemoveContainer" containerID="c71605342389cbf78dd4673d88ba0c916c140e7ccdb2f4ba39746c61fdc45be9" Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.259233 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c71605342389cbf78dd4673d88ba0c916c140e7ccdb2f4ba39746c61fdc45be9"} err="failed to get container status \"c71605342389cbf78dd4673d88ba0c916c140e7ccdb2f4ba39746c61fdc45be9\": rpc error: code = NotFound desc = could not find container \"c71605342389cbf78dd4673d88ba0c916c140e7ccdb2f4ba39746c61fdc45be9\": container with ID starting with c71605342389cbf78dd4673d88ba0c916c140e7ccdb2f4ba39746c61fdc45be9 not found: ID does not exist" Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.269277 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5f8w\" (UniqueName: \"kubernetes.io/projected/82195049-d846-4c69-a778-b85451e8d485-kube-api-access-z5f8w\") pod \"82195049-d846-4c69-a778-b85451e8d485\" (UID: \"82195049-d846-4c69-a778-b85451e8d485\") " Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.269933 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hm5ml\" (UniqueName: \"kubernetes.io/projected/726baf17-62cc-40f1-bb7c-204e357465d1-kube-api-access-hm5ml\") on node \"crc\" DevicePath \"\"" Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 
15:34:25.274488 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82195049-d846-4c69-a778-b85451e8d485-kube-api-access-z5f8w" (OuterVolumeSpecName: "kube-api-access-z5f8w") pod "82195049-d846-4c69-a778-b85451e8d485" (UID: "82195049-d846-4c69-a778-b85451e8d485"). InnerVolumeSpecName "kube-api-access-z5f8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.371561 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5f8w\" (UniqueName: \"kubernetes.io/projected/82195049-d846-4c69-a778-b85451e8d485-kube-api-access-z5f8w\") on node \"crc\" DevicePath \"\"" Oct 09 15:34:25 crc kubenswrapper[4719]: I1009 15:34:25.611945 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 09 15:34:26 crc kubenswrapper[4719]: I1009 15:34:26.250918 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d3a820d9-3c13-47ec-a39e-dea4d60b7536","Type":"ContainerStarted","Data":"8611d72af232bf969bfd43aaa526aa9e349e35e4283ddc18128023360f6a6345"} Oct 09 15:34:26 crc kubenswrapper[4719]: I1009 15:34:26.251493 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 09 15:34:26 crc kubenswrapper[4719]: I1009 15:34:26.253735 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"1df540c9-8b54-44a5-9c5d-03cf736ee67a","Type":"ContainerStarted","Data":"cce126d359f8c49940c0dda7e85511a1d1b5e0b61053e56a8eb1db9022e520c5"} Oct 09 15:34:26 crc kubenswrapper[4719]: I1009 15:34:26.253939 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-notifications-server-0" Oct 09 15:34:26 crc kubenswrapper[4719]: I1009 15:34:26.256604 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"c8f5a6f9-5554-485d-9aee-47449402e37b","Type":"ContainerStarted","Data":"9078d3710db1f54276251a04386eaa1cc78f42b433fab5cf22a1239ba1607a6d"} Oct 09 15:34:26 crc kubenswrapper[4719]: I1009 15:34:26.256763 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:34:26 crc kubenswrapper[4719]: I1009 15:34:26.258444 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"936ff8ba-7ad7-4796-af1a-4b1cbf75f560","Type":"ContainerStarted","Data":"145fbdf04b1f275a452075b05971b0e2bd5e559cddecce92628d9397dd991031"} Oct 09 15:34:26 crc kubenswrapper[4719]: I1009 15:34:26.264684 4719 generic.go:334] "Generic (PLEG): container finished" podID="8898aee0-de9a-4957-a478-0f322abd395b" containerID="8adc252ae0feb6c2483c2829e2f3067b3cb04b2cbe66ec68f79ee7d13e96a21c" exitCode=0 Oct 09 15:34:26 crc kubenswrapper[4719]: I1009 15:34:26.264728 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-f584-account-create-h95kn" event={"ID":"8898aee0-de9a-4957-a478-0f322abd395b","Type":"ContainerDied","Data":"8adc252ae0feb6c2483c2829e2f3067b3cb04b2cbe66ec68f79ee7d13e96a21c"} Oct 09 15:34:26 crc kubenswrapper[4719]: I1009 15:34:26.309856 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=52.225358098 podStartE2EDuration="1m0.309819455s" podCreationTimestamp="2025-10-09 15:33:26 +0000 UTC" firstStartedPulling="2025-10-09 15:33:40.086387635 +0000 UTC m=+925.596098920" lastFinishedPulling="2025-10-09 15:33:48.170848992 +0000 UTC m=+933.680560277" observedRunningTime="2025-10-09 15:34:26.286934588 +0000 UTC m=+971.796645883" watchObservedRunningTime="2025-10-09 15:34:26.309819455 +0000 UTC m=+971.819530740" Oct 09 15:34:26 crc kubenswrapper[4719]: I1009 15:34:26.336221 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/rabbitmq-notifications-server-0" podStartSLOduration=52.263797617 podStartE2EDuration="1m0.336190231s" podCreationTimestamp="2025-10-09 15:33:26 +0000 UTC" firstStartedPulling="2025-10-09 15:33:40.484784544 +0000 UTC m=+925.994495829" lastFinishedPulling="2025-10-09 15:33:48.557177158 +0000 UTC m=+934.066888443" observedRunningTime="2025-10-09 15:34:26.326894957 +0000 UTC m=+971.836606242" watchObservedRunningTime="2025-10-09 15:34:26.336190231 +0000 UTC m=+971.845901536" Oct 09 15:34:26 crc kubenswrapper[4719]: I1009 15:34:26.353628 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=53.365664422 podStartE2EDuration="1m1.353609495s" podCreationTimestamp="2025-10-09 15:33:25 +0000 UTC" firstStartedPulling="2025-10-09 15:33:40.97661221 +0000 UTC m=+926.486323485" lastFinishedPulling="2025-10-09 15:33:48.964557273 +0000 UTC m=+934.474268558" observedRunningTime="2025-10-09 15:34:26.351709565 +0000 UTC m=+971.861420860" watchObservedRunningTime="2025-10-09 15:34:26.353609495 +0000 UTC m=+971.863320780" Oct 09 15:34:27 crc kubenswrapper[4719]: I1009 15:34:27.107007 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-p4t6l" podUID="f0151a18-0608-47b9-b58a-7eef9dfaf31b" containerName="ovn-controller" probeResult="failure" output=< Oct 09 15:34:27 crc kubenswrapper[4719]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 09 15:34:27 crc kubenswrapper[4719]: > Oct 09 15:34:27 crc kubenswrapper[4719]: I1009 15:34:27.747700 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-f584-account-create-h95kn" Oct 09 15:34:27 crc kubenswrapper[4719]: I1009 15:34:27.824522 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdrts\" (UniqueName: \"kubernetes.io/projected/8898aee0-de9a-4957-a478-0f322abd395b-kube-api-access-vdrts\") pod \"8898aee0-de9a-4957-a478-0f322abd395b\" (UID: \"8898aee0-de9a-4957-a478-0f322abd395b\") " Oct 09 15:34:27 crc kubenswrapper[4719]: I1009 15:34:27.830878 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8898aee0-de9a-4957-a478-0f322abd395b-kube-api-access-vdrts" (OuterVolumeSpecName: "kube-api-access-vdrts") pod "8898aee0-de9a-4957-a478-0f322abd395b" (UID: "8898aee0-de9a-4957-a478-0f322abd395b"). InnerVolumeSpecName "kube-api-access-vdrts". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:34:27 crc kubenswrapper[4719]: I1009 15:34:27.926759 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdrts\" (UniqueName: \"kubernetes.io/projected/8898aee0-de9a-4957-a478-0f322abd395b-kube-api-access-vdrts\") on node \"crc\" DevicePath \"\"" Oct 09 15:34:28 crc kubenswrapper[4719]: I1009 15:34:28.291542 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-f584-account-create-h95kn" event={"ID":"8898aee0-de9a-4957-a478-0f322abd395b","Type":"ContainerDied","Data":"2e401662b2558ba1d3b213c8bb02e12db250b2df26c89d7630030687fbbfcfdc"} Oct 09 15:34:28 crc kubenswrapper[4719]: I1009 15:34:28.291590 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e401662b2558ba1d3b213c8bb02e12db250b2df26c89d7630030687fbbfcfdc" Oct 09 15:34:28 crc kubenswrapper[4719]: I1009 15:34:28.291589 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-f584-account-create-h95kn" Oct 09 15:34:29 crc kubenswrapper[4719]: I1009 15:34:29.300322 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"936ff8ba-7ad7-4796-af1a-4b1cbf75f560","Type":"ContainerStarted","Data":"338a1afba406d4123764fa09c2c00cb6cc9e088c9b16d3636056291c2b3a2576"} Oct 09 15:34:31 crc kubenswrapper[4719]: I1009 15:34:31.147529 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-fc37-account-create-lfn7t"] Oct 09 15:34:31 crc kubenswrapper[4719]: E1009 15:34:31.148190 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8898aee0-de9a-4957-a478-0f322abd395b" containerName="mariadb-account-create" Oct 09 15:34:31 crc kubenswrapper[4719]: I1009 15:34:31.148204 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="8898aee0-de9a-4957-a478-0f322abd395b" containerName="mariadb-account-create" Oct 09 15:34:31 crc kubenswrapper[4719]: E1009 15:34:31.148218 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82195049-d846-4c69-a778-b85451e8d485" containerName="mariadb-database-create" Oct 09 15:34:31 crc kubenswrapper[4719]: I1009 15:34:31.148225 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="82195049-d846-4c69-a778-b85451e8d485" containerName="mariadb-database-create" Oct 09 15:34:31 crc kubenswrapper[4719]: E1009 15:34:31.148245 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="726baf17-62cc-40f1-bb7c-204e357465d1" containerName="mariadb-database-create" Oct 09 15:34:31 crc kubenswrapper[4719]: I1009 15:34:31.148251 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="726baf17-62cc-40f1-bb7c-204e357465d1" containerName="mariadb-database-create" Oct 09 15:34:31 crc kubenswrapper[4719]: I1009 15:34:31.148409 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="82195049-d846-4c69-a778-b85451e8d485" 
containerName="mariadb-database-create" Oct 09 15:34:31 crc kubenswrapper[4719]: I1009 15:34:31.148420 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="8898aee0-de9a-4957-a478-0f322abd395b" containerName="mariadb-account-create" Oct 09 15:34:31 crc kubenswrapper[4719]: I1009 15:34:31.148439 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="726baf17-62cc-40f1-bb7c-204e357465d1" containerName="mariadb-database-create" Oct 09 15:34:31 crc kubenswrapper[4719]: I1009 15:34:31.149035 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-fc37-account-create-lfn7t" Oct 09 15:34:31 crc kubenswrapper[4719]: I1009 15:34:31.158733 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-fc37-account-create-lfn7t"] Oct 09 15:34:31 crc kubenswrapper[4719]: I1009 15:34:31.159066 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 09 15:34:31 crc kubenswrapper[4719]: I1009 15:34:31.184371 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx9cw\" (UniqueName: \"kubernetes.io/projected/baf7be65-641f-4d6a-a15c-ac903de135ab-kube-api-access-nx9cw\") pod \"placement-fc37-account-create-lfn7t\" (UID: \"baf7be65-641f-4d6a-a15c-ac903de135ab\") " pod="openstack/placement-fc37-account-create-lfn7t" Oct 09 15:34:31 crc kubenswrapper[4719]: I1009 15:34:31.287109 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nx9cw\" (UniqueName: \"kubernetes.io/projected/baf7be65-641f-4d6a-a15c-ac903de135ab-kube-api-access-nx9cw\") pod \"placement-fc37-account-create-lfn7t\" (UID: \"baf7be65-641f-4d6a-a15c-ac903de135ab\") " pod="openstack/placement-fc37-account-create-lfn7t" Oct 09 15:34:31 crc kubenswrapper[4719]: I1009 15:34:31.306836 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-nx9cw\" (UniqueName: \"kubernetes.io/projected/baf7be65-641f-4d6a-a15c-ac903de135ab-kube-api-access-nx9cw\") pod \"placement-fc37-account-create-lfn7t\" (UID: \"baf7be65-641f-4d6a-a15c-ac903de135ab\") " pod="openstack/placement-fc37-account-create-lfn7t" Oct 09 15:34:31 crc kubenswrapper[4719]: I1009 15:34:31.319610 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0358ac72-8c58-4e63-843e-b9eaa35aefdf","Type":"ContainerStarted","Data":"4ac9fd77d7e5cb4cc36e1fa88c536c40ff93dcb63e3de8ebe0313e64bc9ae3d5"} Oct 09 15:34:31 crc kubenswrapper[4719]: I1009 15:34:31.319651 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0358ac72-8c58-4e63-843e-b9eaa35aefdf","Type":"ContainerStarted","Data":"6ed3fc950dbbae4cd6d4d1e2096a915330db9149a7b7e199bb359c68d76f4c8d"} Oct 09 15:34:31 crc kubenswrapper[4719]: I1009 15:34:31.319659 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0358ac72-8c58-4e63-843e-b9eaa35aefdf","Type":"ContainerStarted","Data":"4b87cebb3a721147f04051c630dd0716b13aac691ca55382f3eeeb3df369b726"} Oct 09 15:34:31 crc kubenswrapper[4719]: I1009 15:34:31.319667 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0358ac72-8c58-4e63-843e-b9eaa35aefdf","Type":"ContainerStarted","Data":"84c5d72f6aa5d8b4374af425a9857d3ce515687ba23ffdaad427fa6b2efbf78a"} Oct 09 15:34:31 crc kubenswrapper[4719]: I1009 15:34:31.465528 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-fc37-account-create-lfn7t" Oct 09 15:34:31 crc kubenswrapper[4719]: I1009 15:34:31.936739 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-fc37-account-create-lfn7t"] Oct 09 15:34:31 crc kubenswrapper[4719]: W1009 15:34:31.940531 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbaf7be65_641f_4d6a_a15c_ac903de135ab.slice/crio-191a8aedf756400e80b9e4344afe68198ff24990b53393d9a47b42d4b6cc8f78 WatchSource:0}: Error finding container 191a8aedf756400e80b9e4344afe68198ff24990b53393d9a47b42d4b6cc8f78: Status 404 returned error can't find the container with id 191a8aedf756400e80b9e4344afe68198ff24990b53393d9a47b42d4b6cc8f78 Oct 09 15:34:32 crc kubenswrapper[4719]: I1009 15:34:32.108328 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-p4t6l" podUID="f0151a18-0608-47b9-b58a-7eef9dfaf31b" containerName="ovn-controller" probeResult="failure" output=< Oct 09 15:34:32 crc kubenswrapper[4719]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 09 15:34:32 crc kubenswrapper[4719]: > Oct 09 15:34:32 crc kubenswrapper[4719]: I1009 15:34:32.171002 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-hqfvq" Oct 09 15:34:32 crc kubenswrapper[4719]: I1009 15:34:32.175851 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-hqfvq" Oct 09 15:34:32 crc kubenswrapper[4719]: I1009 15:34:32.327712 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fc37-account-create-lfn7t" event={"ID":"baf7be65-641f-4d6a-a15c-ac903de135ab","Type":"ContainerStarted","Data":"e295f1d7595245f40b732ea498b1e7088f654781f310a36b2ac93249892d8362"} Oct 09 15:34:32 crc kubenswrapper[4719]: I1009 15:34:32.327753 4719 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/placement-fc37-account-create-lfn7t" event={"ID":"baf7be65-641f-4d6a-a15c-ac903de135ab","Type":"ContainerStarted","Data":"191a8aedf756400e80b9e4344afe68198ff24990b53393d9a47b42d4b6cc8f78"} Oct 09 15:34:32 crc kubenswrapper[4719]: I1009 15:34:32.332524 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0358ac72-8c58-4e63-843e-b9eaa35aefdf","Type":"ContainerStarted","Data":"43000da111d9bbe921df6bfa86f9e7f2fcb69dc4768fcd19677efeee4078bcf8"} Oct 09 15:34:32 crc kubenswrapper[4719]: I1009 15:34:32.332567 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0358ac72-8c58-4e63-843e-b9eaa35aefdf","Type":"ContainerStarted","Data":"45cebcbac08970f2b9cadcae2832aa305742d810b66c7b3acb7e8e7cba25a5b8"} Oct 09 15:34:32 crc kubenswrapper[4719]: I1009 15:34:32.345010 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-fc37-account-create-lfn7t" podStartSLOduration=1.344972384 podStartE2EDuration="1.344972384s" podCreationTimestamp="2025-10-09 15:34:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:34:32.340865384 +0000 UTC m=+977.850576689" watchObservedRunningTime="2025-10-09 15:34:32.344972384 +0000 UTC m=+977.854683669" Oct 09 15:34:32 crc kubenswrapper[4719]: I1009 15:34:32.431530 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-p4t6l-config-rbc9j"] Oct 09 15:34:32 crc kubenswrapper[4719]: I1009 15:34:32.432809 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-p4t6l-config-rbc9j" Oct 09 15:34:32 crc kubenswrapper[4719]: I1009 15:34:32.445384 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 09 15:34:32 crc kubenswrapper[4719]: I1009 15:34:32.454471 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-p4t6l-config-rbc9j"] Oct 09 15:34:32 crc kubenswrapper[4719]: I1009 15:34:32.506241 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/217fe184-5e24-4906-8b31-a4ccf54d135a-additional-scripts\") pod \"ovn-controller-p4t6l-config-rbc9j\" (UID: \"217fe184-5e24-4906-8b31-a4ccf54d135a\") " pod="openstack/ovn-controller-p4t6l-config-rbc9j" Oct 09 15:34:32 crc kubenswrapper[4719]: I1009 15:34:32.506311 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/217fe184-5e24-4906-8b31-a4ccf54d135a-var-log-ovn\") pod \"ovn-controller-p4t6l-config-rbc9j\" (UID: \"217fe184-5e24-4906-8b31-a4ccf54d135a\") " pod="openstack/ovn-controller-p4t6l-config-rbc9j" Oct 09 15:34:32 crc kubenswrapper[4719]: I1009 15:34:32.506344 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/217fe184-5e24-4906-8b31-a4ccf54d135a-var-run-ovn\") pod \"ovn-controller-p4t6l-config-rbc9j\" (UID: \"217fe184-5e24-4906-8b31-a4ccf54d135a\") " pod="openstack/ovn-controller-p4t6l-config-rbc9j" Oct 09 15:34:32 crc kubenswrapper[4719]: I1009 15:34:32.506410 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5fzf\" (UniqueName: \"kubernetes.io/projected/217fe184-5e24-4906-8b31-a4ccf54d135a-kube-api-access-m5fzf\") pod 
\"ovn-controller-p4t6l-config-rbc9j\" (UID: \"217fe184-5e24-4906-8b31-a4ccf54d135a\") " pod="openstack/ovn-controller-p4t6l-config-rbc9j" Oct 09 15:34:32 crc kubenswrapper[4719]: I1009 15:34:32.506436 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/217fe184-5e24-4906-8b31-a4ccf54d135a-var-run\") pod \"ovn-controller-p4t6l-config-rbc9j\" (UID: \"217fe184-5e24-4906-8b31-a4ccf54d135a\") " pod="openstack/ovn-controller-p4t6l-config-rbc9j" Oct 09 15:34:32 crc kubenswrapper[4719]: I1009 15:34:32.506482 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/217fe184-5e24-4906-8b31-a4ccf54d135a-scripts\") pod \"ovn-controller-p4t6l-config-rbc9j\" (UID: \"217fe184-5e24-4906-8b31-a4ccf54d135a\") " pod="openstack/ovn-controller-p4t6l-config-rbc9j" Oct 09 15:34:32 crc kubenswrapper[4719]: I1009 15:34:32.607739 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/217fe184-5e24-4906-8b31-a4ccf54d135a-scripts\") pod \"ovn-controller-p4t6l-config-rbc9j\" (UID: \"217fe184-5e24-4906-8b31-a4ccf54d135a\") " pod="openstack/ovn-controller-p4t6l-config-rbc9j" Oct 09 15:34:32 crc kubenswrapper[4719]: I1009 15:34:32.607810 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/217fe184-5e24-4906-8b31-a4ccf54d135a-additional-scripts\") pod \"ovn-controller-p4t6l-config-rbc9j\" (UID: \"217fe184-5e24-4906-8b31-a4ccf54d135a\") " pod="openstack/ovn-controller-p4t6l-config-rbc9j" Oct 09 15:34:32 crc kubenswrapper[4719]: I1009 15:34:32.607872 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/217fe184-5e24-4906-8b31-a4ccf54d135a-var-log-ovn\") pod 
\"ovn-controller-p4t6l-config-rbc9j\" (UID: \"217fe184-5e24-4906-8b31-a4ccf54d135a\") " pod="openstack/ovn-controller-p4t6l-config-rbc9j" Oct 09 15:34:32 crc kubenswrapper[4719]: I1009 15:34:32.607907 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/217fe184-5e24-4906-8b31-a4ccf54d135a-var-run-ovn\") pod \"ovn-controller-p4t6l-config-rbc9j\" (UID: \"217fe184-5e24-4906-8b31-a4ccf54d135a\") " pod="openstack/ovn-controller-p4t6l-config-rbc9j" Oct 09 15:34:32 crc kubenswrapper[4719]: I1009 15:34:32.607961 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5fzf\" (UniqueName: \"kubernetes.io/projected/217fe184-5e24-4906-8b31-a4ccf54d135a-kube-api-access-m5fzf\") pod \"ovn-controller-p4t6l-config-rbc9j\" (UID: \"217fe184-5e24-4906-8b31-a4ccf54d135a\") " pod="openstack/ovn-controller-p4t6l-config-rbc9j" Oct 09 15:34:32 crc kubenswrapper[4719]: I1009 15:34:32.607987 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/217fe184-5e24-4906-8b31-a4ccf54d135a-var-run\") pod \"ovn-controller-p4t6l-config-rbc9j\" (UID: \"217fe184-5e24-4906-8b31-a4ccf54d135a\") " pod="openstack/ovn-controller-p4t6l-config-rbc9j" Oct 09 15:34:32 crc kubenswrapper[4719]: I1009 15:34:32.608239 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/217fe184-5e24-4906-8b31-a4ccf54d135a-var-run\") pod \"ovn-controller-p4t6l-config-rbc9j\" (UID: \"217fe184-5e24-4906-8b31-a4ccf54d135a\") " pod="openstack/ovn-controller-p4t6l-config-rbc9j" Oct 09 15:34:32 crc kubenswrapper[4719]: I1009 15:34:32.608281 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/217fe184-5e24-4906-8b31-a4ccf54d135a-var-run-ovn\") pod \"ovn-controller-p4t6l-config-rbc9j\" (UID: 
\"217fe184-5e24-4906-8b31-a4ccf54d135a\") " pod="openstack/ovn-controller-p4t6l-config-rbc9j" Oct 09 15:34:32 crc kubenswrapper[4719]: I1009 15:34:32.608235 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/217fe184-5e24-4906-8b31-a4ccf54d135a-var-log-ovn\") pod \"ovn-controller-p4t6l-config-rbc9j\" (UID: \"217fe184-5e24-4906-8b31-a4ccf54d135a\") " pod="openstack/ovn-controller-p4t6l-config-rbc9j" Oct 09 15:34:32 crc kubenswrapper[4719]: I1009 15:34:32.609028 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/217fe184-5e24-4906-8b31-a4ccf54d135a-additional-scripts\") pod \"ovn-controller-p4t6l-config-rbc9j\" (UID: \"217fe184-5e24-4906-8b31-a4ccf54d135a\") " pod="openstack/ovn-controller-p4t6l-config-rbc9j" Oct 09 15:34:32 crc kubenswrapper[4719]: I1009 15:34:32.609666 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/217fe184-5e24-4906-8b31-a4ccf54d135a-scripts\") pod \"ovn-controller-p4t6l-config-rbc9j\" (UID: \"217fe184-5e24-4906-8b31-a4ccf54d135a\") " pod="openstack/ovn-controller-p4t6l-config-rbc9j" Oct 09 15:34:32 crc kubenswrapper[4719]: I1009 15:34:32.634371 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5fzf\" (UniqueName: \"kubernetes.io/projected/217fe184-5e24-4906-8b31-a4ccf54d135a-kube-api-access-m5fzf\") pod \"ovn-controller-p4t6l-config-rbc9j\" (UID: \"217fe184-5e24-4906-8b31-a4ccf54d135a\") " pod="openstack/ovn-controller-p4t6l-config-rbc9j" Oct 09 15:34:32 crc kubenswrapper[4719]: I1009 15:34:32.810013 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-p4t6l-config-rbc9j" Oct 09 15:34:33 crc kubenswrapper[4719]: I1009 15:34:33.265439 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-p4t6l-config-rbc9j"] Oct 09 15:34:33 crc kubenswrapper[4719]: W1009 15:34:33.269551 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod217fe184_5e24_4906_8b31_a4ccf54d135a.slice/crio-00da31623a44bd667920e8809fdbcb5b953077f125dc2afb9292b56cd40fbc56 WatchSource:0}: Error finding container 00da31623a44bd667920e8809fdbcb5b953077f125dc2afb9292b56cd40fbc56: Status 404 returned error can't find the container with id 00da31623a44bd667920e8809fdbcb5b953077f125dc2afb9292b56cd40fbc56 Oct 09 15:34:33 crc kubenswrapper[4719]: I1009 15:34:33.343858 4719 generic.go:334] "Generic (PLEG): container finished" podID="baf7be65-641f-4d6a-a15c-ac903de135ab" containerID="e295f1d7595245f40b732ea498b1e7088f654781f310a36b2ac93249892d8362" exitCode=0 Oct 09 15:34:33 crc kubenswrapper[4719]: I1009 15:34:33.344056 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fc37-account-create-lfn7t" event={"ID":"baf7be65-641f-4d6a-a15c-ac903de135ab","Type":"ContainerDied","Data":"e295f1d7595245f40b732ea498b1e7088f654781f310a36b2ac93249892d8362"} Oct 09 15:34:33 crc kubenswrapper[4719]: I1009 15:34:33.358052 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0358ac72-8c58-4e63-843e-b9eaa35aefdf","Type":"ContainerStarted","Data":"703aafdbff918a3280793674ab987c990a4722eaad0708f84db13f3cdfd1fe82"} Oct 09 15:34:33 crc kubenswrapper[4719]: I1009 15:34:33.359894 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-p4t6l-config-rbc9j" event={"ID":"217fe184-5e24-4906-8b31-a4ccf54d135a","Type":"ContainerStarted","Data":"00da31623a44bd667920e8809fdbcb5b953077f125dc2afb9292b56cd40fbc56"} Oct 09 15:34:33 
crc kubenswrapper[4719]: I1009 15:34:33.406434 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=21.614397152 podStartE2EDuration="31.406328213s" podCreationTimestamp="2025-10-09 15:34:02 +0000 UTC" firstStartedPulling="2025-10-09 15:34:20.60021 +0000 UTC m=+966.109921285" lastFinishedPulling="2025-10-09 15:34:30.392141061 +0000 UTC m=+975.901852346" observedRunningTime="2025-10-09 15:34:33.402402598 +0000 UTC m=+978.912113903" watchObservedRunningTime="2025-10-09 15:34:33.406328213 +0000 UTC m=+978.916039498" Oct 09 15:34:33 crc kubenswrapper[4719]: I1009 15:34:33.690440 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bd4bb89d9-649p9"] Oct 09 15:34:33 crc kubenswrapper[4719]: I1009 15:34:33.692686 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bd4bb89d9-649p9" Oct 09 15:34:33 crc kubenswrapper[4719]: I1009 15:34:33.694806 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 09 15:34:33 crc kubenswrapper[4719]: I1009 15:34:33.698996 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bd4bb89d9-649p9"] Oct 09 15:34:33 crc kubenswrapper[4719]: I1009 15:34:33.827557 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/732560f4-c5be-46ab-9266-d59d4fe1a07d-dns-svc\") pod \"dnsmasq-dns-7bd4bb89d9-649p9\" (UID: \"732560f4-c5be-46ab-9266-d59d4fe1a07d\") " pod="openstack/dnsmasq-dns-7bd4bb89d9-649p9" Oct 09 15:34:33 crc kubenswrapper[4719]: I1009 15:34:33.827593 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/732560f4-c5be-46ab-9266-d59d4fe1a07d-config\") pod \"dnsmasq-dns-7bd4bb89d9-649p9\" (UID: 
\"732560f4-c5be-46ab-9266-d59d4fe1a07d\") " pod="openstack/dnsmasq-dns-7bd4bb89d9-649p9" Oct 09 15:34:33 crc kubenswrapper[4719]: I1009 15:34:33.827825 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/732560f4-c5be-46ab-9266-d59d4fe1a07d-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd4bb89d9-649p9\" (UID: \"732560f4-c5be-46ab-9266-d59d4fe1a07d\") " pod="openstack/dnsmasq-dns-7bd4bb89d9-649p9" Oct 09 15:34:33 crc kubenswrapper[4719]: I1009 15:34:33.827888 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/732560f4-c5be-46ab-9266-d59d4fe1a07d-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd4bb89d9-649p9\" (UID: \"732560f4-c5be-46ab-9266-d59d4fe1a07d\") " pod="openstack/dnsmasq-dns-7bd4bb89d9-649p9" Oct 09 15:34:33 crc kubenswrapper[4719]: I1009 15:34:33.827988 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjrhl\" (UniqueName: \"kubernetes.io/projected/732560f4-c5be-46ab-9266-d59d4fe1a07d-kube-api-access-qjrhl\") pod \"dnsmasq-dns-7bd4bb89d9-649p9\" (UID: \"732560f4-c5be-46ab-9266-d59d4fe1a07d\") " pod="openstack/dnsmasq-dns-7bd4bb89d9-649p9" Oct 09 15:34:33 crc kubenswrapper[4719]: I1009 15:34:33.828191 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/732560f4-c5be-46ab-9266-d59d4fe1a07d-dns-swift-storage-0\") pod \"dnsmasq-dns-7bd4bb89d9-649p9\" (UID: \"732560f4-c5be-46ab-9266-d59d4fe1a07d\") " pod="openstack/dnsmasq-dns-7bd4bb89d9-649p9" Oct 09 15:34:33 crc kubenswrapper[4719]: I1009 15:34:33.930370 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/732560f4-c5be-46ab-9266-d59d4fe1a07d-dns-swift-storage-0\") pod \"dnsmasq-dns-7bd4bb89d9-649p9\" (UID: \"732560f4-c5be-46ab-9266-d59d4fe1a07d\") " pod="openstack/dnsmasq-dns-7bd4bb89d9-649p9" Oct 09 15:34:33 crc kubenswrapper[4719]: I1009 15:34:33.930483 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/732560f4-c5be-46ab-9266-d59d4fe1a07d-dns-svc\") pod \"dnsmasq-dns-7bd4bb89d9-649p9\" (UID: \"732560f4-c5be-46ab-9266-d59d4fe1a07d\") " pod="openstack/dnsmasq-dns-7bd4bb89d9-649p9" Oct 09 15:34:33 crc kubenswrapper[4719]: I1009 15:34:33.930511 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/732560f4-c5be-46ab-9266-d59d4fe1a07d-config\") pod \"dnsmasq-dns-7bd4bb89d9-649p9\" (UID: \"732560f4-c5be-46ab-9266-d59d4fe1a07d\") " pod="openstack/dnsmasq-dns-7bd4bb89d9-649p9" Oct 09 15:34:33 crc kubenswrapper[4719]: I1009 15:34:33.930573 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/732560f4-c5be-46ab-9266-d59d4fe1a07d-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd4bb89d9-649p9\" (UID: \"732560f4-c5be-46ab-9266-d59d4fe1a07d\") " pod="openstack/dnsmasq-dns-7bd4bb89d9-649p9" Oct 09 15:34:33 crc kubenswrapper[4719]: I1009 15:34:33.930599 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/732560f4-c5be-46ab-9266-d59d4fe1a07d-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd4bb89d9-649p9\" (UID: \"732560f4-c5be-46ab-9266-d59d4fe1a07d\") " pod="openstack/dnsmasq-dns-7bd4bb89d9-649p9" Oct 09 15:34:33 crc kubenswrapper[4719]: I1009 15:34:33.930629 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjrhl\" (UniqueName: 
\"kubernetes.io/projected/732560f4-c5be-46ab-9266-d59d4fe1a07d-kube-api-access-qjrhl\") pod \"dnsmasq-dns-7bd4bb89d9-649p9\" (UID: \"732560f4-c5be-46ab-9266-d59d4fe1a07d\") " pod="openstack/dnsmasq-dns-7bd4bb89d9-649p9" Oct 09 15:34:33 crc kubenswrapper[4719]: I1009 15:34:33.931539 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/732560f4-c5be-46ab-9266-d59d4fe1a07d-dns-swift-storage-0\") pod \"dnsmasq-dns-7bd4bb89d9-649p9\" (UID: \"732560f4-c5be-46ab-9266-d59d4fe1a07d\") " pod="openstack/dnsmasq-dns-7bd4bb89d9-649p9" Oct 09 15:34:33 crc kubenswrapper[4719]: I1009 15:34:33.931537 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/732560f4-c5be-46ab-9266-d59d4fe1a07d-dns-svc\") pod \"dnsmasq-dns-7bd4bb89d9-649p9\" (UID: \"732560f4-c5be-46ab-9266-d59d4fe1a07d\") " pod="openstack/dnsmasq-dns-7bd4bb89d9-649p9" Oct 09 15:34:33 crc kubenswrapper[4719]: I1009 15:34:33.931763 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/732560f4-c5be-46ab-9266-d59d4fe1a07d-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd4bb89d9-649p9\" (UID: \"732560f4-c5be-46ab-9266-d59d4fe1a07d\") " pod="openstack/dnsmasq-dns-7bd4bb89d9-649p9" Oct 09 15:34:33 crc kubenswrapper[4719]: I1009 15:34:33.931781 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/732560f4-c5be-46ab-9266-d59d4fe1a07d-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd4bb89d9-649p9\" (UID: \"732560f4-c5be-46ab-9266-d59d4fe1a07d\") " pod="openstack/dnsmasq-dns-7bd4bb89d9-649p9" Oct 09 15:34:33 crc kubenswrapper[4719]: I1009 15:34:33.932081 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/732560f4-c5be-46ab-9266-d59d4fe1a07d-config\") pod 
\"dnsmasq-dns-7bd4bb89d9-649p9\" (UID: \"732560f4-c5be-46ab-9266-d59d4fe1a07d\") " pod="openstack/dnsmasq-dns-7bd4bb89d9-649p9" Oct 09 15:34:33 crc kubenswrapper[4719]: I1009 15:34:33.960827 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjrhl\" (UniqueName: \"kubernetes.io/projected/732560f4-c5be-46ab-9266-d59d4fe1a07d-kube-api-access-qjrhl\") pod \"dnsmasq-dns-7bd4bb89d9-649p9\" (UID: \"732560f4-c5be-46ab-9266-d59d4fe1a07d\") " pod="openstack/dnsmasq-dns-7bd4bb89d9-649p9" Oct 09 15:34:34 crc kubenswrapper[4719]: I1009 15:34:34.022993 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bd4bb89d9-649p9" Oct 09 15:34:34 crc kubenswrapper[4719]: I1009 15:34:34.367897 4719 generic.go:334] "Generic (PLEG): container finished" podID="936ff8ba-7ad7-4796-af1a-4b1cbf75f560" containerID="338a1afba406d4123764fa09c2c00cb6cc9e088c9b16d3636056291c2b3a2576" exitCode=0 Oct 09 15:34:34 crc kubenswrapper[4719]: I1009 15:34:34.367973 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"936ff8ba-7ad7-4796-af1a-4b1cbf75f560","Type":"ContainerDied","Data":"338a1afba406d4123764fa09c2c00cb6cc9e088c9b16d3636056291c2b3a2576"} Oct 09 15:34:34 crc kubenswrapper[4719]: I1009 15:34:34.370616 4719 generic.go:334] "Generic (PLEG): container finished" podID="217fe184-5e24-4906-8b31-a4ccf54d135a" containerID="4dd880f5549a37e3c5798cffed123243f1447b4dbaf302745557799f6c966e66" exitCode=0 Oct 09 15:34:34 crc kubenswrapper[4719]: I1009 15:34:34.370661 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-p4t6l-config-rbc9j" event={"ID":"217fe184-5e24-4906-8b31-a4ccf54d135a","Type":"ContainerDied","Data":"4dd880f5549a37e3c5798cffed123243f1447b4dbaf302745557799f6c966e66"} Oct 09 15:34:34 crc kubenswrapper[4719]: I1009 15:34:34.553053 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-7bd4bb89d9-649p9"] Oct 09 15:34:34 crc kubenswrapper[4719]: I1009 15:34:34.799039 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-fc37-account-create-lfn7t" Oct 09 15:34:34 crc kubenswrapper[4719]: I1009 15:34:34.951705 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nx9cw\" (UniqueName: \"kubernetes.io/projected/baf7be65-641f-4d6a-a15c-ac903de135ab-kube-api-access-nx9cw\") pod \"baf7be65-641f-4d6a-a15c-ac903de135ab\" (UID: \"baf7be65-641f-4d6a-a15c-ac903de135ab\") " Oct 09 15:34:34 crc kubenswrapper[4719]: I1009 15:34:34.955661 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baf7be65-641f-4d6a-a15c-ac903de135ab-kube-api-access-nx9cw" (OuterVolumeSpecName: "kube-api-access-nx9cw") pod "baf7be65-641f-4d6a-a15c-ac903de135ab" (UID: "baf7be65-641f-4d6a-a15c-ac903de135ab"). InnerVolumeSpecName "kube-api-access-nx9cw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:34:35 crc kubenswrapper[4719]: I1009 15:34:35.053737 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nx9cw\" (UniqueName: \"kubernetes.io/projected/baf7be65-641f-4d6a-a15c-ac903de135ab-kube-api-access-nx9cw\") on node \"crc\" DevicePath \"\"" Oct 09 15:34:35 crc kubenswrapper[4719]: I1009 15:34:35.384491 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-fc37-account-create-lfn7t" Oct 09 15:34:35 crc kubenswrapper[4719]: I1009 15:34:35.384478 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fc37-account-create-lfn7t" event={"ID":"baf7be65-641f-4d6a-a15c-ac903de135ab","Type":"ContainerDied","Data":"191a8aedf756400e80b9e4344afe68198ff24990b53393d9a47b42d4b6cc8f78"} Oct 09 15:34:35 crc kubenswrapper[4719]: I1009 15:34:35.384633 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="191a8aedf756400e80b9e4344afe68198ff24990b53393d9a47b42d4b6cc8f78" Oct 09 15:34:35 crc kubenswrapper[4719]: I1009 15:34:35.386558 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"936ff8ba-7ad7-4796-af1a-4b1cbf75f560","Type":"ContainerStarted","Data":"a251b00040279e44f69da7e525e52b3c57cc34b753966fadd377ad44ac45289d"} Oct 09 15:34:35 crc kubenswrapper[4719]: I1009 15:34:35.388427 4719 generic.go:334] "Generic (PLEG): container finished" podID="732560f4-c5be-46ab-9266-d59d4fe1a07d" containerID="1a75b05f41b40aa4f726787413be2797ea0145036c5b1bab14a954912931dd23" exitCode=0 Oct 09 15:34:35 crc kubenswrapper[4719]: I1009 15:34:35.388549 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd4bb89d9-649p9" event={"ID":"732560f4-c5be-46ab-9266-d59d4fe1a07d","Type":"ContainerDied","Data":"1a75b05f41b40aa4f726787413be2797ea0145036c5b1bab14a954912931dd23"} Oct 09 15:34:35 crc kubenswrapper[4719]: I1009 15:34:35.388628 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd4bb89d9-649p9" event={"ID":"732560f4-c5be-46ab-9266-d59d4fe1a07d","Type":"ContainerStarted","Data":"1259378a83e9331de5a7ba5d07ef0bbcb12f764567c74f67cbde44e3b000038d"} Oct 09 15:34:35 crc kubenswrapper[4719]: I1009 15:34:35.757689 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-p4t6l-config-rbc9j" Oct 09 15:34:35 crc kubenswrapper[4719]: I1009 15:34:35.873096 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/217fe184-5e24-4906-8b31-a4ccf54d135a-var-run-ovn\") pod \"217fe184-5e24-4906-8b31-a4ccf54d135a\" (UID: \"217fe184-5e24-4906-8b31-a4ccf54d135a\") " Oct 09 15:34:35 crc kubenswrapper[4719]: I1009 15:34:35.873176 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/217fe184-5e24-4906-8b31-a4ccf54d135a-additional-scripts\") pod \"217fe184-5e24-4906-8b31-a4ccf54d135a\" (UID: \"217fe184-5e24-4906-8b31-a4ccf54d135a\") " Oct 09 15:34:35 crc kubenswrapper[4719]: I1009 15:34:35.873239 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/217fe184-5e24-4906-8b31-a4ccf54d135a-var-run\") pod \"217fe184-5e24-4906-8b31-a4ccf54d135a\" (UID: \"217fe184-5e24-4906-8b31-a4ccf54d135a\") " Oct 09 15:34:35 crc kubenswrapper[4719]: I1009 15:34:35.873284 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5fzf\" (UniqueName: \"kubernetes.io/projected/217fe184-5e24-4906-8b31-a4ccf54d135a-kube-api-access-m5fzf\") pod \"217fe184-5e24-4906-8b31-a4ccf54d135a\" (UID: \"217fe184-5e24-4906-8b31-a4ccf54d135a\") " Oct 09 15:34:35 crc kubenswrapper[4719]: I1009 15:34:35.873378 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/217fe184-5e24-4906-8b31-a4ccf54d135a-var-run" (OuterVolumeSpecName: "var-run") pod "217fe184-5e24-4906-8b31-a4ccf54d135a" (UID: "217fe184-5e24-4906-8b31-a4ccf54d135a"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 15:34:35 crc kubenswrapper[4719]: I1009 15:34:35.873418 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/217fe184-5e24-4906-8b31-a4ccf54d135a-scripts\") pod \"217fe184-5e24-4906-8b31-a4ccf54d135a\" (UID: \"217fe184-5e24-4906-8b31-a4ccf54d135a\") " Oct 09 15:34:35 crc kubenswrapper[4719]: I1009 15:34:35.873295 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/217fe184-5e24-4906-8b31-a4ccf54d135a-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "217fe184-5e24-4906-8b31-a4ccf54d135a" (UID: "217fe184-5e24-4906-8b31-a4ccf54d135a"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 15:34:35 crc kubenswrapper[4719]: I1009 15:34:35.873479 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/217fe184-5e24-4906-8b31-a4ccf54d135a-var-log-ovn\") pod \"217fe184-5e24-4906-8b31-a4ccf54d135a\" (UID: \"217fe184-5e24-4906-8b31-a4ccf54d135a\") " Oct 09 15:34:35 crc kubenswrapper[4719]: I1009 15:34:35.873686 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/217fe184-5e24-4906-8b31-a4ccf54d135a-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "217fe184-5e24-4906-8b31-a4ccf54d135a" (UID: "217fe184-5e24-4906-8b31-a4ccf54d135a"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 15:34:35 crc kubenswrapper[4719]: I1009 15:34:35.873980 4719 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/217fe184-5e24-4906-8b31-a4ccf54d135a-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 09 15:34:35 crc kubenswrapper[4719]: I1009 15:34:35.873998 4719 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/217fe184-5e24-4906-8b31-a4ccf54d135a-var-run\") on node \"crc\" DevicePath \"\"" Oct 09 15:34:35 crc kubenswrapper[4719]: I1009 15:34:35.874007 4719 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/217fe184-5e24-4906-8b31-a4ccf54d135a-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 09 15:34:35 crc kubenswrapper[4719]: I1009 15:34:35.874718 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/217fe184-5e24-4906-8b31-a4ccf54d135a-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "217fe184-5e24-4906-8b31-a4ccf54d135a" (UID: "217fe184-5e24-4906-8b31-a4ccf54d135a"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:34:35 crc kubenswrapper[4719]: I1009 15:34:35.875051 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/217fe184-5e24-4906-8b31-a4ccf54d135a-scripts" (OuterVolumeSpecName: "scripts") pod "217fe184-5e24-4906-8b31-a4ccf54d135a" (UID: "217fe184-5e24-4906-8b31-a4ccf54d135a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:34:35 crc kubenswrapper[4719]: I1009 15:34:35.880182 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/217fe184-5e24-4906-8b31-a4ccf54d135a-kube-api-access-m5fzf" (OuterVolumeSpecName: "kube-api-access-m5fzf") pod "217fe184-5e24-4906-8b31-a4ccf54d135a" (UID: "217fe184-5e24-4906-8b31-a4ccf54d135a"). InnerVolumeSpecName "kube-api-access-m5fzf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:34:35 crc kubenswrapper[4719]: I1009 15:34:35.975828 4719 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/217fe184-5e24-4906-8b31-a4ccf54d135a-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 15:34:35 crc kubenswrapper[4719]: I1009 15:34:35.975861 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5fzf\" (UniqueName: \"kubernetes.io/projected/217fe184-5e24-4906-8b31-a4ccf54d135a-kube-api-access-m5fzf\") on node \"crc\" DevicePath \"\"" Oct 09 15:34:35 crc kubenswrapper[4719]: I1009 15:34:35.975872 4719 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/217fe184-5e24-4906-8b31-a4ccf54d135a-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 15:34:36 crc kubenswrapper[4719]: I1009 15:34:36.398668 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd4bb89d9-649p9" event={"ID":"732560f4-c5be-46ab-9266-d59d4fe1a07d","Type":"ContainerStarted","Data":"6e47fa6062535038986c21a5044357e21d36839618c053e1c5ec902baa5f6aaf"} Oct 09 15:34:36 crc kubenswrapper[4719]: I1009 15:34:36.399379 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7bd4bb89d9-649p9" Oct 09 15:34:36 crc kubenswrapper[4719]: I1009 15:34:36.401440 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-p4t6l-config-rbc9j" 
event={"ID":"217fe184-5e24-4906-8b31-a4ccf54d135a","Type":"ContainerDied","Data":"00da31623a44bd667920e8809fdbcb5b953077f125dc2afb9292b56cd40fbc56"} Oct 09 15:34:36 crc kubenswrapper[4719]: I1009 15:34:36.401474 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00da31623a44bd667920e8809fdbcb5b953077f125dc2afb9292b56cd40fbc56" Oct 09 15:34:36 crc kubenswrapper[4719]: I1009 15:34:36.401516 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-p4t6l-config-rbc9j" Oct 09 15:34:36 crc kubenswrapper[4719]: I1009 15:34:36.427487 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7bd4bb89d9-649p9" podStartSLOduration=3.427463187 podStartE2EDuration="3.427463187s" podCreationTimestamp="2025-10-09 15:34:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:34:36.422981614 +0000 UTC m=+981.932692919" watchObservedRunningTime="2025-10-09 15:34:36.427463187 +0000 UTC m=+981.937174462" Oct 09 15:34:36 crc kubenswrapper[4719]: I1009 15:34:36.858152 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-p4t6l-config-rbc9j"] Oct 09 15:34:36 crc kubenswrapper[4719]: I1009 15:34:36.871539 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-p4t6l-config-rbc9j"] Oct 09 15:34:36 crc kubenswrapper[4719]: I1009 15:34:36.976967 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 15:34:36 crc kubenswrapper[4719]: I1009 15:34:36.977070 4719 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 15:34:37 crc kubenswrapper[4719]: I1009 15:34:37.127381 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-p4t6l" Oct 09 15:34:37 crc kubenswrapper[4719]: I1009 15:34:37.170457 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="217fe184-5e24-4906-8b31-a4ccf54d135a" path="/var/lib/kubelet/pods/217fe184-5e24-4906-8b31-a4ccf54d135a/volumes" Oct 09 15:34:37 crc kubenswrapper[4719]: I1009 15:34:37.195845 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="c8f5a6f9-5554-485d-9aee-47449402e37b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.108:5671: connect: connection refused" Oct 09 15:34:37 crc kubenswrapper[4719]: I1009 15:34:37.567636 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="d3a820d9-3c13-47ec-a39e-dea4d60b7536" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.109:5671: connect: connection refused" Oct 09 15:34:37 crc kubenswrapper[4719]: I1009 15:34:37.870620 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-notifications-server-0" podUID="1df540c9-8b54-44a5-9c5d-03cf736ee67a" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.110:5671: connect: connection refused" Oct 09 15:34:38 crc kubenswrapper[4719]: I1009 15:34:38.419186 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"936ff8ba-7ad7-4796-af1a-4b1cbf75f560","Type":"ContainerStarted","Data":"a010c0eed107a2550b83adc1060a991b99bc6bfc506ebd0f7dbc9e1b3b961505"} Oct 09 15:34:38 crc 
kubenswrapper[4719]: I1009 15:34:38.419550 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"936ff8ba-7ad7-4796-af1a-4b1cbf75f560","Type":"ContainerStarted","Data":"68ee72b70100c95f49d5c64a588f6a9104cf4ddfe72e3dd2e724807481956bd4"} Oct 09 15:34:38 crc kubenswrapper[4719]: I1009 15:34:38.486016 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=14.485997747 podStartE2EDuration="14.485997747s" podCreationTimestamp="2025-10-09 15:34:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:34:38.474511002 +0000 UTC m=+983.984222307" watchObservedRunningTime="2025-10-09 15:34:38.485997747 +0000 UTC m=+983.995709032" Oct 09 15:34:39 crc kubenswrapper[4719]: I1009 15:34:39.954939 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 09 15:34:39 crc kubenswrapper[4719]: I1009 15:34:39.955115 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 09 15:34:39 crc kubenswrapper[4719]: I1009 15:34:39.960601 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 09 15:34:40 crc kubenswrapper[4719]: I1009 15:34:40.440686 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 09 15:34:40 crc kubenswrapper[4719]: I1009 15:34:40.819895 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-c1e6-account-create-wc9df"] Oct 09 15:34:40 crc kubenswrapper[4719]: E1009 15:34:40.821008 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="217fe184-5e24-4906-8b31-a4ccf54d135a" containerName="ovn-config" Oct 09 15:34:40 crc kubenswrapper[4719]: I1009 
15:34:40.821034 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="217fe184-5e24-4906-8b31-a4ccf54d135a" containerName="ovn-config" Oct 09 15:34:40 crc kubenswrapper[4719]: E1009 15:34:40.821069 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baf7be65-641f-4d6a-a15c-ac903de135ab" containerName="mariadb-account-create" Oct 09 15:34:40 crc kubenswrapper[4719]: I1009 15:34:40.821078 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="baf7be65-641f-4d6a-a15c-ac903de135ab" containerName="mariadb-account-create" Oct 09 15:34:40 crc kubenswrapper[4719]: I1009 15:34:40.821404 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="baf7be65-641f-4d6a-a15c-ac903de135ab" containerName="mariadb-account-create" Oct 09 15:34:40 crc kubenswrapper[4719]: I1009 15:34:40.821437 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="217fe184-5e24-4906-8b31-a4ccf54d135a" containerName="ovn-config" Oct 09 15:34:40 crc kubenswrapper[4719]: I1009 15:34:40.822287 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-c1e6-account-create-wc9df" Oct 09 15:34:40 crc kubenswrapper[4719]: I1009 15:34:40.825667 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 09 15:34:40 crc kubenswrapper[4719]: I1009 15:34:40.833864 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c1e6-account-create-wc9df"] Oct 09 15:34:40 crc kubenswrapper[4719]: I1009 15:34:40.955874 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtqfp\" (UniqueName: \"kubernetes.io/projected/a14bea55-0b69-471d-8bf2-86963f482288-kube-api-access-rtqfp\") pod \"keystone-c1e6-account-create-wc9df\" (UID: \"a14bea55-0b69-471d-8bf2-86963f482288\") " pod="openstack/keystone-c1e6-account-create-wc9df" Oct 09 15:34:41 crc kubenswrapper[4719]: I1009 15:34:41.057944 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtqfp\" (UniqueName: \"kubernetes.io/projected/a14bea55-0b69-471d-8bf2-86963f482288-kube-api-access-rtqfp\") pod \"keystone-c1e6-account-create-wc9df\" (UID: \"a14bea55-0b69-471d-8bf2-86963f482288\") " pod="openstack/keystone-c1e6-account-create-wc9df" Oct 09 15:34:41 crc kubenswrapper[4719]: I1009 15:34:41.076804 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtqfp\" (UniqueName: \"kubernetes.io/projected/a14bea55-0b69-471d-8bf2-86963f482288-kube-api-access-rtqfp\") pod \"keystone-c1e6-account-create-wc9df\" (UID: \"a14bea55-0b69-471d-8bf2-86963f482288\") " pod="openstack/keystone-c1e6-account-create-wc9df" Oct 09 15:34:41 crc kubenswrapper[4719]: I1009 15:34:41.150944 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-c1e6-account-create-wc9df" Oct 09 15:34:41 crc kubenswrapper[4719]: I1009 15:34:41.633161 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c1e6-account-create-wc9df"] Oct 09 15:34:42 crc kubenswrapper[4719]: I1009 15:34:42.455220 4719 generic.go:334] "Generic (PLEG): container finished" podID="a14bea55-0b69-471d-8bf2-86963f482288" containerID="f4a393ba44362546342ffae25032fec389bab28ffef9c838dd19f0572e59fc1a" exitCode=0 Oct 09 15:34:42 crc kubenswrapper[4719]: I1009 15:34:42.455330 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c1e6-account-create-wc9df" event={"ID":"a14bea55-0b69-471d-8bf2-86963f482288","Type":"ContainerDied","Data":"f4a393ba44362546342ffae25032fec389bab28ffef9c838dd19f0572e59fc1a"} Oct 09 15:34:42 crc kubenswrapper[4719]: I1009 15:34:42.455648 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c1e6-account-create-wc9df" event={"ID":"a14bea55-0b69-471d-8bf2-86963f482288","Type":"ContainerStarted","Data":"a9b269e0c7b24fa152ab9ce0320dc4751d804107a4e5e38c75fcf5a7abe4208b"} Oct 09 15:34:43 crc kubenswrapper[4719]: I1009 15:34:43.805173 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-c1e6-account-create-wc9df" Oct 09 15:34:43 crc kubenswrapper[4719]: I1009 15:34:43.910174 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtqfp\" (UniqueName: \"kubernetes.io/projected/a14bea55-0b69-471d-8bf2-86963f482288-kube-api-access-rtqfp\") pod \"a14bea55-0b69-471d-8bf2-86963f482288\" (UID: \"a14bea55-0b69-471d-8bf2-86963f482288\") " Oct 09 15:34:43 crc kubenswrapper[4719]: I1009 15:34:43.915659 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a14bea55-0b69-471d-8bf2-86963f482288-kube-api-access-rtqfp" (OuterVolumeSpecName: "kube-api-access-rtqfp") pod "a14bea55-0b69-471d-8bf2-86963f482288" (UID: "a14bea55-0b69-471d-8bf2-86963f482288"). InnerVolumeSpecName "kube-api-access-rtqfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:34:44 crc kubenswrapper[4719]: I1009 15:34:44.012769 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtqfp\" (UniqueName: \"kubernetes.io/projected/a14bea55-0b69-471d-8bf2-86963f482288-kube-api-access-rtqfp\") on node \"crc\" DevicePath \"\"" Oct 09 15:34:44 crc kubenswrapper[4719]: I1009 15:34:44.025469 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7bd4bb89d9-649p9" Oct 09 15:34:44 crc kubenswrapper[4719]: I1009 15:34:44.079788 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58866ff6f5-kd5d5"] Oct 09 15:34:44 crc kubenswrapper[4719]: I1009 15:34:44.080073 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58866ff6f5-kd5d5" podUID="b0ebf29d-d040-4829-912f-e8bd998cf0da" containerName="dnsmasq-dns" containerID="cri-o://a71c7efe995b43ccf47c9b1b6657313bac942a4bd854c2ad46fca8f6d4d03052" gracePeriod=10 Oct 09 15:34:44 crc kubenswrapper[4719]: I1009 15:34:44.475720 4719 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/keystone-c1e6-account-create-wc9df" event={"ID":"a14bea55-0b69-471d-8bf2-86963f482288","Type":"ContainerDied","Data":"a9b269e0c7b24fa152ab9ce0320dc4751d804107a4e5e38c75fcf5a7abe4208b"} Oct 09 15:34:44 crc kubenswrapper[4719]: I1009 15:34:44.476206 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9b269e0c7b24fa152ab9ce0320dc4751d804107a4e5e38c75fcf5a7abe4208b" Oct 09 15:34:44 crc kubenswrapper[4719]: I1009 15:34:44.475732 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c1e6-account-create-wc9df" Oct 09 15:34:44 crc kubenswrapper[4719]: I1009 15:34:44.478893 4719 generic.go:334] "Generic (PLEG): container finished" podID="b0ebf29d-d040-4829-912f-e8bd998cf0da" containerID="a71c7efe995b43ccf47c9b1b6657313bac942a4bd854c2ad46fca8f6d4d03052" exitCode=0 Oct 09 15:34:44 crc kubenswrapper[4719]: I1009 15:34:44.478954 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58866ff6f5-kd5d5" event={"ID":"b0ebf29d-d040-4829-912f-e8bd998cf0da","Type":"ContainerDied","Data":"a71c7efe995b43ccf47c9b1b6657313bac942a4bd854c2ad46fca8f6d4d03052"} Oct 09 15:34:44 crc kubenswrapper[4719]: I1009 15:34:44.518547 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58866ff6f5-kd5d5" Oct 09 15:34:44 crc kubenswrapper[4719]: I1009 15:34:44.622281 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0ebf29d-d040-4829-912f-e8bd998cf0da-ovsdbserver-sb\") pod \"b0ebf29d-d040-4829-912f-e8bd998cf0da\" (UID: \"b0ebf29d-d040-4829-912f-e8bd998cf0da\") " Oct 09 15:34:44 crc kubenswrapper[4719]: I1009 15:34:44.622393 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0ebf29d-d040-4829-912f-e8bd998cf0da-ovsdbserver-nb\") pod \"b0ebf29d-d040-4829-912f-e8bd998cf0da\" (UID: \"b0ebf29d-d040-4829-912f-e8bd998cf0da\") " Oct 09 15:34:44 crc kubenswrapper[4719]: I1009 15:34:44.622449 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcv59\" (UniqueName: \"kubernetes.io/projected/b0ebf29d-d040-4829-912f-e8bd998cf0da-kube-api-access-pcv59\") pod \"b0ebf29d-d040-4829-912f-e8bd998cf0da\" (UID: \"b0ebf29d-d040-4829-912f-e8bd998cf0da\") " Oct 09 15:34:44 crc kubenswrapper[4719]: I1009 15:34:44.622481 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0ebf29d-d040-4829-912f-e8bd998cf0da-config\") pod \"b0ebf29d-d040-4829-912f-e8bd998cf0da\" (UID: \"b0ebf29d-d040-4829-912f-e8bd998cf0da\") " Oct 09 15:34:44 crc kubenswrapper[4719]: I1009 15:34:44.622564 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0ebf29d-d040-4829-912f-e8bd998cf0da-dns-svc\") pod \"b0ebf29d-d040-4829-912f-e8bd998cf0da\" (UID: \"b0ebf29d-d040-4829-912f-e8bd998cf0da\") " Oct 09 15:34:44 crc kubenswrapper[4719]: I1009 15:34:44.632662 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/b0ebf29d-d040-4829-912f-e8bd998cf0da-kube-api-access-pcv59" (OuterVolumeSpecName: "kube-api-access-pcv59") pod "b0ebf29d-d040-4829-912f-e8bd998cf0da" (UID: "b0ebf29d-d040-4829-912f-e8bd998cf0da"). InnerVolumeSpecName "kube-api-access-pcv59". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:34:44 crc kubenswrapper[4719]: I1009 15:34:44.662691 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0ebf29d-d040-4829-912f-e8bd998cf0da-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b0ebf29d-d040-4829-912f-e8bd998cf0da" (UID: "b0ebf29d-d040-4829-912f-e8bd998cf0da"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:34:44 crc kubenswrapper[4719]: I1009 15:34:44.663752 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0ebf29d-d040-4829-912f-e8bd998cf0da-config" (OuterVolumeSpecName: "config") pod "b0ebf29d-d040-4829-912f-e8bd998cf0da" (UID: "b0ebf29d-d040-4829-912f-e8bd998cf0da"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:34:44 crc kubenswrapper[4719]: I1009 15:34:44.664060 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0ebf29d-d040-4829-912f-e8bd998cf0da-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b0ebf29d-d040-4829-912f-e8bd998cf0da" (UID: "b0ebf29d-d040-4829-912f-e8bd998cf0da"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:34:44 crc kubenswrapper[4719]: I1009 15:34:44.671503 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0ebf29d-d040-4829-912f-e8bd998cf0da-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b0ebf29d-d040-4829-912f-e8bd998cf0da" (UID: "b0ebf29d-d040-4829-912f-e8bd998cf0da"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:34:44 crc kubenswrapper[4719]: I1009 15:34:44.724336 4719 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0ebf29d-d040-4829-912f-e8bd998cf0da-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 15:34:44 crc kubenswrapper[4719]: I1009 15:34:44.724597 4719 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0ebf29d-d040-4829-912f-e8bd998cf0da-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 15:34:44 crc kubenswrapper[4719]: I1009 15:34:44.724859 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcv59\" (UniqueName: \"kubernetes.io/projected/b0ebf29d-d040-4829-912f-e8bd998cf0da-kube-api-access-pcv59\") on node \"crc\" DevicePath \"\"" Oct 09 15:34:44 crc kubenswrapper[4719]: I1009 15:34:44.724925 4719 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0ebf29d-d040-4829-912f-e8bd998cf0da-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:34:44 crc kubenswrapper[4719]: I1009 15:34:44.724994 4719 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0ebf29d-d040-4829-912f-e8bd998cf0da-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 15:34:45 crc kubenswrapper[4719]: I1009 15:34:45.487969 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58866ff6f5-kd5d5" event={"ID":"b0ebf29d-d040-4829-912f-e8bd998cf0da","Type":"ContainerDied","Data":"57d522a6800e2de7394201b9e5041ca3f1d6e6e5ff0cba5d678bf00bc4216545"} Oct 09 15:34:45 crc kubenswrapper[4719]: I1009 15:34:45.488029 4719 scope.go:117] "RemoveContainer" containerID="a71c7efe995b43ccf47c9b1b6657313bac942a4bd854c2ad46fca8f6d4d03052" Oct 09 15:34:45 crc kubenswrapper[4719]: I1009 15:34:45.488055 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58866ff6f5-kd5d5" Oct 09 15:34:45 crc kubenswrapper[4719]: I1009 15:34:45.510457 4719 scope.go:117] "RemoveContainer" containerID="988845eef7edf5550c80887bed9783830e1acd5e471e50189a272b90f65a33a4" Oct 09 15:34:45 crc kubenswrapper[4719]: I1009 15:34:45.515510 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58866ff6f5-kd5d5"] Oct 09 15:34:45 crc kubenswrapper[4719]: I1009 15:34:45.522951 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58866ff6f5-kd5d5"] Oct 09 15:34:47 crc kubenswrapper[4719]: I1009 15:34:47.173505 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0ebf29d-d040-4829-912f-e8bd998cf0da" path="/var/lib/kubelet/pods/b0ebf29d-d040-4829-912f-e8bd998cf0da/volumes" Oct 09 15:34:47 crc kubenswrapper[4719]: I1009 15:34:47.194290 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="c8f5a6f9-5554-485d-9aee-47449402e37b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.108:5671: connect: connection refused" Oct 09 15:34:47 crc kubenswrapper[4719]: I1009 15:34:47.564662 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="d3a820d9-3c13-47ec-a39e-dea4d60b7536" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.109:5671: connect: connection refused" Oct 09 15:34:47 crc kubenswrapper[4719]: I1009 15:34:47.872544 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-notifications-server-0" Oct 09 15:34:48 crc kubenswrapper[4719]: I1009 15:34:48.049230 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-ngwlj"] Oct 09 15:34:48 crc kubenswrapper[4719]: E1009 15:34:48.049650 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0ebf29d-d040-4829-912f-e8bd998cf0da" containerName="dnsmasq-dns" 
Oct 09 15:34:48 crc kubenswrapper[4719]: I1009 15:34:48.049675 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0ebf29d-d040-4829-912f-e8bd998cf0da" containerName="dnsmasq-dns" Oct 09 15:34:48 crc kubenswrapper[4719]: E1009 15:34:48.049705 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0ebf29d-d040-4829-912f-e8bd998cf0da" containerName="init" Oct 09 15:34:48 crc kubenswrapper[4719]: I1009 15:34:48.049716 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0ebf29d-d040-4829-912f-e8bd998cf0da" containerName="init" Oct 09 15:34:48 crc kubenswrapper[4719]: E1009 15:34:48.049740 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a14bea55-0b69-471d-8bf2-86963f482288" containerName="mariadb-account-create" Oct 09 15:34:48 crc kubenswrapper[4719]: I1009 15:34:48.049748 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="a14bea55-0b69-471d-8bf2-86963f482288" containerName="mariadb-account-create" Oct 09 15:34:48 crc kubenswrapper[4719]: I1009 15:34:48.049959 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0ebf29d-d040-4829-912f-e8bd998cf0da" containerName="dnsmasq-dns" Oct 09 15:34:48 crc kubenswrapper[4719]: I1009 15:34:48.049997 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="a14bea55-0b69-471d-8bf2-86963f482288" containerName="mariadb-account-create" Oct 09 15:34:48 crc kubenswrapper[4719]: I1009 15:34:48.050796 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-ngwlj" Oct 09 15:34:48 crc kubenswrapper[4719]: I1009 15:34:48.071035 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-ngwlj"] Oct 09 15:34:48 crc kubenswrapper[4719]: I1009 15:34:48.205381 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd4t4\" (UniqueName: \"kubernetes.io/projected/8cfd57d6-febf-4fcc-878e-aba5c948348a-kube-api-access-vd4t4\") pod \"glance-db-create-ngwlj\" (UID: \"8cfd57d6-febf-4fcc-878e-aba5c948348a\") " pod="openstack/glance-db-create-ngwlj" Oct 09 15:34:48 crc kubenswrapper[4719]: I1009 15:34:48.309081 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd4t4\" (UniqueName: \"kubernetes.io/projected/8cfd57d6-febf-4fcc-878e-aba5c948348a-kube-api-access-vd4t4\") pod \"glance-db-create-ngwlj\" (UID: \"8cfd57d6-febf-4fcc-878e-aba5c948348a\") " pod="openstack/glance-db-create-ngwlj" Oct 09 15:34:48 crc kubenswrapper[4719]: I1009 15:34:48.327079 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd4t4\" (UniqueName: \"kubernetes.io/projected/8cfd57d6-febf-4fcc-878e-aba5c948348a-kube-api-access-vd4t4\") pod \"glance-db-create-ngwlj\" (UID: \"8cfd57d6-febf-4fcc-878e-aba5c948348a\") " pod="openstack/glance-db-create-ngwlj" Oct 09 15:34:48 crc kubenswrapper[4719]: I1009 15:34:48.410666 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-ngwlj" Oct 09 15:34:48 crc kubenswrapper[4719]: I1009 15:34:48.879523 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-ngwlj"] Oct 09 15:34:48 crc kubenswrapper[4719]: W1009 15:34:48.898241 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cfd57d6_febf_4fcc_878e_aba5c948348a.slice/crio-4dae2c2b160ebbd7c7b202aaf1b563e057b79a02a123f0cf2f074362597e3805 WatchSource:0}: Error finding container 4dae2c2b160ebbd7c7b202aaf1b563e057b79a02a123f0cf2f074362597e3805: Status 404 returned error can't find the container with id 4dae2c2b160ebbd7c7b202aaf1b563e057b79a02a123f0cf2f074362597e3805 Oct 09 15:34:49 crc kubenswrapper[4719]: I1009 15:34:49.542198 4719 generic.go:334] "Generic (PLEG): container finished" podID="8cfd57d6-febf-4fcc-878e-aba5c948348a" containerID="ac9b9a20c18c91e91e4911f84dd9d8a0f4472764bf3925c08d1508d968d868d9" exitCode=0 Oct 09 15:34:49 crc kubenswrapper[4719]: I1009 15:34:49.542285 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ngwlj" event={"ID":"8cfd57d6-febf-4fcc-878e-aba5c948348a","Type":"ContainerDied","Data":"ac9b9a20c18c91e91e4911f84dd9d8a0f4472764bf3925c08d1508d968d868d9"} Oct 09 15:34:49 crc kubenswrapper[4719]: I1009 15:34:49.542523 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ngwlj" event={"ID":"8cfd57d6-febf-4fcc-878e-aba5c948348a","Type":"ContainerStarted","Data":"4dae2c2b160ebbd7c7b202aaf1b563e057b79a02a123f0cf2f074362597e3805"} Oct 09 15:34:50 crc kubenswrapper[4719]: I1009 15:34:50.877233 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-ngwlj" Oct 09 15:34:51 crc kubenswrapper[4719]: I1009 15:34:51.048782 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vd4t4\" (UniqueName: \"kubernetes.io/projected/8cfd57d6-febf-4fcc-878e-aba5c948348a-kube-api-access-vd4t4\") pod \"8cfd57d6-febf-4fcc-878e-aba5c948348a\" (UID: \"8cfd57d6-febf-4fcc-878e-aba5c948348a\") " Oct 09 15:34:51 crc kubenswrapper[4719]: I1009 15:34:51.058324 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cfd57d6-febf-4fcc-878e-aba5c948348a-kube-api-access-vd4t4" (OuterVolumeSpecName: "kube-api-access-vd4t4") pod "8cfd57d6-febf-4fcc-878e-aba5c948348a" (UID: "8cfd57d6-febf-4fcc-878e-aba5c948348a"). InnerVolumeSpecName "kube-api-access-vd4t4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:34:51 crc kubenswrapper[4719]: I1009 15:34:51.151110 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vd4t4\" (UniqueName: \"kubernetes.io/projected/8cfd57d6-febf-4fcc-878e-aba5c948348a-kube-api-access-vd4t4\") on node \"crc\" DevicePath \"\"" Oct 09 15:34:51 crc kubenswrapper[4719]: I1009 15:34:51.566473 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ngwlj" event={"ID":"8cfd57d6-febf-4fcc-878e-aba5c948348a","Type":"ContainerDied","Data":"4dae2c2b160ebbd7c7b202aaf1b563e057b79a02a123f0cf2f074362597e3805"} Oct 09 15:34:51 crc kubenswrapper[4719]: I1009 15:34:51.566530 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4dae2c2b160ebbd7c7b202aaf1b563e057b79a02a123f0cf2f074362597e3805" Oct 09 15:34:51 crc kubenswrapper[4719]: I1009 15:34:51.566603 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-ngwlj" Oct 09 15:34:57 crc kubenswrapper[4719]: I1009 15:34:57.195501 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:34:57 crc kubenswrapper[4719]: I1009 15:34:57.566856 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 09 15:34:58 crc kubenswrapper[4719]: I1009 15:34:58.102983 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-ac57-account-create-g6pg7"] Oct 09 15:34:58 crc kubenswrapper[4719]: E1009 15:34:58.103600 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cfd57d6-febf-4fcc-878e-aba5c948348a" containerName="mariadb-database-create" Oct 09 15:34:58 crc kubenswrapper[4719]: I1009 15:34:58.103615 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cfd57d6-febf-4fcc-878e-aba5c948348a" containerName="mariadb-database-create" Oct 09 15:34:58 crc kubenswrapper[4719]: I1009 15:34:58.103819 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cfd57d6-febf-4fcc-878e-aba5c948348a" containerName="mariadb-database-create" Oct 09 15:34:58 crc kubenswrapper[4719]: I1009 15:34:58.104328 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-ac57-account-create-g6pg7" Oct 09 15:34:58 crc kubenswrapper[4719]: I1009 15:34:58.113131 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-ac57-account-create-g6pg7"] Oct 09 15:34:58 crc kubenswrapper[4719]: I1009 15:34:58.115600 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 09 15:34:58 crc kubenswrapper[4719]: I1009 15:34:58.264719 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j4s7\" (UniqueName: \"kubernetes.io/projected/9ee9b71f-3824-456a-9466-6b39fb613885-kube-api-access-4j4s7\") pod \"glance-ac57-account-create-g6pg7\" (UID: \"9ee9b71f-3824-456a-9466-6b39fb613885\") " pod="openstack/glance-ac57-account-create-g6pg7" Oct 09 15:34:58 crc kubenswrapper[4719]: I1009 15:34:58.367253 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j4s7\" (UniqueName: \"kubernetes.io/projected/9ee9b71f-3824-456a-9466-6b39fb613885-kube-api-access-4j4s7\") pod \"glance-ac57-account-create-g6pg7\" (UID: \"9ee9b71f-3824-456a-9466-6b39fb613885\") " pod="openstack/glance-ac57-account-create-g6pg7" Oct 09 15:34:58 crc kubenswrapper[4719]: I1009 15:34:58.403076 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j4s7\" (UniqueName: \"kubernetes.io/projected/9ee9b71f-3824-456a-9466-6b39fb613885-kube-api-access-4j4s7\") pod \"glance-ac57-account-create-g6pg7\" (UID: \"9ee9b71f-3824-456a-9466-6b39fb613885\") " pod="openstack/glance-ac57-account-create-g6pg7" Oct 09 15:34:58 crc kubenswrapper[4719]: I1009 15:34:58.424935 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-ac57-account-create-g6pg7" Oct 09 15:34:58 crc kubenswrapper[4719]: I1009 15:34:58.697485 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-t5rw4"] Oct 09 15:34:58 crc kubenswrapper[4719]: I1009 15:34:58.699246 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-t5rw4" Oct 09 15:34:58 crc kubenswrapper[4719]: I1009 15:34:58.703045 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-t5rw4"] Oct 09 15:34:58 crc kubenswrapper[4719]: I1009 15:34:58.810693 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-hcncf"] Oct 09 15:34:58 crc kubenswrapper[4719]: I1009 15:34:58.812701 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-hcncf" Oct 09 15:34:58 crc kubenswrapper[4719]: I1009 15:34:58.813234 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-hcncf"] Oct 09 15:34:58 crc kubenswrapper[4719]: I1009 15:34:58.823554 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-sync-qs479"] Oct 09 15:34:58 crc kubenswrapper[4719]: I1009 15:34:58.831990 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-qs479" Oct 09 15:34:58 crc kubenswrapper[4719]: I1009 15:34:58.834299 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-sjklv" Oct 09 15:34:58 crc kubenswrapper[4719]: I1009 15:34:58.834832 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-config-data" Oct 09 15:34:58 crc kubenswrapper[4719]: I1009 15:34:58.859608 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-qs479"] Oct 09 15:34:58 crc kubenswrapper[4719]: I1009 15:34:58.879232 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-whtzg"] Oct 09 15:34:58 crc kubenswrapper[4719]: I1009 15:34:58.880188 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggz6b\" (UniqueName: \"kubernetes.io/projected/527d79a8-0ef1-485b-94a1-eff7ee279a5a-kube-api-access-ggz6b\") pod \"barbican-db-create-t5rw4\" (UID: \"527d79a8-0ef1-485b-94a1-eff7ee279a5a\") " pod="openstack/barbican-db-create-t5rw4" Oct 09 15:34:58 crc kubenswrapper[4719]: I1009 15:34:58.880583 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-whtzg" Oct 09 15:34:58 crc kubenswrapper[4719]: I1009 15:34:58.884183 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 09 15:34:58 crc kubenswrapper[4719]: I1009 15:34:58.884406 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 09 15:34:58 crc kubenswrapper[4719]: I1009 15:34:58.884611 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-2vsx5" Oct 09 15:34:58 crc kubenswrapper[4719]: I1009 15:34:58.884753 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 09 15:34:58 crc kubenswrapper[4719]: I1009 15:34:58.914707 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-whtzg"] Oct 09 15:34:58 crc kubenswrapper[4719]: I1009 15:34:58.982258 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/352eb1e1-d9bb-4184-9512-7cb1e9787edb-config-data\") pod \"watcher-db-sync-qs479\" (UID: \"352eb1e1-d9bb-4184-9512-7cb1e9787edb\") " pod="openstack/watcher-db-sync-qs479" Oct 09 15:34:58 crc kubenswrapper[4719]: I1009 15:34:58.982336 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6gw6\" (UniqueName: \"kubernetes.io/projected/352eb1e1-d9bb-4184-9512-7cb1e9787edb-kube-api-access-x6gw6\") pod \"watcher-db-sync-qs479\" (UID: \"352eb1e1-d9bb-4184-9512-7cb1e9787edb\") " pod="openstack/watcher-db-sync-qs479" Oct 09 15:34:58 crc kubenswrapper[4719]: I1009 15:34:58.982492 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/352eb1e1-d9bb-4184-9512-7cb1e9787edb-combined-ca-bundle\") pod \"watcher-db-sync-qs479\" (UID: 
\"352eb1e1-d9bb-4184-9512-7cb1e9787edb\") " pod="openstack/watcher-db-sync-qs479" Oct 09 15:34:58 crc kubenswrapper[4719]: I1009 15:34:58.982629 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/352eb1e1-d9bb-4184-9512-7cb1e9787edb-db-sync-config-data\") pod \"watcher-db-sync-qs479\" (UID: \"352eb1e1-d9bb-4184-9512-7cb1e9787edb\") " pod="openstack/watcher-db-sync-qs479" Oct 09 15:34:58 crc kubenswrapper[4719]: I1009 15:34:58.982765 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svr9k\" (UniqueName: \"kubernetes.io/projected/f560e5ac-9e54-4a07-a0d4-88a94c2004c5-kube-api-access-svr9k\") pod \"cinder-db-create-hcncf\" (UID: \"f560e5ac-9e54-4a07-a0d4-88a94c2004c5\") " pod="openstack/cinder-db-create-hcncf" Oct 09 15:34:58 crc kubenswrapper[4719]: I1009 15:34:58.982818 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1142ae88-4e1f-4957-b735-23d64604712f-combined-ca-bundle\") pod \"keystone-db-sync-whtzg\" (UID: \"1142ae88-4e1f-4957-b735-23d64604712f\") " pod="openstack/keystone-db-sync-whtzg" Oct 09 15:34:58 crc kubenswrapper[4719]: I1009 15:34:58.982864 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggz6b\" (UniqueName: \"kubernetes.io/projected/527d79a8-0ef1-485b-94a1-eff7ee279a5a-kube-api-access-ggz6b\") pod \"barbican-db-create-t5rw4\" (UID: \"527d79a8-0ef1-485b-94a1-eff7ee279a5a\") " pod="openstack/barbican-db-create-t5rw4" Oct 09 15:34:58 crc kubenswrapper[4719]: I1009 15:34:58.982921 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr66s\" (UniqueName: \"kubernetes.io/projected/1142ae88-4e1f-4957-b735-23d64604712f-kube-api-access-jr66s\") pod 
\"keystone-db-sync-whtzg\" (UID: \"1142ae88-4e1f-4957-b735-23d64604712f\") " pod="openstack/keystone-db-sync-whtzg" Oct 09 15:34:58 crc kubenswrapper[4719]: I1009 15:34:58.983037 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1142ae88-4e1f-4957-b735-23d64604712f-config-data\") pod \"keystone-db-sync-whtzg\" (UID: \"1142ae88-4e1f-4957-b735-23d64604712f\") " pod="openstack/keystone-db-sync-whtzg" Oct 09 15:34:58 crc kubenswrapper[4719]: I1009 15:34:58.991510 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-ac57-account-create-g6pg7"] Oct 09 15:34:59 crc kubenswrapper[4719]: I1009 15:34:59.027778 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggz6b\" (UniqueName: \"kubernetes.io/projected/527d79a8-0ef1-485b-94a1-eff7ee279a5a-kube-api-access-ggz6b\") pod \"barbican-db-create-t5rw4\" (UID: \"527d79a8-0ef1-485b-94a1-eff7ee279a5a\") " pod="openstack/barbican-db-create-t5rw4" Oct 09 15:34:59 crc kubenswrapper[4719]: I1009 15:34:59.040500 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-t5rw4" Oct 09 15:34:59 crc kubenswrapper[4719]: I1009 15:34:59.055125 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-l5gd9"] Oct 09 15:34:59 crc kubenswrapper[4719]: I1009 15:34:59.056139 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-l5gd9" Oct 09 15:34:59 crc kubenswrapper[4719]: I1009 15:34:59.070957 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-l5gd9"] Oct 09 15:34:59 crc kubenswrapper[4719]: I1009 15:34:59.090059 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svr9k\" (UniqueName: \"kubernetes.io/projected/f560e5ac-9e54-4a07-a0d4-88a94c2004c5-kube-api-access-svr9k\") pod \"cinder-db-create-hcncf\" (UID: \"f560e5ac-9e54-4a07-a0d4-88a94c2004c5\") " pod="openstack/cinder-db-create-hcncf" Oct 09 15:34:59 crc kubenswrapper[4719]: I1009 15:34:59.090095 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1142ae88-4e1f-4957-b735-23d64604712f-combined-ca-bundle\") pod \"keystone-db-sync-whtzg\" (UID: \"1142ae88-4e1f-4957-b735-23d64604712f\") " pod="openstack/keystone-db-sync-whtzg" Oct 09 15:34:59 crc kubenswrapper[4719]: I1009 15:34:59.090127 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr66s\" (UniqueName: \"kubernetes.io/projected/1142ae88-4e1f-4957-b735-23d64604712f-kube-api-access-jr66s\") pod \"keystone-db-sync-whtzg\" (UID: \"1142ae88-4e1f-4957-b735-23d64604712f\") " pod="openstack/keystone-db-sync-whtzg" Oct 09 15:34:59 crc kubenswrapper[4719]: I1009 15:34:59.090160 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1142ae88-4e1f-4957-b735-23d64604712f-config-data\") pod \"keystone-db-sync-whtzg\" (UID: \"1142ae88-4e1f-4957-b735-23d64604712f\") " pod="openstack/keystone-db-sync-whtzg" Oct 09 15:34:59 crc kubenswrapper[4719]: I1009 15:34:59.090196 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/352eb1e1-d9bb-4184-9512-7cb1e9787edb-config-data\") pod \"watcher-db-sync-qs479\" (UID: \"352eb1e1-d9bb-4184-9512-7cb1e9787edb\") " pod="openstack/watcher-db-sync-qs479" Oct 09 15:34:59 crc kubenswrapper[4719]: I1009 15:34:59.090232 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6gw6\" (UniqueName: \"kubernetes.io/projected/352eb1e1-d9bb-4184-9512-7cb1e9787edb-kube-api-access-x6gw6\") pod \"watcher-db-sync-qs479\" (UID: \"352eb1e1-d9bb-4184-9512-7cb1e9787edb\") " pod="openstack/watcher-db-sync-qs479" Oct 09 15:34:59 crc kubenswrapper[4719]: I1009 15:34:59.090267 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/352eb1e1-d9bb-4184-9512-7cb1e9787edb-combined-ca-bundle\") pod \"watcher-db-sync-qs479\" (UID: \"352eb1e1-d9bb-4184-9512-7cb1e9787edb\") " pod="openstack/watcher-db-sync-qs479" Oct 09 15:34:59 crc kubenswrapper[4719]: I1009 15:34:59.090286 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/352eb1e1-d9bb-4184-9512-7cb1e9787edb-db-sync-config-data\") pod \"watcher-db-sync-qs479\" (UID: \"352eb1e1-d9bb-4184-9512-7cb1e9787edb\") " pod="openstack/watcher-db-sync-qs479" Oct 09 15:34:59 crc kubenswrapper[4719]: I1009 15:34:59.094806 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/352eb1e1-d9bb-4184-9512-7cb1e9787edb-config-data\") pod \"watcher-db-sync-qs479\" (UID: \"352eb1e1-d9bb-4184-9512-7cb1e9787edb\") " pod="openstack/watcher-db-sync-qs479" Oct 09 15:34:59 crc kubenswrapper[4719]: I1009 15:34:59.095286 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/352eb1e1-d9bb-4184-9512-7cb1e9787edb-combined-ca-bundle\") pod \"watcher-db-sync-qs479\" (UID: 
\"352eb1e1-d9bb-4184-9512-7cb1e9787edb\") " pod="openstack/watcher-db-sync-qs479" Oct 09 15:34:59 crc kubenswrapper[4719]: I1009 15:34:59.098367 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1142ae88-4e1f-4957-b735-23d64604712f-config-data\") pod \"keystone-db-sync-whtzg\" (UID: \"1142ae88-4e1f-4957-b735-23d64604712f\") " pod="openstack/keystone-db-sync-whtzg" Oct 09 15:34:59 crc kubenswrapper[4719]: I1009 15:34:59.105035 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/352eb1e1-d9bb-4184-9512-7cb1e9787edb-db-sync-config-data\") pod \"watcher-db-sync-qs479\" (UID: \"352eb1e1-d9bb-4184-9512-7cb1e9787edb\") " pod="openstack/watcher-db-sync-qs479" Oct 09 15:34:59 crc kubenswrapper[4719]: I1009 15:34:59.105102 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1142ae88-4e1f-4957-b735-23d64604712f-combined-ca-bundle\") pod \"keystone-db-sync-whtzg\" (UID: \"1142ae88-4e1f-4957-b735-23d64604712f\") " pod="openstack/keystone-db-sync-whtzg" Oct 09 15:34:59 crc kubenswrapper[4719]: I1009 15:34:59.122697 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svr9k\" (UniqueName: \"kubernetes.io/projected/f560e5ac-9e54-4a07-a0d4-88a94c2004c5-kube-api-access-svr9k\") pod \"cinder-db-create-hcncf\" (UID: \"f560e5ac-9e54-4a07-a0d4-88a94c2004c5\") " pod="openstack/cinder-db-create-hcncf" Oct 09 15:34:59 crc kubenswrapper[4719]: I1009 15:34:59.124938 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr66s\" (UniqueName: \"kubernetes.io/projected/1142ae88-4e1f-4957-b735-23d64604712f-kube-api-access-jr66s\") pod \"keystone-db-sync-whtzg\" (UID: \"1142ae88-4e1f-4957-b735-23d64604712f\") " pod="openstack/keystone-db-sync-whtzg" Oct 09 15:34:59 crc kubenswrapper[4719]: 
I1009 15:34:59.131783 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6gw6\" (UniqueName: \"kubernetes.io/projected/352eb1e1-d9bb-4184-9512-7cb1e9787edb-kube-api-access-x6gw6\") pod \"watcher-db-sync-qs479\" (UID: \"352eb1e1-d9bb-4184-9512-7cb1e9787edb\") " pod="openstack/watcher-db-sync-qs479" Oct 09 15:34:59 crc kubenswrapper[4719]: I1009 15:34:59.140843 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-hcncf" Oct 09 15:34:59 crc kubenswrapper[4719]: I1009 15:34:59.159571 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-qs479" Oct 09 15:34:59 crc kubenswrapper[4719]: I1009 15:34:59.212082 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47cxz\" (UniqueName: \"kubernetes.io/projected/b475a082-9d75-4284-8e09-35f2a96501b9-kube-api-access-47cxz\") pod \"neutron-db-create-l5gd9\" (UID: \"b475a082-9d75-4284-8e09-35f2a96501b9\") " pod="openstack/neutron-db-create-l5gd9" Oct 09 15:34:59 crc kubenswrapper[4719]: I1009 15:34:59.213308 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-whtzg" Oct 09 15:34:59 crc kubenswrapper[4719]: I1009 15:34:59.317121 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47cxz\" (UniqueName: \"kubernetes.io/projected/b475a082-9d75-4284-8e09-35f2a96501b9-kube-api-access-47cxz\") pod \"neutron-db-create-l5gd9\" (UID: \"b475a082-9d75-4284-8e09-35f2a96501b9\") " pod="openstack/neutron-db-create-l5gd9" Oct 09 15:34:59 crc kubenswrapper[4719]: I1009 15:34:59.340201 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47cxz\" (UniqueName: \"kubernetes.io/projected/b475a082-9d75-4284-8e09-35f2a96501b9-kube-api-access-47cxz\") pod \"neutron-db-create-l5gd9\" (UID: \"b475a082-9d75-4284-8e09-35f2a96501b9\") " pod="openstack/neutron-db-create-l5gd9" Oct 09 15:34:59 crc kubenswrapper[4719]: I1009 15:34:59.419027 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-l5gd9" Oct 09 15:34:59 crc kubenswrapper[4719]: I1009 15:34:59.590570 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-t5rw4"] Oct 09 15:34:59 crc kubenswrapper[4719]: W1009 15:34:59.598770 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod527d79a8_0ef1_485b_94a1_eff7ee279a5a.slice/crio-48a49dbf35764f1ed07e4891c2b990227bb19d687cc9500e653ee33f73414c37 WatchSource:0}: Error finding container 48a49dbf35764f1ed07e4891c2b990227bb19d687cc9500e653ee33f73414c37: Status 404 returned error can't find the container with id 48a49dbf35764f1ed07e4891c2b990227bb19d687cc9500e653ee33f73414c37 Oct 09 15:34:59 crc kubenswrapper[4719]: I1009 15:34:59.635908 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-t5rw4" 
event={"ID":"527d79a8-0ef1-485b-94a1-eff7ee279a5a","Type":"ContainerStarted","Data":"48a49dbf35764f1ed07e4891c2b990227bb19d687cc9500e653ee33f73414c37"} Oct 09 15:34:59 crc kubenswrapper[4719]: I1009 15:34:59.638661 4719 generic.go:334] "Generic (PLEG): container finished" podID="9ee9b71f-3824-456a-9466-6b39fb613885" containerID="f9c8b01f404ca8805122237e75652c2b8a571000f16a96338eb4d76b3aae1182" exitCode=0 Oct 09 15:34:59 crc kubenswrapper[4719]: I1009 15:34:59.638718 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-ac57-account-create-g6pg7" event={"ID":"9ee9b71f-3824-456a-9466-6b39fb613885","Type":"ContainerDied","Data":"f9c8b01f404ca8805122237e75652c2b8a571000f16a96338eb4d76b3aae1182"} Oct 09 15:34:59 crc kubenswrapper[4719]: I1009 15:34:59.638749 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-ac57-account-create-g6pg7" event={"ID":"9ee9b71f-3824-456a-9466-6b39fb613885","Type":"ContainerStarted","Data":"952d14d1980a6020bf2940d35bb5cc3b0c376772ce3c6d0ff3c7afbdc694de1b"} Oct 09 15:34:59 crc kubenswrapper[4719]: I1009 15:34:59.820528 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-whtzg"] Oct 09 15:34:59 crc kubenswrapper[4719]: W1009 15:34:59.822472 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1142ae88_4e1f_4957_b735_23d64604712f.slice/crio-a0e304d424e56d0ff397e262ee1d14bf08fd7563e1388617ccbb39381e02adfd WatchSource:0}: Error finding container a0e304d424e56d0ff397e262ee1d14bf08fd7563e1388617ccbb39381e02adfd: Status 404 returned error can't find the container with id a0e304d424e56d0ff397e262ee1d14bf08fd7563e1388617ccbb39381e02adfd Oct 09 15:34:59 crc kubenswrapper[4719]: I1009 15:34:59.999519 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-qs479"] Oct 09 15:35:00 crc kubenswrapper[4719]: I1009 15:35:00.022503 4719 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/cinder-db-create-hcncf"] Oct 09 15:35:00 crc kubenswrapper[4719]: W1009 15:35:00.052745 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf560e5ac_9e54_4a07_a0d4_88a94c2004c5.slice/crio-191c4f4ba48adbff7daed0109854c376796ae4636cc5829ccf96b78c03641928 WatchSource:0}: Error finding container 191c4f4ba48adbff7daed0109854c376796ae4636cc5829ccf96b78c03641928: Status 404 returned error can't find the container with id 191c4f4ba48adbff7daed0109854c376796ae4636cc5829ccf96b78c03641928 Oct 09 15:35:00 crc kubenswrapper[4719]: I1009 15:35:00.145943 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-l5gd9"] Oct 09 15:35:00 crc kubenswrapper[4719]: W1009 15:35:00.156502 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb475a082_9d75_4284_8e09_35f2a96501b9.slice/crio-947d9fce5c90db28b66c9b197a4e6c9e4af02b445940525920c1c937e40e1542 WatchSource:0}: Error finding container 947d9fce5c90db28b66c9b197a4e6c9e4af02b445940525920c1c937e40e1542: Status 404 returned error can't find the container with id 947d9fce5c90db28b66c9b197a4e6c9e4af02b445940525920c1c937e40e1542 Oct 09 15:35:00 crc kubenswrapper[4719]: I1009 15:35:00.663416 4719 generic.go:334] "Generic (PLEG): container finished" podID="b475a082-9d75-4284-8e09-35f2a96501b9" containerID="40ff740b355745cf960a303461ef82e422aacf339afb07c86ac099b6c958acff" exitCode=0 Oct 09 15:35:00 crc kubenswrapper[4719]: I1009 15:35:00.663485 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-l5gd9" event={"ID":"b475a082-9d75-4284-8e09-35f2a96501b9","Type":"ContainerDied","Data":"40ff740b355745cf960a303461ef82e422aacf339afb07c86ac099b6c958acff"} Oct 09 15:35:00 crc kubenswrapper[4719]: I1009 15:35:00.663508 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-db-create-l5gd9" event={"ID":"b475a082-9d75-4284-8e09-35f2a96501b9","Type":"ContainerStarted","Data":"947d9fce5c90db28b66c9b197a4e6c9e4af02b445940525920c1c937e40e1542"} Oct 09 15:35:00 crc kubenswrapper[4719]: I1009 15:35:00.665153 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-qs479" event={"ID":"352eb1e1-d9bb-4184-9512-7cb1e9787edb","Type":"ContainerStarted","Data":"9ac54d5b90796387346d368ba178c1f43cac34b2e9be51aeb365e1f553681936"} Oct 09 15:35:00 crc kubenswrapper[4719]: I1009 15:35:00.666896 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-whtzg" event={"ID":"1142ae88-4e1f-4957-b735-23d64604712f","Type":"ContainerStarted","Data":"a0e304d424e56d0ff397e262ee1d14bf08fd7563e1388617ccbb39381e02adfd"} Oct 09 15:35:00 crc kubenswrapper[4719]: I1009 15:35:00.668585 4719 generic.go:334] "Generic (PLEG): container finished" podID="f560e5ac-9e54-4a07-a0d4-88a94c2004c5" containerID="fc0921523ad54c212ee320465523c436b11ac7952117fe7ad106462b5529fca6" exitCode=0 Oct 09 15:35:00 crc kubenswrapper[4719]: I1009 15:35:00.668645 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-hcncf" event={"ID":"f560e5ac-9e54-4a07-a0d4-88a94c2004c5","Type":"ContainerDied","Data":"fc0921523ad54c212ee320465523c436b11ac7952117fe7ad106462b5529fca6"} Oct 09 15:35:00 crc kubenswrapper[4719]: I1009 15:35:00.668659 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-hcncf" event={"ID":"f560e5ac-9e54-4a07-a0d4-88a94c2004c5","Type":"ContainerStarted","Data":"191c4f4ba48adbff7daed0109854c376796ae4636cc5829ccf96b78c03641928"} Oct 09 15:35:00 crc kubenswrapper[4719]: I1009 15:35:00.676420 4719 generic.go:334] "Generic (PLEG): container finished" podID="527d79a8-0ef1-485b-94a1-eff7ee279a5a" containerID="de6c6165788ef55db5e7679468882258697386d318be78b26eee189543dcc45e" exitCode=0 Oct 09 15:35:00 crc kubenswrapper[4719]: I1009 15:35:00.676743 
4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-t5rw4" event={"ID":"527d79a8-0ef1-485b-94a1-eff7ee279a5a","Type":"ContainerDied","Data":"de6c6165788ef55db5e7679468882258697386d318be78b26eee189543dcc45e"} Oct 09 15:35:01 crc kubenswrapper[4719]: I1009 15:35:01.121832 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-ac57-account-create-g6pg7" Oct 09 15:35:01 crc kubenswrapper[4719]: I1009 15:35:01.170114 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4j4s7\" (UniqueName: \"kubernetes.io/projected/9ee9b71f-3824-456a-9466-6b39fb613885-kube-api-access-4j4s7\") pod \"9ee9b71f-3824-456a-9466-6b39fb613885\" (UID: \"9ee9b71f-3824-456a-9466-6b39fb613885\") " Oct 09 15:35:01 crc kubenswrapper[4719]: I1009 15:35:01.184618 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ee9b71f-3824-456a-9466-6b39fb613885-kube-api-access-4j4s7" (OuterVolumeSpecName: "kube-api-access-4j4s7") pod "9ee9b71f-3824-456a-9466-6b39fb613885" (UID: "9ee9b71f-3824-456a-9466-6b39fb613885"). InnerVolumeSpecName "kube-api-access-4j4s7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:35:01 crc kubenswrapper[4719]: I1009 15:35:01.275251 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4j4s7\" (UniqueName: \"kubernetes.io/projected/9ee9b71f-3824-456a-9466-6b39fb613885-kube-api-access-4j4s7\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:01 crc kubenswrapper[4719]: I1009 15:35:01.685887 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-ac57-account-create-g6pg7" event={"ID":"9ee9b71f-3824-456a-9466-6b39fb613885","Type":"ContainerDied","Data":"952d14d1980a6020bf2940d35bb5cc3b0c376772ce3c6d0ff3c7afbdc694de1b"} Oct 09 15:35:01 crc kubenswrapper[4719]: I1009 15:35:01.686148 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="952d14d1980a6020bf2940d35bb5cc3b0c376772ce3c6d0ff3c7afbdc694de1b" Oct 09 15:35:01 crc kubenswrapper[4719]: I1009 15:35:01.685959 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-ac57-account-create-g6pg7" Oct 09 15:35:03 crc kubenswrapper[4719]: I1009 15:35:03.336219 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-xs6f4"] Oct 09 15:35:03 crc kubenswrapper[4719]: E1009 15:35:03.337180 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ee9b71f-3824-456a-9466-6b39fb613885" containerName="mariadb-account-create" Oct 09 15:35:03 crc kubenswrapper[4719]: I1009 15:35:03.337199 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ee9b71f-3824-456a-9466-6b39fb613885" containerName="mariadb-account-create" Oct 09 15:35:03 crc kubenswrapper[4719]: I1009 15:35:03.337484 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ee9b71f-3824-456a-9466-6b39fb613885" containerName="mariadb-account-create" Oct 09 15:35:03 crc kubenswrapper[4719]: I1009 15:35:03.338179 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-xs6f4" Oct 09 15:35:03 crc kubenswrapper[4719]: I1009 15:35:03.341139 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 09 15:35:03 crc kubenswrapper[4719]: I1009 15:35:03.341802 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-zsmzp" Oct 09 15:35:03 crc kubenswrapper[4719]: I1009 15:35:03.350664 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-xs6f4"] Oct 09 15:35:03 crc kubenswrapper[4719]: I1009 15:35:03.409139 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/52932375-ade4-4056-a4f8-6758db0df52f-db-sync-config-data\") pod \"glance-db-sync-xs6f4\" (UID: \"52932375-ade4-4056-a4f8-6758db0df52f\") " pod="openstack/glance-db-sync-xs6f4" Oct 09 15:35:03 crc kubenswrapper[4719]: I1009 15:35:03.409209 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wxvq\" (UniqueName: \"kubernetes.io/projected/52932375-ade4-4056-a4f8-6758db0df52f-kube-api-access-8wxvq\") pod \"glance-db-sync-xs6f4\" (UID: \"52932375-ade4-4056-a4f8-6758db0df52f\") " pod="openstack/glance-db-sync-xs6f4" Oct 09 15:35:03 crc kubenswrapper[4719]: I1009 15:35:03.409584 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52932375-ade4-4056-a4f8-6758db0df52f-combined-ca-bundle\") pod \"glance-db-sync-xs6f4\" (UID: \"52932375-ade4-4056-a4f8-6758db0df52f\") " pod="openstack/glance-db-sync-xs6f4" Oct 09 15:35:03 crc kubenswrapper[4719]: I1009 15:35:03.409614 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/52932375-ade4-4056-a4f8-6758db0df52f-config-data\") pod \"glance-db-sync-xs6f4\" (UID: \"52932375-ade4-4056-a4f8-6758db0df52f\") " pod="openstack/glance-db-sync-xs6f4" Oct 09 15:35:03 crc kubenswrapper[4719]: I1009 15:35:03.511278 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/52932375-ade4-4056-a4f8-6758db0df52f-db-sync-config-data\") pod \"glance-db-sync-xs6f4\" (UID: \"52932375-ade4-4056-a4f8-6758db0df52f\") " pod="openstack/glance-db-sync-xs6f4" Oct 09 15:35:03 crc kubenswrapper[4719]: I1009 15:35:03.511329 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wxvq\" (UniqueName: \"kubernetes.io/projected/52932375-ade4-4056-a4f8-6758db0df52f-kube-api-access-8wxvq\") pod \"glance-db-sync-xs6f4\" (UID: \"52932375-ade4-4056-a4f8-6758db0df52f\") " pod="openstack/glance-db-sync-xs6f4" Oct 09 15:35:03 crc kubenswrapper[4719]: I1009 15:35:03.511423 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52932375-ade4-4056-a4f8-6758db0df52f-combined-ca-bundle\") pod \"glance-db-sync-xs6f4\" (UID: \"52932375-ade4-4056-a4f8-6758db0df52f\") " pod="openstack/glance-db-sync-xs6f4" Oct 09 15:35:03 crc kubenswrapper[4719]: I1009 15:35:03.511456 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52932375-ade4-4056-a4f8-6758db0df52f-config-data\") pod \"glance-db-sync-xs6f4\" (UID: \"52932375-ade4-4056-a4f8-6758db0df52f\") " pod="openstack/glance-db-sync-xs6f4" Oct 09 15:35:03 crc kubenswrapper[4719]: I1009 15:35:03.517791 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52932375-ade4-4056-a4f8-6758db0df52f-combined-ca-bundle\") pod \"glance-db-sync-xs6f4\" (UID: 
\"52932375-ade4-4056-a4f8-6758db0df52f\") " pod="openstack/glance-db-sync-xs6f4" Oct 09 15:35:03 crc kubenswrapper[4719]: I1009 15:35:03.518610 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/52932375-ade4-4056-a4f8-6758db0df52f-db-sync-config-data\") pod \"glance-db-sync-xs6f4\" (UID: \"52932375-ade4-4056-a4f8-6758db0df52f\") " pod="openstack/glance-db-sync-xs6f4" Oct 09 15:35:03 crc kubenswrapper[4719]: I1009 15:35:03.527866 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wxvq\" (UniqueName: \"kubernetes.io/projected/52932375-ade4-4056-a4f8-6758db0df52f-kube-api-access-8wxvq\") pod \"glance-db-sync-xs6f4\" (UID: \"52932375-ade4-4056-a4f8-6758db0df52f\") " pod="openstack/glance-db-sync-xs6f4" Oct 09 15:35:03 crc kubenswrapper[4719]: I1009 15:35:03.528146 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52932375-ade4-4056-a4f8-6758db0df52f-config-data\") pod \"glance-db-sync-xs6f4\" (UID: \"52932375-ade4-4056-a4f8-6758db0df52f\") " pod="openstack/glance-db-sync-xs6f4" Oct 09 15:35:03 crc kubenswrapper[4719]: I1009 15:35:03.658326 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-xs6f4" Oct 09 15:35:06 crc kubenswrapper[4719]: I1009 15:35:06.976438 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 15:35:06 crc kubenswrapper[4719]: I1009 15:35:06.977117 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 15:35:07 crc kubenswrapper[4719]: I1009 15:35:07.721197 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-hcncf" Oct 09 15:35:07 crc kubenswrapper[4719]: I1009 15:35:07.734164 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-t5rw4" Oct 09 15:35:07 crc kubenswrapper[4719]: I1009 15:35:07.749643 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-l5gd9" event={"ID":"b475a082-9d75-4284-8e09-35f2a96501b9","Type":"ContainerDied","Data":"947d9fce5c90db28b66c9b197a4e6c9e4af02b445940525920c1c937e40e1542"} Oct 09 15:35:07 crc kubenswrapper[4719]: I1009 15:35:07.749705 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="947d9fce5c90db28b66c9b197a4e6c9e4af02b445940525920c1c937e40e1542" Oct 09 15:35:07 crc kubenswrapper[4719]: I1009 15:35:07.755493 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-l5gd9" Oct 09 15:35:07 crc kubenswrapper[4719]: I1009 15:35:07.756989 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-hcncf" event={"ID":"f560e5ac-9e54-4a07-a0d4-88a94c2004c5","Type":"ContainerDied","Data":"191c4f4ba48adbff7daed0109854c376796ae4636cc5829ccf96b78c03641928"} Oct 09 15:35:07 crc kubenswrapper[4719]: I1009 15:35:07.757035 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="191c4f4ba48adbff7daed0109854c376796ae4636cc5829ccf96b78c03641928" Oct 09 15:35:07 crc kubenswrapper[4719]: I1009 15:35:07.757085 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-hcncf" Oct 09 15:35:07 crc kubenswrapper[4719]: I1009 15:35:07.759015 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-t5rw4" event={"ID":"527d79a8-0ef1-485b-94a1-eff7ee279a5a","Type":"ContainerDied","Data":"48a49dbf35764f1ed07e4891c2b990227bb19d687cc9500e653ee33f73414c37"} Oct 09 15:35:07 crc kubenswrapper[4719]: I1009 15:35:07.759073 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48a49dbf35764f1ed07e4891c2b990227bb19d687cc9500e653ee33f73414c37" Oct 09 15:35:07 crc kubenswrapper[4719]: I1009 15:35:07.759113 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-t5rw4" Oct 09 15:35:07 crc kubenswrapper[4719]: I1009 15:35:07.900166 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svr9k\" (UniqueName: \"kubernetes.io/projected/f560e5ac-9e54-4a07-a0d4-88a94c2004c5-kube-api-access-svr9k\") pod \"f560e5ac-9e54-4a07-a0d4-88a94c2004c5\" (UID: \"f560e5ac-9e54-4a07-a0d4-88a94c2004c5\") " Oct 09 15:35:07 crc kubenswrapper[4719]: I1009 15:35:07.900437 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggz6b\" (UniqueName: \"kubernetes.io/projected/527d79a8-0ef1-485b-94a1-eff7ee279a5a-kube-api-access-ggz6b\") pod \"527d79a8-0ef1-485b-94a1-eff7ee279a5a\" (UID: \"527d79a8-0ef1-485b-94a1-eff7ee279a5a\") " Oct 09 15:35:07 crc kubenswrapper[4719]: I1009 15:35:07.901270 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47cxz\" (UniqueName: \"kubernetes.io/projected/b475a082-9d75-4284-8e09-35f2a96501b9-kube-api-access-47cxz\") pod \"b475a082-9d75-4284-8e09-35f2a96501b9\" (UID: \"b475a082-9d75-4284-8e09-35f2a96501b9\") " Oct 09 15:35:07 crc kubenswrapper[4719]: I1009 15:35:07.906710 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/527d79a8-0ef1-485b-94a1-eff7ee279a5a-kube-api-access-ggz6b" (OuterVolumeSpecName: "kube-api-access-ggz6b") pod "527d79a8-0ef1-485b-94a1-eff7ee279a5a" (UID: "527d79a8-0ef1-485b-94a1-eff7ee279a5a"). InnerVolumeSpecName "kube-api-access-ggz6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:35:07 crc kubenswrapper[4719]: I1009 15:35:07.909532 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f560e5ac-9e54-4a07-a0d4-88a94c2004c5-kube-api-access-svr9k" (OuterVolumeSpecName: "kube-api-access-svr9k") pod "f560e5ac-9e54-4a07-a0d4-88a94c2004c5" (UID: "f560e5ac-9e54-4a07-a0d4-88a94c2004c5"). 
InnerVolumeSpecName "kube-api-access-svr9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:35:07 crc kubenswrapper[4719]: I1009 15:35:07.911788 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b475a082-9d75-4284-8e09-35f2a96501b9-kube-api-access-47cxz" (OuterVolumeSpecName: "kube-api-access-47cxz") pod "b475a082-9d75-4284-8e09-35f2a96501b9" (UID: "b475a082-9d75-4284-8e09-35f2a96501b9"). InnerVolumeSpecName "kube-api-access-47cxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:35:08 crc kubenswrapper[4719]: I1009 15:35:08.002653 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggz6b\" (UniqueName: \"kubernetes.io/projected/527d79a8-0ef1-485b-94a1-eff7ee279a5a-kube-api-access-ggz6b\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:08 crc kubenswrapper[4719]: I1009 15:35:08.002691 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47cxz\" (UniqueName: \"kubernetes.io/projected/b475a082-9d75-4284-8e09-35f2a96501b9-kube-api-access-47cxz\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:08 crc kubenswrapper[4719]: I1009 15:35:08.002700 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svr9k\" (UniqueName: \"kubernetes.io/projected/f560e5ac-9e54-4a07-a0d4-88a94c2004c5-kube-api-access-svr9k\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:08 crc kubenswrapper[4719]: I1009 15:35:08.788683 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-l5gd9" Oct 09 15:35:09 crc kubenswrapper[4719]: I1009 15:35:09.757457 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-xs6f4"] Oct 09 15:35:09 crc kubenswrapper[4719]: I1009 15:35:09.797495 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xs6f4" event={"ID":"52932375-ade4-4056-a4f8-6758db0df52f","Type":"ContainerStarted","Data":"4ebff1e76ee1492af03d71cccd002188c39c10db18fe53076c0c8688abf75a5b"} Oct 09 15:35:09 crc kubenswrapper[4719]: I1009 15:35:09.799058 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-qs479" event={"ID":"352eb1e1-d9bb-4184-9512-7cb1e9787edb","Type":"ContainerStarted","Data":"b94bc5a098d2dcd72a5e07df74fded744b44e2c318c7a38a4459bf7b58c239a7"} Oct 09 15:35:09 crc kubenswrapper[4719]: I1009 15:35:09.800495 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-whtzg" event={"ID":"1142ae88-4e1f-4957-b735-23d64604712f","Type":"ContainerStarted","Data":"530c831ba0dc13bf93ad6bb6d6a41fd0d4ab1867ad66fc150dbd54682573a6cc"} Oct 09 15:35:09 crc kubenswrapper[4719]: I1009 15:35:09.821631 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-sync-qs479" podStartSLOduration=2.442687288 podStartE2EDuration="11.821609737s" podCreationTimestamp="2025-10-09 15:34:58 +0000 UTC" firstStartedPulling="2025-10-09 15:35:00.037586935 +0000 UTC m=+1005.547298220" lastFinishedPulling="2025-10-09 15:35:09.416509364 +0000 UTC m=+1014.926220669" observedRunningTime="2025-10-09 15:35:09.813292813 +0000 UTC m=+1015.323004108" watchObservedRunningTime="2025-10-09 15:35:09.821609737 +0000 UTC m=+1015.331321022" Oct 09 15:35:09 crc kubenswrapper[4719]: I1009 15:35:09.836545 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-whtzg" podStartSLOduration=2.268545688 
podStartE2EDuration="11.83653046s" podCreationTimestamp="2025-10-09 15:34:58 +0000 UTC" firstStartedPulling="2025-10-09 15:34:59.824280863 +0000 UTC m=+1005.333992148" lastFinishedPulling="2025-10-09 15:35:09.392265635 +0000 UTC m=+1014.901976920" observedRunningTime="2025-10-09 15:35:09.83398993 +0000 UTC m=+1015.343701225" watchObservedRunningTime="2025-10-09 15:35:09.83653046 +0000 UTC m=+1015.346241745" Oct 09 15:35:13 crc kubenswrapper[4719]: I1009 15:35:13.839314 4719 generic.go:334] "Generic (PLEG): container finished" podID="1142ae88-4e1f-4957-b735-23d64604712f" containerID="530c831ba0dc13bf93ad6bb6d6a41fd0d4ab1867ad66fc150dbd54682573a6cc" exitCode=0 Oct 09 15:35:13 crc kubenswrapper[4719]: I1009 15:35:13.839423 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-whtzg" event={"ID":"1142ae88-4e1f-4957-b735-23d64604712f","Type":"ContainerDied","Data":"530c831ba0dc13bf93ad6bb6d6a41fd0d4ab1867ad66fc150dbd54682573a6cc"} Oct 09 15:35:13 crc kubenswrapper[4719]: I1009 15:35:13.842911 4719 generic.go:334] "Generic (PLEG): container finished" podID="352eb1e1-d9bb-4184-9512-7cb1e9787edb" containerID="b94bc5a098d2dcd72a5e07df74fded744b44e2c318c7a38a4459bf7b58c239a7" exitCode=0 Oct 09 15:35:13 crc kubenswrapper[4719]: I1009 15:35:13.842961 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-qs479" event={"ID":"352eb1e1-d9bb-4184-9512-7cb1e9787edb","Type":"ContainerDied","Data":"b94bc5a098d2dcd72a5e07df74fded744b44e2c318c7a38a4459bf7b58c239a7"} Oct 09 15:35:18 crc kubenswrapper[4719]: I1009 15:35:18.730280 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-2a57-account-create-9ctdx"] Oct 09 15:35:18 crc kubenswrapper[4719]: E1009 15:35:18.731175 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f560e5ac-9e54-4a07-a0d4-88a94c2004c5" containerName="mariadb-database-create" Oct 09 15:35:18 crc kubenswrapper[4719]: I1009 15:35:18.731192 4719 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="f560e5ac-9e54-4a07-a0d4-88a94c2004c5" containerName="mariadb-database-create" Oct 09 15:35:18 crc kubenswrapper[4719]: E1009 15:35:18.731217 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="527d79a8-0ef1-485b-94a1-eff7ee279a5a" containerName="mariadb-database-create" Oct 09 15:35:18 crc kubenswrapper[4719]: I1009 15:35:18.731225 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="527d79a8-0ef1-485b-94a1-eff7ee279a5a" containerName="mariadb-database-create" Oct 09 15:35:18 crc kubenswrapper[4719]: E1009 15:35:18.731239 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b475a082-9d75-4284-8e09-35f2a96501b9" containerName="mariadb-database-create" Oct 09 15:35:18 crc kubenswrapper[4719]: I1009 15:35:18.731249 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="b475a082-9d75-4284-8e09-35f2a96501b9" containerName="mariadb-database-create" Oct 09 15:35:18 crc kubenswrapper[4719]: I1009 15:35:18.731512 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="527d79a8-0ef1-485b-94a1-eff7ee279a5a" containerName="mariadb-database-create" Oct 09 15:35:18 crc kubenswrapper[4719]: I1009 15:35:18.731536 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="b475a082-9d75-4284-8e09-35f2a96501b9" containerName="mariadb-database-create" Oct 09 15:35:18 crc kubenswrapper[4719]: I1009 15:35:18.731551 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="f560e5ac-9e54-4a07-a0d4-88a94c2004c5" containerName="mariadb-database-create" Oct 09 15:35:18 crc kubenswrapper[4719]: I1009 15:35:18.732193 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-2a57-account-create-9ctdx" Oct 09 15:35:18 crc kubenswrapper[4719]: I1009 15:35:18.736529 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 09 15:35:18 crc kubenswrapper[4719]: I1009 15:35:18.751496 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-2a57-account-create-9ctdx"] Oct 09 15:35:18 crc kubenswrapper[4719]: I1009 15:35:18.804601 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpf2n\" (UniqueName: \"kubernetes.io/projected/ccfcbb78-ac73-4cea-a8c3-f676443f187b-kube-api-access-lpf2n\") pod \"barbican-2a57-account-create-9ctdx\" (UID: \"ccfcbb78-ac73-4cea-a8c3-f676443f187b\") " pod="openstack/barbican-2a57-account-create-9ctdx" Oct 09 15:35:18 crc kubenswrapper[4719]: I1009 15:35:18.905997 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpf2n\" (UniqueName: \"kubernetes.io/projected/ccfcbb78-ac73-4cea-a8c3-f676443f187b-kube-api-access-lpf2n\") pod \"barbican-2a57-account-create-9ctdx\" (UID: \"ccfcbb78-ac73-4cea-a8c3-f676443f187b\") " pod="openstack/barbican-2a57-account-create-9ctdx" Oct 09 15:35:18 crc kubenswrapper[4719]: I1009 15:35:18.926267 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-d4a7-account-create-fnx7r"] Oct 09 15:35:18 crc kubenswrapper[4719]: I1009 15:35:18.928955 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-d4a7-account-create-fnx7r" Oct 09 15:35:18 crc kubenswrapper[4719]: I1009 15:35:18.932330 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 09 15:35:18 crc kubenswrapper[4719]: I1009 15:35:18.936098 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-d4a7-account-create-fnx7r"] Oct 09 15:35:18 crc kubenswrapper[4719]: I1009 15:35:18.947323 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpf2n\" (UniqueName: \"kubernetes.io/projected/ccfcbb78-ac73-4cea-a8c3-f676443f187b-kube-api-access-lpf2n\") pod \"barbican-2a57-account-create-9ctdx\" (UID: \"ccfcbb78-ac73-4cea-a8c3-f676443f187b\") " pod="openstack/barbican-2a57-account-create-9ctdx" Oct 09 15:35:19 crc kubenswrapper[4719]: I1009 15:35:19.008102 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rrx7\" (UniqueName: \"kubernetes.io/projected/c57b6800-99e7-4fc0-a09e-d963495a39c8-kube-api-access-8rrx7\") pod \"cinder-d4a7-account-create-fnx7r\" (UID: \"c57b6800-99e7-4fc0-a09e-d963495a39c8\") " pod="openstack/cinder-d4a7-account-create-fnx7r" Oct 09 15:35:19 crc kubenswrapper[4719]: I1009 15:35:19.063917 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-2a57-account-create-9ctdx" Oct 09 15:35:19 crc kubenswrapper[4719]: I1009 15:35:19.114851 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rrx7\" (UniqueName: \"kubernetes.io/projected/c57b6800-99e7-4fc0-a09e-d963495a39c8-kube-api-access-8rrx7\") pod \"cinder-d4a7-account-create-fnx7r\" (UID: \"c57b6800-99e7-4fc0-a09e-d963495a39c8\") " pod="openstack/cinder-d4a7-account-create-fnx7r" Oct 09 15:35:19 crc kubenswrapper[4719]: I1009 15:35:19.127504 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-60a1-account-create-8lvrq"] Oct 09 15:35:19 crc kubenswrapper[4719]: I1009 15:35:19.128727 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-60a1-account-create-8lvrq" Oct 09 15:35:19 crc kubenswrapper[4719]: I1009 15:35:19.131226 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 09 15:35:19 crc kubenswrapper[4719]: I1009 15:35:19.133078 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rrx7\" (UniqueName: \"kubernetes.io/projected/c57b6800-99e7-4fc0-a09e-d963495a39c8-kube-api-access-8rrx7\") pod \"cinder-d4a7-account-create-fnx7r\" (UID: \"c57b6800-99e7-4fc0-a09e-d963495a39c8\") " pod="openstack/cinder-d4a7-account-create-fnx7r" Oct 09 15:35:19 crc kubenswrapper[4719]: I1009 15:35:19.140823 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-60a1-account-create-8lvrq"] Oct 09 15:35:19 crc kubenswrapper[4719]: I1009 15:35:19.217051 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zjrl\" (UniqueName: \"kubernetes.io/projected/fca1d680-ba45-4687-bce1-4dc7d3e029f4-kube-api-access-9zjrl\") pod \"neutron-60a1-account-create-8lvrq\" (UID: \"fca1d680-ba45-4687-bce1-4dc7d3e029f4\") " 
pod="openstack/neutron-60a1-account-create-8lvrq" Oct 09 15:35:19 crc kubenswrapper[4719]: I1009 15:35:19.291297 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-d4a7-account-create-fnx7r" Oct 09 15:35:19 crc kubenswrapper[4719]: I1009 15:35:19.318643 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zjrl\" (UniqueName: \"kubernetes.io/projected/fca1d680-ba45-4687-bce1-4dc7d3e029f4-kube-api-access-9zjrl\") pod \"neutron-60a1-account-create-8lvrq\" (UID: \"fca1d680-ba45-4687-bce1-4dc7d3e029f4\") " pod="openstack/neutron-60a1-account-create-8lvrq" Oct 09 15:35:19 crc kubenswrapper[4719]: I1009 15:35:19.340515 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zjrl\" (UniqueName: \"kubernetes.io/projected/fca1d680-ba45-4687-bce1-4dc7d3e029f4-kube-api-access-9zjrl\") pod \"neutron-60a1-account-create-8lvrq\" (UID: \"fca1d680-ba45-4687-bce1-4dc7d3e029f4\") " pod="openstack/neutron-60a1-account-create-8lvrq" Oct 09 15:35:19 crc kubenswrapper[4719]: I1009 15:35:19.496711 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-60a1-account-create-8lvrq" Oct 09 15:35:21 crc kubenswrapper[4719]: I1009 15:35:21.634319 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-whtzg" Oct 09 15:35:21 crc kubenswrapper[4719]: I1009 15:35:21.644408 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-qs479" Oct 09 15:35:21 crc kubenswrapper[4719]: I1009 15:35:21.671708 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jr66s\" (UniqueName: \"kubernetes.io/projected/1142ae88-4e1f-4957-b735-23d64604712f-kube-api-access-jr66s\") pod \"1142ae88-4e1f-4957-b735-23d64604712f\" (UID: \"1142ae88-4e1f-4957-b735-23d64604712f\") " Oct 09 15:35:21 crc kubenswrapper[4719]: I1009 15:35:21.671894 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1142ae88-4e1f-4957-b735-23d64604712f-combined-ca-bundle\") pod \"1142ae88-4e1f-4957-b735-23d64604712f\" (UID: \"1142ae88-4e1f-4957-b735-23d64604712f\") " Oct 09 15:35:21 crc kubenswrapper[4719]: I1009 15:35:21.671916 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1142ae88-4e1f-4957-b735-23d64604712f-config-data\") pod \"1142ae88-4e1f-4957-b735-23d64604712f\" (UID: \"1142ae88-4e1f-4957-b735-23d64604712f\") " Oct 09 15:35:21 crc kubenswrapper[4719]: I1009 15:35:21.679143 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1142ae88-4e1f-4957-b735-23d64604712f-kube-api-access-jr66s" (OuterVolumeSpecName: "kube-api-access-jr66s") pod "1142ae88-4e1f-4957-b735-23d64604712f" (UID: "1142ae88-4e1f-4957-b735-23d64604712f"). InnerVolumeSpecName "kube-api-access-jr66s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:35:21 crc kubenswrapper[4719]: I1009 15:35:21.703600 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1142ae88-4e1f-4957-b735-23d64604712f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1142ae88-4e1f-4957-b735-23d64604712f" (UID: "1142ae88-4e1f-4957-b735-23d64604712f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:35:21 crc kubenswrapper[4719]: I1009 15:35:21.729611 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1142ae88-4e1f-4957-b735-23d64604712f-config-data" (OuterVolumeSpecName: "config-data") pod "1142ae88-4e1f-4957-b735-23d64604712f" (UID: "1142ae88-4e1f-4957-b735-23d64604712f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:35:21 crc kubenswrapper[4719]: I1009 15:35:21.773513 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/352eb1e1-d9bb-4184-9512-7cb1e9787edb-db-sync-config-data\") pod \"352eb1e1-d9bb-4184-9512-7cb1e9787edb\" (UID: \"352eb1e1-d9bb-4184-9512-7cb1e9787edb\") " Oct 09 15:35:21 crc kubenswrapper[4719]: I1009 15:35:21.773661 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6gw6\" (UniqueName: \"kubernetes.io/projected/352eb1e1-d9bb-4184-9512-7cb1e9787edb-kube-api-access-x6gw6\") pod \"352eb1e1-d9bb-4184-9512-7cb1e9787edb\" (UID: \"352eb1e1-d9bb-4184-9512-7cb1e9787edb\") " Oct 09 15:35:21 crc kubenswrapper[4719]: I1009 15:35:21.773804 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/352eb1e1-d9bb-4184-9512-7cb1e9787edb-combined-ca-bundle\") pod \"352eb1e1-d9bb-4184-9512-7cb1e9787edb\" (UID: \"352eb1e1-d9bb-4184-9512-7cb1e9787edb\") " Oct 09 15:35:21 crc kubenswrapper[4719]: I1009 15:35:21.773837 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/352eb1e1-d9bb-4184-9512-7cb1e9787edb-config-data\") pod \"352eb1e1-d9bb-4184-9512-7cb1e9787edb\" (UID: \"352eb1e1-d9bb-4184-9512-7cb1e9787edb\") " Oct 09 15:35:21 crc kubenswrapper[4719]: I1009 15:35:21.774207 4719 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jr66s\" (UniqueName: \"kubernetes.io/projected/1142ae88-4e1f-4957-b735-23d64604712f-kube-api-access-jr66s\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:21 crc kubenswrapper[4719]: I1009 15:35:21.774227 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1142ae88-4e1f-4957-b735-23d64604712f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:21 crc kubenswrapper[4719]: I1009 15:35:21.774591 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1142ae88-4e1f-4957-b735-23d64604712f-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:21 crc kubenswrapper[4719]: I1009 15:35:21.778519 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/352eb1e1-d9bb-4184-9512-7cb1e9787edb-kube-api-access-x6gw6" (OuterVolumeSpecName: "kube-api-access-x6gw6") pod "352eb1e1-d9bb-4184-9512-7cb1e9787edb" (UID: "352eb1e1-d9bb-4184-9512-7cb1e9787edb"). InnerVolumeSpecName "kube-api-access-x6gw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:35:21 crc kubenswrapper[4719]: I1009 15:35:21.780098 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/352eb1e1-d9bb-4184-9512-7cb1e9787edb-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "352eb1e1-d9bb-4184-9512-7cb1e9787edb" (UID: "352eb1e1-d9bb-4184-9512-7cb1e9787edb"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:35:21 crc kubenswrapper[4719]: I1009 15:35:21.800311 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/352eb1e1-d9bb-4184-9512-7cb1e9787edb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "352eb1e1-d9bb-4184-9512-7cb1e9787edb" (UID: "352eb1e1-d9bb-4184-9512-7cb1e9787edb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:35:21 crc kubenswrapper[4719]: I1009 15:35:21.821559 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/352eb1e1-d9bb-4184-9512-7cb1e9787edb-config-data" (OuterVolumeSpecName: "config-data") pod "352eb1e1-d9bb-4184-9512-7cb1e9787edb" (UID: "352eb1e1-d9bb-4184-9512-7cb1e9787edb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:35:21 crc kubenswrapper[4719]: I1009 15:35:21.876656 4719 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/352eb1e1-d9bb-4184-9512-7cb1e9787edb-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:21 crc kubenswrapper[4719]: I1009 15:35:21.876696 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6gw6\" (UniqueName: \"kubernetes.io/projected/352eb1e1-d9bb-4184-9512-7cb1e9787edb-kube-api-access-x6gw6\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:21 crc kubenswrapper[4719]: I1009 15:35:21.876707 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/352eb1e1-d9bb-4184-9512-7cb1e9787edb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:21 crc kubenswrapper[4719]: I1009 15:35:21.876716 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/352eb1e1-d9bb-4184-9512-7cb1e9787edb-config-data\") on node \"crc\" 
DevicePath \"\"" Oct 09 15:35:21 crc kubenswrapper[4719]: I1009 15:35:21.912931 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-qs479" Oct 09 15:35:21 crc kubenswrapper[4719]: I1009 15:35:21.912934 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-qs479" event={"ID":"352eb1e1-d9bb-4184-9512-7cb1e9787edb","Type":"ContainerDied","Data":"9ac54d5b90796387346d368ba178c1f43cac34b2e9be51aeb365e1f553681936"} Oct 09 15:35:21 crc kubenswrapper[4719]: I1009 15:35:21.913520 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ac54d5b90796387346d368ba178c1f43cac34b2e9be51aeb365e1f553681936" Oct 09 15:35:21 crc kubenswrapper[4719]: I1009 15:35:21.914896 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-whtzg" event={"ID":"1142ae88-4e1f-4957-b735-23d64604712f","Type":"ContainerDied","Data":"a0e304d424e56d0ff397e262ee1d14bf08fd7563e1388617ccbb39381e02adfd"} Oct 09 15:35:21 crc kubenswrapper[4719]: I1009 15:35:21.914922 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0e304d424e56d0ff397e262ee1d14bf08fd7563e1388617ccbb39381e02adfd" Oct 09 15:35:21 crc kubenswrapper[4719]: I1009 15:35:21.914973 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-whtzg" Oct 09 15:35:21 crc kubenswrapper[4719]: I1009 15:35:21.957920 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-60a1-account-create-8lvrq"] Oct 09 15:35:22 crc kubenswrapper[4719]: W1009 15:35:22.008654 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfca1d680_ba45_4687_bce1_4dc7d3e029f4.slice/crio-3fb8471ea69e4706145636f822454e2ca864f3afc17e44261715b801de8f08a4 WatchSource:0}: Error finding container 3fb8471ea69e4706145636f822454e2ca864f3afc17e44261715b801de8f08a4: Status 404 returned error can't find the container with id 3fb8471ea69e4706145636f822454e2ca864f3afc17e44261715b801de8f08a4 Oct 09 15:35:22 crc kubenswrapper[4719]: I1009 15:35:22.049677 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-d4a7-account-create-fnx7r"] Oct 09 15:35:22 crc kubenswrapper[4719]: I1009 15:35:22.147179 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-2a57-account-create-9ctdx"] Oct 09 15:35:22 crc kubenswrapper[4719]: W1009 15:35:22.148667 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podccfcbb78_ac73_4cea_a8c3_f676443f187b.slice/crio-bd64ff2096a2bc94c030b7c6b41c4a1380d0b36e22e593db72a9a5599540d3e6 WatchSource:0}: Error finding container bd64ff2096a2bc94c030b7c6b41c4a1380d0b36e22e593db72a9a5599540d3e6: Status 404 returned error can't find the container with id bd64ff2096a2bc94c030b7c6b41c4a1380d0b36e22e593db72a9a5599540d3e6 Oct 09 15:35:22 crc kubenswrapper[4719]: I1009 15:35:22.840146 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-wn6ns"] Oct 09 15:35:22 crc kubenswrapper[4719]: E1009 15:35:22.842182 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1142ae88-4e1f-4957-b735-23d64604712f" 
containerName="keystone-db-sync" Oct 09 15:35:22 crc kubenswrapper[4719]: I1009 15:35:22.842237 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="1142ae88-4e1f-4957-b735-23d64604712f" containerName="keystone-db-sync" Oct 09 15:35:22 crc kubenswrapper[4719]: E1009 15:35:22.842254 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="352eb1e1-d9bb-4184-9512-7cb1e9787edb" containerName="watcher-db-sync" Oct 09 15:35:22 crc kubenswrapper[4719]: I1009 15:35:22.842263 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="352eb1e1-d9bb-4184-9512-7cb1e9787edb" containerName="watcher-db-sync" Oct 09 15:35:22 crc kubenswrapper[4719]: I1009 15:35:22.842698 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="1142ae88-4e1f-4957-b735-23d64604712f" containerName="keystone-db-sync" Oct 09 15:35:22 crc kubenswrapper[4719]: I1009 15:35:22.842730 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="352eb1e1-d9bb-4184-9512-7cb1e9787edb" containerName="watcher-db-sync" Oct 09 15:35:22 crc kubenswrapper[4719]: I1009 15:35:22.843693 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-wn6ns" Oct 09 15:35:22 crc kubenswrapper[4719]: I1009 15:35:22.848316 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 09 15:35:22 crc kubenswrapper[4719]: I1009 15:35:22.849164 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 09 15:35:22 crc kubenswrapper[4719]: I1009 15:35:22.849330 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-2vsx5" Oct 09 15:35:22 crc kubenswrapper[4719]: I1009 15:35:22.849538 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 09 15:35:22 crc kubenswrapper[4719]: I1009 15:35:22.885160 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-wn6ns"] Oct 09 15:35:22 crc kubenswrapper[4719]: I1009 15:35:22.901051 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bf7887b65-ql45f"] Oct 09 15:35:22 crc kubenswrapper[4719]: I1009 15:35:22.912173 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj2t7\" (UniqueName: \"kubernetes.io/projected/f4998573-c12b-4a90-a58b-3af8be611b96-kube-api-access-pj2t7\") pod \"keystone-bootstrap-wn6ns\" (UID: \"f4998573-c12b-4a90-a58b-3af8be611b96\") " pod="openstack/keystone-bootstrap-wn6ns" Oct 09 15:35:22 crc kubenswrapper[4719]: I1009 15:35:22.912386 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4998573-c12b-4a90-a58b-3af8be611b96-config-data\") pod \"keystone-bootstrap-wn6ns\" (UID: \"f4998573-c12b-4a90-a58b-3af8be611b96\") " pod="openstack/keystone-bootstrap-wn6ns" Oct 09 15:35:22 crc kubenswrapper[4719]: I1009 15:35:22.912440 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4998573-c12b-4a90-a58b-3af8be611b96-scripts\") pod \"keystone-bootstrap-wn6ns\" (UID: \"f4998573-c12b-4a90-a58b-3af8be611b96\") " pod="openstack/keystone-bootstrap-wn6ns" Oct 09 15:35:22 crc kubenswrapper[4719]: I1009 15:35:22.912516 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f4998573-c12b-4a90-a58b-3af8be611b96-fernet-keys\") pod \"keystone-bootstrap-wn6ns\" (UID: \"f4998573-c12b-4a90-a58b-3af8be611b96\") " pod="openstack/keystone-bootstrap-wn6ns" Oct 09 15:35:22 crc kubenswrapper[4719]: I1009 15:35:22.912579 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f4998573-c12b-4a90-a58b-3af8be611b96-credential-keys\") pod \"keystone-bootstrap-wn6ns\" (UID: \"f4998573-c12b-4a90-a58b-3af8be611b96\") " pod="openstack/keystone-bootstrap-wn6ns" Oct 09 15:35:22 crc kubenswrapper[4719]: I1009 15:35:22.912608 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4998573-c12b-4a90-a58b-3af8be611b96-combined-ca-bundle\") pod \"keystone-bootstrap-wn6ns\" (UID: \"f4998573-c12b-4a90-a58b-3af8be611b96\") " pod="openstack/keystone-bootstrap-wn6ns" Oct 09 15:35:22 crc kubenswrapper[4719]: I1009 15:35:22.924160 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bf7887b65-ql45f" Oct 09 15:35:22 crc kubenswrapper[4719]: I1009 15:35:22.965166 4719 generic.go:334] "Generic (PLEG): container finished" podID="ccfcbb78-ac73-4cea-a8c3-f676443f187b" containerID="65ad4b7954f202b5cb00ecc8ad75a27ee18084f1c55c8114d5015d1a1dff8f24" exitCode=0 Oct 09 15:35:22 crc kubenswrapper[4719]: I1009 15:35:22.965268 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2a57-account-create-9ctdx" event={"ID":"ccfcbb78-ac73-4cea-a8c3-f676443f187b","Type":"ContainerDied","Data":"65ad4b7954f202b5cb00ecc8ad75a27ee18084f1c55c8114d5015d1a1dff8f24"} Oct 09 15:35:22 crc kubenswrapper[4719]: I1009 15:35:22.965297 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2a57-account-create-9ctdx" event={"ID":"ccfcbb78-ac73-4cea-a8c3-f676443f187b","Type":"ContainerStarted","Data":"bd64ff2096a2bc94c030b7c6b41c4a1380d0b36e22e593db72a9a5599540d3e6"} Oct 09 15:35:22 crc kubenswrapper[4719]: I1009 15:35:22.981555 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xs6f4" event={"ID":"52932375-ade4-4056-a4f8-6758db0df52f","Type":"ContainerStarted","Data":"9fd313329586b35b942fe6233e6f3512d120cb80c30bd770fb6e47f079d5fa27"} Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.003693 4719 generic.go:334] "Generic (PLEG): container finished" podID="fca1d680-ba45-4687-bce1-4dc7d3e029f4" containerID="eb4a3aadc347171a450ed9511fb7da84e6b26b180fcca0960a7cccab31898403" exitCode=0 Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.003823 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-60a1-account-create-8lvrq" event={"ID":"fca1d680-ba45-4687-bce1-4dc7d3e029f4","Type":"ContainerDied","Data":"eb4a3aadc347171a450ed9511fb7da84e6b26b180fcca0960a7cccab31898403"} Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.003857 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-60a1-account-create-8lvrq" event={"ID":"fca1d680-ba45-4687-bce1-4dc7d3e029f4","Type":"ContainerStarted","Data":"3fb8471ea69e4706145636f822454e2ca864f3afc17e44261715b801de8f08a4"} Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.027630 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4998573-c12b-4a90-a58b-3af8be611b96-config-data\") pod \"keystone-bootstrap-wn6ns\" (UID: \"f4998573-c12b-4a90-a58b-3af8be611b96\") " pod="openstack/keystone-bootstrap-wn6ns" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.028234 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4998573-c12b-4a90-a58b-3af8be611b96-scripts\") pod \"keystone-bootstrap-wn6ns\" (UID: \"f4998573-c12b-4a90-a58b-3af8be611b96\") " pod="openstack/keystone-bootstrap-wn6ns" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.028347 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/196c6929-25a8-4376-a13d-b3bedc31a611-dns-svc\") pod \"dnsmasq-dns-bf7887b65-ql45f\" (UID: \"196c6929-25a8-4376-a13d-b3bedc31a611\") " pod="openstack/dnsmasq-dns-bf7887b65-ql45f" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.028464 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/196c6929-25a8-4376-a13d-b3bedc31a611-ovsdbserver-nb\") pod \"dnsmasq-dns-bf7887b65-ql45f\" (UID: \"196c6929-25a8-4376-a13d-b3bedc31a611\") " pod="openstack/dnsmasq-dns-bf7887b65-ql45f" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.028527 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f4998573-c12b-4a90-a58b-3af8be611b96-fernet-keys\") pod 
\"keystone-bootstrap-wn6ns\" (UID: \"f4998573-c12b-4a90-a58b-3af8be611b96\") " pod="openstack/keystone-bootstrap-wn6ns" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.028628 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f4998573-c12b-4a90-a58b-3af8be611b96-credential-keys\") pod \"keystone-bootstrap-wn6ns\" (UID: \"f4998573-c12b-4a90-a58b-3af8be611b96\") " pod="openstack/keystone-bootstrap-wn6ns" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.028702 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4998573-c12b-4a90-a58b-3af8be611b96-combined-ca-bundle\") pod \"keystone-bootstrap-wn6ns\" (UID: \"f4998573-c12b-4a90-a58b-3af8be611b96\") " pod="openstack/keystone-bootstrap-wn6ns" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.028825 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/196c6929-25a8-4376-a13d-b3bedc31a611-dns-swift-storage-0\") pod \"dnsmasq-dns-bf7887b65-ql45f\" (UID: \"196c6929-25a8-4376-a13d-b3bedc31a611\") " pod="openstack/dnsmasq-dns-bf7887b65-ql45f" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.028898 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj2t7\" (UniqueName: \"kubernetes.io/projected/f4998573-c12b-4a90-a58b-3af8be611b96-kube-api-access-pj2t7\") pod \"keystone-bootstrap-wn6ns\" (UID: \"f4998573-c12b-4a90-a58b-3af8be611b96\") " pod="openstack/keystone-bootstrap-wn6ns" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.028964 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/196c6929-25a8-4376-a13d-b3bedc31a611-config\") pod \"dnsmasq-dns-bf7887b65-ql45f\" (UID: 
\"196c6929-25a8-4376-a13d-b3bedc31a611\") " pod="openstack/dnsmasq-dns-bf7887b65-ql45f" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.029206 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8258\" (UniqueName: \"kubernetes.io/projected/196c6929-25a8-4376-a13d-b3bedc31a611-kube-api-access-s8258\") pod \"dnsmasq-dns-bf7887b65-ql45f\" (UID: \"196c6929-25a8-4376-a13d-b3bedc31a611\") " pod="openstack/dnsmasq-dns-bf7887b65-ql45f" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.029299 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/196c6929-25a8-4376-a13d-b3bedc31a611-ovsdbserver-sb\") pod \"dnsmasq-dns-bf7887b65-ql45f\" (UID: \"196c6929-25a8-4376-a13d-b3bedc31a611\") " pod="openstack/dnsmasq-dns-bf7887b65-ql45f" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.035334 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4998573-c12b-4a90-a58b-3af8be611b96-config-data\") pod \"keystone-bootstrap-wn6ns\" (UID: \"f4998573-c12b-4a90-a58b-3af8be611b96\") " pod="openstack/keystone-bootstrap-wn6ns" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.035688 4719 generic.go:334] "Generic (PLEG): container finished" podID="c57b6800-99e7-4fc0-a09e-d963495a39c8" containerID="6ef3888a3c2854fec9e0745b81053353ae3902ae05b20f400eabf7db3560f157" exitCode=0 Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.035799 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bf7887b65-ql45f"] Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.035842 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d4a7-account-create-fnx7r" 
event={"ID":"c57b6800-99e7-4fc0-a09e-d963495a39c8","Type":"ContainerDied","Data":"6ef3888a3c2854fec9e0745b81053353ae3902ae05b20f400eabf7db3560f157"} Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.035870 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d4a7-account-create-fnx7r" event={"ID":"c57b6800-99e7-4fc0-a09e-d963495a39c8","Type":"ContainerStarted","Data":"56e61994146bda3684eb8c458313f06a7cd24fcc5765bcb22ec604aa623bdce1"} Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.045385 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4998573-c12b-4a90-a58b-3af8be611b96-scripts\") pod \"keystone-bootstrap-wn6ns\" (UID: \"f4998573-c12b-4a90-a58b-3af8be611b96\") " pod="openstack/keystone-bootstrap-wn6ns" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.045731 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f4998573-c12b-4a90-a58b-3af8be611b96-credential-keys\") pod \"keystone-bootstrap-wn6ns\" (UID: \"f4998573-c12b-4a90-a58b-3af8be611b96\") " pod="openstack/keystone-bootstrap-wn6ns" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.054059 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4998573-c12b-4a90-a58b-3af8be611b96-combined-ca-bundle\") pod \"keystone-bootstrap-wn6ns\" (UID: \"f4998573-c12b-4a90-a58b-3af8be611b96\") " pod="openstack/keystone-bootstrap-wn6ns" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.072067 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f4998573-c12b-4a90-a58b-3af8be611b96-fernet-keys\") pod \"keystone-bootstrap-wn6ns\" (UID: \"f4998573-c12b-4a90-a58b-3af8be611b96\") " pod="openstack/keystone-bootstrap-wn6ns" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 
15:35:23.082026 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj2t7\" (UniqueName: \"kubernetes.io/projected/f4998573-c12b-4a90-a58b-3af8be611b96-kube-api-access-pj2t7\") pod \"keystone-bootstrap-wn6ns\" (UID: \"f4998573-c12b-4a90-a58b-3af8be611b96\") " pod="openstack/keystone-bootstrap-wn6ns" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.106652 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-xs6f4" podStartSLOduration=7.703730064 podStartE2EDuration="20.106635086s" podCreationTimestamp="2025-10-09 15:35:03 +0000 UTC" firstStartedPulling="2025-10-09 15:35:09.770185564 +0000 UTC m=+1015.279896849" lastFinishedPulling="2025-10-09 15:35:22.173090586 +0000 UTC m=+1027.682801871" observedRunningTime="2025-10-09 15:35:23.026524603 +0000 UTC m=+1028.536235888" watchObservedRunningTime="2025-10-09 15:35:23.106635086 +0000 UTC m=+1028.616346371" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.127189 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.128420 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.132683 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/196c6929-25a8-4376-a13d-b3bedc31a611-dns-swift-storage-0\") pod \"dnsmasq-dns-bf7887b65-ql45f\" (UID: \"196c6929-25a8-4376-a13d-b3bedc31a611\") " pod="openstack/dnsmasq-dns-bf7887b65-ql45f" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.132872 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/196c6929-25a8-4376-a13d-b3bedc31a611-config\") pod \"dnsmasq-dns-bf7887b65-ql45f\" (UID: \"196c6929-25a8-4376-a13d-b3bedc31a611\") " pod="openstack/dnsmasq-dns-bf7887b65-ql45f" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.132965 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8258\" (UniqueName: \"kubernetes.io/projected/196c6929-25a8-4376-a13d-b3bedc31a611-kube-api-access-s8258\") pod \"dnsmasq-dns-bf7887b65-ql45f\" (UID: \"196c6929-25a8-4376-a13d-b3bedc31a611\") " pod="openstack/dnsmasq-dns-bf7887b65-ql45f" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.133076 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/196c6929-25a8-4376-a13d-b3bedc31a611-ovsdbserver-sb\") pod \"dnsmasq-dns-bf7887b65-ql45f\" (UID: \"196c6929-25a8-4376-a13d-b3bedc31a611\") " pod="openstack/dnsmasq-dns-bf7887b65-ql45f" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.133238 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/196c6929-25a8-4376-a13d-b3bedc31a611-dns-svc\") pod \"dnsmasq-dns-bf7887b65-ql45f\" (UID: \"196c6929-25a8-4376-a13d-b3bedc31a611\") " pod="openstack/dnsmasq-dns-bf7887b65-ql45f" Oct 
09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.133333 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/196c6929-25a8-4376-a13d-b3bedc31a611-ovsdbserver-nb\") pod \"dnsmasq-dns-bf7887b65-ql45f\" (UID: \"196c6929-25a8-4376-a13d-b3bedc31a611\") " pod="openstack/dnsmasq-dns-bf7887b65-ql45f" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.134365 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/196c6929-25a8-4376-a13d-b3bedc31a611-ovsdbserver-nb\") pod \"dnsmasq-dns-bf7887b65-ql45f\" (UID: \"196c6929-25a8-4376-a13d-b3bedc31a611\") " pod="openstack/dnsmasq-dns-bf7887b65-ql45f" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.135927 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/196c6929-25a8-4376-a13d-b3bedc31a611-ovsdbserver-sb\") pod \"dnsmasq-dns-bf7887b65-ql45f\" (UID: \"196c6929-25a8-4376-a13d-b3bedc31a611\") " pod="openstack/dnsmasq-dns-bf7887b65-ql45f" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.136270 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/196c6929-25a8-4376-a13d-b3bedc31a611-dns-swift-storage-0\") pod \"dnsmasq-dns-bf7887b65-ql45f\" (UID: \"196c6929-25a8-4376-a13d-b3bedc31a611\") " pod="openstack/dnsmasq-dns-bf7887b65-ql45f" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.136379 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-sjklv" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.136632 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.136727 4719 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/horizon-b66884797-vvjz5"] Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.136942 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/196c6929-25a8-4376-a13d-b3bedc31a611-config\") pod \"dnsmasq-dns-bf7887b65-ql45f\" (UID: \"196c6929-25a8-4376-a13d-b3bedc31a611\") " pod="openstack/dnsmasq-dns-bf7887b65-ql45f" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.137623 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/196c6929-25a8-4376-a13d-b3bedc31a611-dns-svc\") pod \"dnsmasq-dns-bf7887b65-ql45f\" (UID: \"196c6929-25a8-4376-a13d-b3bedc31a611\") " pod="openstack/dnsmasq-dns-bf7887b65-ql45f" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.138176 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b66884797-vvjz5" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.171881 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-wn6ns" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.172927 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.172954 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.173174 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.173283 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-mjvl9" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.187057 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.224092 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8258\" (UniqueName: \"kubernetes.io/projected/196c6929-25a8-4376-a13d-b3bedc31a611-kube-api-access-s8258\") pod \"dnsmasq-dns-bf7887b65-ql45f\" (UID: \"196c6929-25a8-4376-a13d-b3bedc31a611\") " pod="openstack/dnsmasq-dns-bf7887b65-ql45f" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.238484 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8a94c6c0-6e99-4a00-bdde-fe1e7927af5b-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"8a94c6c0-6e99-4a00-bdde-fe1e7927af5b\") " pod="openstack/watcher-decision-engine-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.238754 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a94c6c0-6e99-4a00-bdde-fe1e7927af5b-config-data\") pod \"watcher-decision-engine-0\" (UID: 
\"8a94c6c0-6e99-4a00-bdde-fe1e7927af5b\") " pod="openstack/watcher-decision-engine-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.238802 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/19fb157d-30e1-4432-9d62-0fa70d464148-horizon-secret-key\") pod \"horizon-b66884797-vvjz5\" (UID: \"19fb157d-30e1-4432-9d62-0fa70d464148\") " pod="openstack/horizon-b66884797-vvjz5" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.238844 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19fb157d-30e1-4432-9d62-0fa70d464148-scripts\") pod \"horizon-b66884797-vvjz5\" (UID: \"19fb157d-30e1-4432-9d62-0fa70d464148\") " pod="openstack/horizon-b66884797-vvjz5" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.238871 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a94c6c0-6e99-4a00-bdde-fe1e7927af5b-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"8a94c6c0-6e99-4a00-bdde-fe1e7927af5b\") " pod="openstack/watcher-decision-engine-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.238892 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a94c6c0-6e99-4a00-bdde-fe1e7927af5b-logs\") pod \"watcher-decision-engine-0\" (UID: \"8a94c6c0-6e99-4a00-bdde-fe1e7927af5b\") " pod="openstack/watcher-decision-engine-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.238925 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25lrr\" (UniqueName: \"kubernetes.io/projected/8a94c6c0-6e99-4a00-bdde-fe1e7927af5b-kube-api-access-25lrr\") pod \"watcher-decision-engine-0\" (UID: 
\"8a94c6c0-6e99-4a00-bdde-fe1e7927af5b\") " pod="openstack/watcher-decision-engine-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.238953 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19fb157d-30e1-4432-9d62-0fa70d464148-config-data\") pod \"horizon-b66884797-vvjz5\" (UID: \"19fb157d-30e1-4432-9d62-0fa70d464148\") " pod="openstack/horizon-b66884797-vvjz5" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.238967 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls8s2\" (UniqueName: \"kubernetes.io/projected/19fb157d-30e1-4432-9d62-0fa70d464148-kube-api-access-ls8s2\") pod \"horizon-b66884797-vvjz5\" (UID: \"19fb157d-30e1-4432-9d62-0fa70d464148\") " pod="openstack/horizon-b66884797-vvjz5" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.239033 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19fb157d-30e1-4432-9d62-0fa70d464148-logs\") pod \"horizon-b66884797-vvjz5\" (UID: \"19fb157d-30e1-4432-9d62-0fa70d464148\") " pod="openstack/horizon-b66884797-vvjz5" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.249441 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-b66884797-vvjz5"] Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.285792 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bf7887b65-ql45f" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.293953 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.295235 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.303452 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.331570 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.334214 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.340036 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.340329 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.346845 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19fb157d-30e1-4432-9d62-0fa70d464148-scripts\") pod \"horizon-b66884797-vvjz5\" (UID: \"19fb157d-30e1-4432-9d62-0fa70d464148\") " pod="openstack/horizon-b66884797-vvjz5" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.347016 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a94c6c0-6e99-4a00-bdde-fe1e7927af5b-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"8a94c6c0-6e99-4a00-bdde-fe1e7927af5b\") " pod="openstack/watcher-decision-engine-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.347140 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a94c6c0-6e99-4a00-bdde-fe1e7927af5b-logs\") pod \"watcher-decision-engine-0\" (UID: \"8a94c6c0-6e99-4a00-bdde-fe1e7927af5b\") " pod="openstack/watcher-decision-engine-0" Oct 09 
15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.347243 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25lrr\" (UniqueName: \"kubernetes.io/projected/8a94c6c0-6e99-4a00-bdde-fe1e7927af5b-kube-api-access-25lrr\") pod \"watcher-decision-engine-0\" (UID: \"8a94c6c0-6e99-4a00-bdde-fe1e7927af5b\") " pod="openstack/watcher-decision-engine-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.347319 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19fb157d-30e1-4432-9d62-0fa70d464148-config-data\") pod \"horizon-b66884797-vvjz5\" (UID: \"19fb157d-30e1-4432-9d62-0fa70d464148\") " pod="openstack/horizon-b66884797-vvjz5" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.347396 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls8s2\" (UniqueName: \"kubernetes.io/projected/19fb157d-30e1-4432-9d62-0fa70d464148-kube-api-access-ls8s2\") pod \"horizon-b66884797-vvjz5\" (UID: \"19fb157d-30e1-4432-9d62-0fa70d464148\") " pod="openstack/horizon-b66884797-vvjz5" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.347545 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19fb157d-30e1-4432-9d62-0fa70d464148-logs\") pod \"horizon-b66884797-vvjz5\" (UID: \"19fb157d-30e1-4432-9d62-0fa70d464148\") " pod="openstack/horizon-b66884797-vvjz5" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.347786 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8a94c6c0-6e99-4a00-bdde-fe1e7927af5b-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"8a94c6c0-6e99-4a00-bdde-fe1e7927af5b\") " pod="openstack/watcher-decision-engine-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.347885 4719 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a94c6c0-6e99-4a00-bdde-fe1e7927af5b-config-data\") pod \"watcher-decision-engine-0\" (UID: \"8a94c6c0-6e99-4a00-bdde-fe1e7927af5b\") " pod="openstack/watcher-decision-engine-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.347988 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/19fb157d-30e1-4432-9d62-0fa70d464148-horizon-secret-key\") pod \"horizon-b66884797-vvjz5\" (UID: \"19fb157d-30e1-4432-9d62-0fa70d464148\") " pod="openstack/horizon-b66884797-vvjz5" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.350622 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19fb157d-30e1-4432-9d62-0fa70d464148-config-data\") pod \"horizon-b66884797-vvjz5\" (UID: \"19fb157d-30e1-4432-9d62-0fa70d464148\") " pod="openstack/horizon-b66884797-vvjz5" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.350804 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19fb157d-30e1-4432-9d62-0fa70d464148-logs\") pod \"horizon-b66884797-vvjz5\" (UID: \"19fb157d-30e1-4432-9d62-0fa70d464148\") " pod="openstack/horizon-b66884797-vvjz5" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.351077 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19fb157d-30e1-4432-9d62-0fa70d464148-scripts\") pod \"horizon-b66884797-vvjz5\" (UID: \"19fb157d-30e1-4432-9d62-0fa70d464148\") " pod="openstack/horizon-b66884797-vvjz5" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.367770 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: 
\"kubernetes.io/secret/8a94c6c0-6e99-4a00-bdde-fe1e7927af5b-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"8a94c6c0-6e99-4a00-bdde-fe1e7927af5b\") " pod="openstack/watcher-decision-engine-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.367861 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.368156 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a94c6c0-6e99-4a00-bdde-fe1e7927af5b-logs\") pod \"watcher-decision-engine-0\" (UID: \"8a94c6c0-6e99-4a00-bdde-fe1e7927af5b\") " pod="openstack/watcher-decision-engine-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.384922 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/19fb157d-30e1-4432-9d62-0fa70d464148-horizon-secret-key\") pod \"horizon-b66884797-vvjz5\" (UID: \"19fb157d-30e1-4432-9d62-0fa70d464148\") " pod="openstack/horizon-b66884797-vvjz5" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.408595 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.434077 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25lrr\" (UniqueName: \"kubernetes.io/projected/8a94c6c0-6e99-4a00-bdde-fe1e7927af5b-kube-api-access-25lrr\") pod \"watcher-decision-engine-0\" (UID: \"8a94c6c0-6e99-4a00-bdde-fe1e7927af5b\") " pod="openstack/watcher-decision-engine-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.437999 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a94c6c0-6e99-4a00-bdde-fe1e7927af5b-config-data\") pod \"watcher-decision-engine-0\" (UID: \"8a94c6c0-6e99-4a00-bdde-fe1e7927af5b\") " pod="openstack/watcher-decision-engine-0" 
Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.438620 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls8s2\" (UniqueName: \"kubernetes.io/projected/19fb157d-30e1-4432-9d62-0fa70d464148-kube-api-access-ls8s2\") pod \"horizon-b66884797-vvjz5\" (UID: \"19fb157d-30e1-4432-9d62-0fa70d464148\") " pod="openstack/horizon-b66884797-vvjz5" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.453202 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-scripts\") pod \"ceilometer-0\" (UID: \"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a\") " pod="openstack/ceilometer-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.453245 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/938e3d88-39e3-4f8a-8920-0fdfcf98d5e5-logs\") pod \"watcher-applier-0\" (UID: \"938e3d88-39e3-4f8a-8920-0fdfcf98d5e5\") " pod="openstack/watcher-applier-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.453271 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj48h\" (UniqueName: \"kubernetes.io/projected/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-kube-api-access-kj48h\") pod \"ceilometer-0\" (UID: \"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a\") " pod="openstack/ceilometer-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.453293 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/938e3d88-39e3-4f8a-8920-0fdfcf98d5e5-config-data\") pod \"watcher-applier-0\" (UID: \"938e3d88-39e3-4f8a-8920-0fdfcf98d5e5\") " pod="openstack/watcher-applier-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.453323 4719 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-run-httpd\") pod \"ceilometer-0\" (UID: \"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a\") " pod="openstack/ceilometer-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.453362 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-config-data\") pod \"ceilometer-0\" (UID: \"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a\") " pod="openstack/ceilometer-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.453399 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/938e3d88-39e3-4f8a-8920-0fdfcf98d5e5-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"938e3d88-39e3-4f8a-8920-0fdfcf98d5e5\") " pod="openstack/watcher-applier-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.453418 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-log-httpd\") pod \"ceilometer-0\" (UID: \"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a\") " pod="openstack/ceilometer-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.453478 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a\") " pod="openstack/ceilometer-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.453496 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a\") " pod="openstack/ceilometer-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.453542 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5lfw\" (UniqueName: \"kubernetes.io/projected/938e3d88-39e3-4f8a-8920-0fdfcf98d5e5-kube-api-access-b5lfw\") pod \"watcher-applier-0\" (UID: \"938e3d88-39e3-4f8a-8920-0fdfcf98d5e5\") " pod="openstack/watcher-applier-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.453628 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a94c6c0-6e99-4a00-bdde-fe1e7927af5b-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"8a94c6c0-6e99-4a00-bdde-fe1e7927af5b\") " pod="openstack/watcher-decision-engine-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.497845 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.508411 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.510232 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.519735 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.552482 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.556100 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5lfw\" (UniqueName: \"kubernetes.io/projected/938e3d88-39e3-4f8a-8920-0fdfcf98d5e5-kube-api-access-b5lfw\") pod \"watcher-applier-0\" (UID: \"938e3d88-39e3-4f8a-8920-0fdfcf98d5e5\") " pod="openstack/watcher-applier-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.556163 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-scripts\") pod \"ceilometer-0\" (UID: \"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a\") " pod="openstack/ceilometer-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.556189 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/938e3d88-39e3-4f8a-8920-0fdfcf98d5e5-logs\") pod \"watcher-applier-0\" (UID: \"938e3d88-39e3-4f8a-8920-0fdfcf98d5e5\") " pod="openstack/watcher-applier-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.556236 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj48h\" (UniqueName: \"kubernetes.io/projected/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-kube-api-access-kj48h\") pod \"ceilometer-0\" (UID: \"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a\") " pod="openstack/ceilometer-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.556335 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/938e3d88-39e3-4f8a-8920-0fdfcf98d5e5-config-data\") pod \"watcher-applier-0\" (UID: \"938e3d88-39e3-4f8a-8920-0fdfcf98d5e5\") " pod="openstack/watcher-applier-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.556399 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-run-httpd\") pod \"ceilometer-0\" (UID: \"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a\") " pod="openstack/ceilometer-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.556435 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-config-data\") pod \"ceilometer-0\" (UID: \"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a\") " pod="openstack/ceilometer-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.556488 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/938e3d88-39e3-4f8a-8920-0fdfcf98d5e5-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"938e3d88-39e3-4f8a-8920-0fdfcf98d5e5\") " pod="openstack/watcher-applier-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.556513 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-log-httpd\") pod \"ceilometer-0\" (UID: \"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a\") " pod="openstack/ceilometer-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.556536 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a\") " pod="openstack/ceilometer-0" Oct 09 15:35:23 crc kubenswrapper[4719]: 
I1009 15:35:23.556553 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a\") " pod="openstack/ceilometer-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.569259 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-run-httpd\") pod \"ceilometer-0\" (UID: \"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a\") " pod="openstack/ceilometer-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.573286 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/938e3d88-39e3-4f8a-8920-0fdfcf98d5e5-logs\") pod \"watcher-applier-0\" (UID: \"938e3d88-39e3-4f8a-8920-0fdfcf98d5e5\") " pod="openstack/watcher-applier-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.575589 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-scripts\") pod \"ceilometer-0\" (UID: \"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a\") " pod="openstack/ceilometer-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.609200 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-log-httpd\") pod \"ceilometer-0\" (UID: \"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a\") " pod="openstack/ceilometer-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.609489 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-config-data\") pod \"ceilometer-0\" (UID: \"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a\") " 
pod="openstack/ceilometer-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.609924 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5lfw\" (UniqueName: \"kubernetes.io/projected/938e3d88-39e3-4f8a-8920-0fdfcf98d5e5-kube-api-access-b5lfw\") pod \"watcher-applier-0\" (UID: \"938e3d88-39e3-4f8a-8920-0fdfcf98d5e5\") " pod="openstack/watcher-applier-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.611003 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/938e3d88-39e3-4f8a-8920-0fdfcf98d5e5-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"938e3d88-39e3-4f8a-8920-0fdfcf98d5e5\") " pod="openstack/watcher-applier-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.627909 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a\") " pod="openstack/ceilometer-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.639039 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a\") " pod="openstack/ceilometer-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.640138 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj48h\" (UniqueName: \"kubernetes.io/projected/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-kube-api-access-kj48h\") pod \"ceilometer-0\" (UID: \"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a\") " pod="openstack/ceilometer-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.641146 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/938e3d88-39e3-4f8a-8920-0fdfcf98d5e5-config-data\") pod \"watcher-applier-0\" (UID: \"938e3d88-39e3-4f8a-8920-0fdfcf98d5e5\") " pod="openstack/watcher-applier-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.678636 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-d2888"] Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.679941 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-d2888" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.681732 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95a97721-d5b6-401a-94ef-751878f64947-logs\") pod \"watcher-api-0\" (UID: \"95a97721-d5b6-401a-94ef-751878f64947\") " pod="openstack/watcher-api-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.681838 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/95a97721-d5b6-401a-94ef-751878f64947-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"95a97721-d5b6-401a-94ef-751878f64947\") " pod="openstack/watcher-api-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.681859 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95a97721-d5b6-401a-94ef-751878f64947-config-data\") pod \"watcher-api-0\" (UID: \"95a97721-d5b6-401a-94ef-751878f64947\") " pod="openstack/watcher-api-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.681878 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-527lf\" (UniqueName: \"kubernetes.io/projected/95a97721-d5b6-401a-94ef-751878f64947-kube-api-access-527lf\") pod \"watcher-api-0\" (UID: \"95a97721-d5b6-401a-94ef-751878f64947\") " 
pod="openstack/watcher-api-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.681926 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95a97721-d5b6-401a-94ef-751878f64947-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"95a97721-d5b6-401a-94ef-751878f64947\") " pod="openstack/watcher-api-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.687979 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.688161 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.688261 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-mgtx8" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.696555 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-d2888"] Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.709170 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-b66884797-vvjz5" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.766610 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bf7887b65-ql45f"] Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.790606 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-758645fbfc-c4jmx"] Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.791675 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19cf902a-77e9-4e57-89d0-36765e27f361-config-data\") pod \"placement-db-sync-d2888\" (UID: \"19cf902a-77e9-4e57-89d0-36765e27f361\") " pod="openstack/placement-db-sync-d2888" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.791741 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bgzt\" (UniqueName: \"kubernetes.io/projected/19cf902a-77e9-4e57-89d0-36765e27f361-kube-api-access-5bgzt\") pod \"placement-db-sync-d2888\" (UID: \"19cf902a-77e9-4e57-89d0-36765e27f361\") " pod="openstack/placement-db-sync-d2888" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.791776 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19cf902a-77e9-4e57-89d0-36765e27f361-combined-ca-bundle\") pod \"placement-db-sync-d2888\" (UID: \"19cf902a-77e9-4e57-89d0-36765e27f361\") " pod="openstack/placement-db-sync-d2888" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.791797 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19cf902a-77e9-4e57-89d0-36765e27f361-scripts\") pod \"placement-db-sync-d2888\" (UID: \"19cf902a-77e9-4e57-89d0-36765e27f361\") " pod="openstack/placement-db-sync-d2888" Oct 09 15:35:23 crc 
kubenswrapper[4719]: I1009 15:35:23.791838 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/95a97721-d5b6-401a-94ef-751878f64947-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"95a97721-d5b6-401a-94ef-751878f64947\") " pod="openstack/watcher-api-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.791860 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95a97721-d5b6-401a-94ef-751878f64947-config-data\") pod \"watcher-api-0\" (UID: \"95a97721-d5b6-401a-94ef-751878f64947\") " pod="openstack/watcher-api-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.791879 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-527lf\" (UniqueName: \"kubernetes.io/projected/95a97721-d5b6-401a-94ef-751878f64947-kube-api-access-527lf\") pod \"watcher-api-0\" (UID: \"95a97721-d5b6-401a-94ef-751878f64947\") " pod="openstack/watcher-api-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.791911 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19cf902a-77e9-4e57-89d0-36765e27f361-logs\") pod \"placement-db-sync-d2888\" (UID: \"19cf902a-77e9-4e57-89d0-36765e27f361\") " pod="openstack/placement-db-sync-d2888" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.791949 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95a97721-d5b6-401a-94ef-751878f64947-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"95a97721-d5b6-401a-94ef-751878f64947\") " pod="openstack/watcher-api-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.791999 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/95a97721-d5b6-401a-94ef-751878f64947-logs\") pod \"watcher-api-0\" (UID: \"95a97721-d5b6-401a-94ef-751878f64947\") " pod="openstack/watcher-api-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.792342 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-758645fbfc-c4jmx" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.792518 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95a97721-d5b6-401a-94ef-751878f64947-logs\") pod \"watcher-api-0\" (UID: \"95a97721-d5b6-401a-94ef-751878f64947\") " pod="openstack/watcher-api-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.795105 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.805811 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/95a97721-d5b6-401a-94ef-751878f64947-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"95a97721-d5b6-401a-94ef-751878f64947\") " pod="openstack/watcher-api-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.806579 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95a97721-d5b6-401a-94ef-751878f64947-config-data\") pod \"watcher-api-0\" (UID: \"95a97721-d5b6-401a-94ef-751878f64947\") " pod="openstack/watcher-api-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.811033 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95a97721-d5b6-401a-94ef-751878f64947-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"95a97721-d5b6-401a-94ef-751878f64947\") " pod="openstack/watcher-api-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.821298 4719 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.822772 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-758645fbfc-c4jmx"] Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.827633 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-527lf\" (UniqueName: \"kubernetes.io/projected/95a97721-d5b6-401a-94ef-751878f64947-kube-api-access-527lf\") pod \"watcher-api-0\" (UID: \"95a97721-d5b6-401a-94ef-751878f64947\") " pod="openstack/watcher-api-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.846720 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5789fb8fc7-6t67j"] Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.848457 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5789fb8fc7-6t67j" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.850058 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.861005 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5789fb8fc7-6t67j"] Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.894141 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19cf902a-77e9-4e57-89d0-36765e27f361-logs\") pod \"placement-db-sync-d2888\" (UID: \"19cf902a-77e9-4e57-89d0-36765e27f361\") " pod="openstack/placement-db-sync-d2888" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.894311 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2deaf084-98a3-477d-b5ac-8933b175ef00-scripts\") pod \"horizon-758645fbfc-c4jmx\" (UID: \"2deaf084-98a3-477d-b5ac-8933b175ef00\") " pod="openstack/horizon-758645fbfc-c4jmx" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.894337 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2deaf084-98a3-477d-b5ac-8933b175ef00-logs\") pod \"horizon-758645fbfc-c4jmx\" (UID: \"2deaf084-98a3-477d-b5ac-8933b175ef00\") " pod="openstack/horizon-758645fbfc-c4jmx" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.894423 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2deaf084-98a3-477d-b5ac-8933b175ef00-horizon-secret-key\") pod \"horizon-758645fbfc-c4jmx\" (UID: \"2deaf084-98a3-477d-b5ac-8933b175ef00\") " pod="openstack/horizon-758645fbfc-c4jmx" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.894448 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19cf902a-77e9-4e57-89d0-36765e27f361-config-data\") 
pod \"placement-db-sync-d2888\" (UID: \"19cf902a-77e9-4e57-89d0-36765e27f361\") " pod="openstack/placement-db-sync-d2888" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.894510 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bgzt\" (UniqueName: \"kubernetes.io/projected/19cf902a-77e9-4e57-89d0-36765e27f361-kube-api-access-5bgzt\") pod \"placement-db-sync-d2888\" (UID: \"19cf902a-77e9-4e57-89d0-36765e27f361\") " pod="openstack/placement-db-sync-d2888" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.894554 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19cf902a-77e9-4e57-89d0-36765e27f361-combined-ca-bundle\") pod \"placement-db-sync-d2888\" (UID: \"19cf902a-77e9-4e57-89d0-36765e27f361\") " pod="openstack/placement-db-sync-d2888" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.894586 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19cf902a-77e9-4e57-89d0-36765e27f361-scripts\") pod \"placement-db-sync-d2888\" (UID: \"19cf902a-77e9-4e57-89d0-36765e27f361\") " pod="openstack/placement-db-sync-d2888" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.894642 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2deaf084-98a3-477d-b5ac-8933b175ef00-config-data\") pod \"horizon-758645fbfc-c4jmx\" (UID: \"2deaf084-98a3-477d-b5ac-8933b175ef00\") " pod="openstack/horizon-758645fbfc-c4jmx" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.894665 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmxzx\" (UniqueName: \"kubernetes.io/projected/2deaf084-98a3-477d-b5ac-8933b175ef00-kube-api-access-rmxzx\") pod \"horizon-758645fbfc-c4jmx\" (UID: 
\"2deaf084-98a3-477d-b5ac-8933b175ef00\") " pod="openstack/horizon-758645fbfc-c4jmx" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.895092 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19cf902a-77e9-4e57-89d0-36765e27f361-logs\") pod \"placement-db-sync-d2888\" (UID: \"19cf902a-77e9-4e57-89d0-36765e27f361\") " pod="openstack/placement-db-sync-d2888" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.915986 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19cf902a-77e9-4e57-89d0-36765e27f361-scripts\") pod \"placement-db-sync-d2888\" (UID: \"19cf902a-77e9-4e57-89d0-36765e27f361\") " pod="openstack/placement-db-sync-d2888" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.918765 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19cf902a-77e9-4e57-89d0-36765e27f361-config-data\") pod \"placement-db-sync-d2888\" (UID: \"19cf902a-77e9-4e57-89d0-36765e27f361\") " pod="openstack/placement-db-sync-d2888" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.923409 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19cf902a-77e9-4e57-89d0-36765e27f361-combined-ca-bundle\") pod \"placement-db-sync-d2888\" (UID: \"19cf902a-77e9-4e57-89d0-36765e27f361\") " pod="openstack/placement-db-sync-d2888" Oct 09 15:35:23 crc kubenswrapper[4719]: I1009 15:35:23.932468 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bgzt\" (UniqueName: \"kubernetes.io/projected/19cf902a-77e9-4e57-89d0-36765e27f361-kube-api-access-5bgzt\") pod \"placement-db-sync-d2888\" (UID: \"19cf902a-77e9-4e57-89d0-36765e27f361\") " pod="openstack/placement-db-sync-d2888" Oct 09 15:35:24 crc kubenswrapper[4719]: I1009 15:35:23.996303 4719 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4v48\" (UniqueName: \"kubernetes.io/projected/0ea18a20-4f8e-460c-9625-2aa8333cf8f4-kube-api-access-s4v48\") pod \"dnsmasq-dns-5789fb8fc7-6t67j\" (UID: \"0ea18a20-4f8e-460c-9625-2aa8333cf8f4\") " pod="openstack/dnsmasq-dns-5789fb8fc7-6t67j" Oct 09 15:35:24 crc kubenswrapper[4719]: I1009 15:35:23.996433 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ea18a20-4f8e-460c-9625-2aa8333cf8f4-ovsdbserver-nb\") pod \"dnsmasq-dns-5789fb8fc7-6t67j\" (UID: \"0ea18a20-4f8e-460c-9625-2aa8333cf8f4\") " pod="openstack/dnsmasq-dns-5789fb8fc7-6t67j" Oct 09 15:35:24 crc kubenswrapper[4719]: I1009 15:35:23.996474 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ea18a20-4f8e-460c-9625-2aa8333cf8f4-ovsdbserver-sb\") pod \"dnsmasq-dns-5789fb8fc7-6t67j\" (UID: \"0ea18a20-4f8e-460c-9625-2aa8333cf8f4\") " pod="openstack/dnsmasq-dns-5789fb8fc7-6t67j" Oct 09 15:35:24 crc kubenswrapper[4719]: I1009 15:35:23.996504 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0ea18a20-4f8e-460c-9625-2aa8333cf8f4-dns-swift-storage-0\") pod \"dnsmasq-dns-5789fb8fc7-6t67j\" (UID: \"0ea18a20-4f8e-460c-9625-2aa8333cf8f4\") " pod="openstack/dnsmasq-dns-5789fb8fc7-6t67j" Oct 09 15:35:24 crc kubenswrapper[4719]: I1009 15:35:23.996530 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2deaf084-98a3-477d-b5ac-8933b175ef00-config-data\") pod \"horizon-758645fbfc-c4jmx\" (UID: \"2deaf084-98a3-477d-b5ac-8933b175ef00\") " pod="openstack/horizon-758645fbfc-c4jmx" Oct 09 15:35:24 crc 
kubenswrapper[4719]: I1009 15:35:23.996546 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmxzx\" (UniqueName: \"kubernetes.io/projected/2deaf084-98a3-477d-b5ac-8933b175ef00-kube-api-access-rmxzx\") pod \"horizon-758645fbfc-c4jmx\" (UID: \"2deaf084-98a3-477d-b5ac-8933b175ef00\") " pod="openstack/horizon-758645fbfc-c4jmx" Oct 09 15:35:24 crc kubenswrapper[4719]: I1009 15:35:23.996643 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ea18a20-4f8e-460c-9625-2aa8333cf8f4-dns-svc\") pod \"dnsmasq-dns-5789fb8fc7-6t67j\" (UID: \"0ea18a20-4f8e-460c-9625-2aa8333cf8f4\") " pod="openstack/dnsmasq-dns-5789fb8fc7-6t67j" Oct 09 15:35:24 crc kubenswrapper[4719]: I1009 15:35:23.996674 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ea18a20-4f8e-460c-9625-2aa8333cf8f4-config\") pod \"dnsmasq-dns-5789fb8fc7-6t67j\" (UID: \"0ea18a20-4f8e-460c-9625-2aa8333cf8f4\") " pod="openstack/dnsmasq-dns-5789fb8fc7-6t67j" Oct 09 15:35:24 crc kubenswrapper[4719]: I1009 15:35:23.996690 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2deaf084-98a3-477d-b5ac-8933b175ef00-scripts\") pod \"horizon-758645fbfc-c4jmx\" (UID: \"2deaf084-98a3-477d-b5ac-8933b175ef00\") " pod="openstack/horizon-758645fbfc-c4jmx" Oct 09 15:35:24 crc kubenswrapper[4719]: I1009 15:35:23.996707 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2deaf084-98a3-477d-b5ac-8933b175ef00-logs\") pod \"horizon-758645fbfc-c4jmx\" (UID: \"2deaf084-98a3-477d-b5ac-8933b175ef00\") " pod="openstack/horizon-758645fbfc-c4jmx" Oct 09 15:35:24 crc kubenswrapper[4719]: I1009 15:35:23.996739 4719 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2deaf084-98a3-477d-b5ac-8933b175ef00-horizon-secret-key\") pod \"horizon-758645fbfc-c4jmx\" (UID: \"2deaf084-98a3-477d-b5ac-8933b175ef00\") " pod="openstack/horizon-758645fbfc-c4jmx" Oct 09 15:35:24 crc kubenswrapper[4719]: I1009 15:35:23.999980 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2deaf084-98a3-477d-b5ac-8933b175ef00-config-data\") pod \"horizon-758645fbfc-c4jmx\" (UID: \"2deaf084-98a3-477d-b5ac-8933b175ef00\") " pod="openstack/horizon-758645fbfc-c4jmx" Oct 09 15:35:24 crc kubenswrapper[4719]: I1009 15:35:24.000640 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2deaf084-98a3-477d-b5ac-8933b175ef00-logs\") pod \"horizon-758645fbfc-c4jmx\" (UID: \"2deaf084-98a3-477d-b5ac-8933b175ef00\") " pod="openstack/horizon-758645fbfc-c4jmx" Oct 09 15:35:24 crc kubenswrapper[4719]: I1009 15:35:24.001546 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2deaf084-98a3-477d-b5ac-8933b175ef00-scripts\") pod \"horizon-758645fbfc-c4jmx\" (UID: \"2deaf084-98a3-477d-b5ac-8933b175ef00\") " pod="openstack/horizon-758645fbfc-c4jmx" Oct 09 15:35:24 crc kubenswrapper[4719]: I1009 15:35:24.007792 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2deaf084-98a3-477d-b5ac-8933b175ef00-horizon-secret-key\") pod \"horizon-758645fbfc-c4jmx\" (UID: \"2deaf084-98a3-477d-b5ac-8933b175ef00\") " pod="openstack/horizon-758645fbfc-c4jmx" Oct 09 15:35:24 crc kubenswrapper[4719]: I1009 15:35:24.021511 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmxzx\" (UniqueName: 
\"kubernetes.io/projected/2deaf084-98a3-477d-b5ac-8933b175ef00-kube-api-access-rmxzx\") pod \"horizon-758645fbfc-c4jmx\" (UID: \"2deaf084-98a3-477d-b5ac-8933b175ef00\") " pod="openstack/horizon-758645fbfc-c4jmx" Oct 09 15:35:24 crc kubenswrapper[4719]: I1009 15:35:24.100421 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4v48\" (UniqueName: \"kubernetes.io/projected/0ea18a20-4f8e-460c-9625-2aa8333cf8f4-kube-api-access-s4v48\") pod \"dnsmasq-dns-5789fb8fc7-6t67j\" (UID: \"0ea18a20-4f8e-460c-9625-2aa8333cf8f4\") " pod="openstack/dnsmasq-dns-5789fb8fc7-6t67j" Oct 09 15:35:24 crc kubenswrapper[4719]: I1009 15:35:24.100486 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ea18a20-4f8e-460c-9625-2aa8333cf8f4-ovsdbserver-nb\") pod \"dnsmasq-dns-5789fb8fc7-6t67j\" (UID: \"0ea18a20-4f8e-460c-9625-2aa8333cf8f4\") " pod="openstack/dnsmasq-dns-5789fb8fc7-6t67j" Oct 09 15:35:24 crc kubenswrapper[4719]: I1009 15:35:24.100511 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ea18a20-4f8e-460c-9625-2aa8333cf8f4-ovsdbserver-sb\") pod \"dnsmasq-dns-5789fb8fc7-6t67j\" (UID: \"0ea18a20-4f8e-460c-9625-2aa8333cf8f4\") " pod="openstack/dnsmasq-dns-5789fb8fc7-6t67j" Oct 09 15:35:24 crc kubenswrapper[4719]: I1009 15:35:24.100537 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0ea18a20-4f8e-460c-9625-2aa8333cf8f4-dns-swift-storage-0\") pod \"dnsmasq-dns-5789fb8fc7-6t67j\" (UID: \"0ea18a20-4f8e-460c-9625-2aa8333cf8f4\") " pod="openstack/dnsmasq-dns-5789fb8fc7-6t67j" Oct 09 15:35:24 crc kubenswrapper[4719]: I1009 15:35:24.100606 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/0ea18a20-4f8e-460c-9625-2aa8333cf8f4-dns-svc\") pod \"dnsmasq-dns-5789fb8fc7-6t67j\" (UID: \"0ea18a20-4f8e-460c-9625-2aa8333cf8f4\") " pod="openstack/dnsmasq-dns-5789fb8fc7-6t67j" Oct 09 15:35:24 crc kubenswrapper[4719]: I1009 15:35:24.100650 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ea18a20-4f8e-460c-9625-2aa8333cf8f4-config\") pod \"dnsmasq-dns-5789fb8fc7-6t67j\" (UID: \"0ea18a20-4f8e-460c-9625-2aa8333cf8f4\") " pod="openstack/dnsmasq-dns-5789fb8fc7-6t67j" Oct 09 15:35:24 crc kubenswrapper[4719]: I1009 15:35:24.103443 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ea18a20-4f8e-460c-9625-2aa8333cf8f4-ovsdbserver-nb\") pod \"dnsmasq-dns-5789fb8fc7-6t67j\" (UID: \"0ea18a20-4f8e-460c-9625-2aa8333cf8f4\") " pod="openstack/dnsmasq-dns-5789fb8fc7-6t67j" Oct 09 15:35:24 crc kubenswrapper[4719]: I1009 15:35:24.104097 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ea18a20-4f8e-460c-9625-2aa8333cf8f4-config\") pod \"dnsmasq-dns-5789fb8fc7-6t67j\" (UID: \"0ea18a20-4f8e-460c-9625-2aa8333cf8f4\") " pod="openstack/dnsmasq-dns-5789fb8fc7-6t67j" Oct 09 15:35:24 crc kubenswrapper[4719]: I1009 15:35:24.104333 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0ea18a20-4f8e-460c-9625-2aa8333cf8f4-dns-swift-storage-0\") pod \"dnsmasq-dns-5789fb8fc7-6t67j\" (UID: \"0ea18a20-4f8e-460c-9625-2aa8333cf8f4\") " pod="openstack/dnsmasq-dns-5789fb8fc7-6t67j" Oct 09 15:35:24 crc kubenswrapper[4719]: I1009 15:35:24.104474 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ea18a20-4f8e-460c-9625-2aa8333cf8f4-dns-svc\") pod \"dnsmasq-dns-5789fb8fc7-6t67j\" (UID: 
\"0ea18a20-4f8e-460c-9625-2aa8333cf8f4\") " pod="openstack/dnsmasq-dns-5789fb8fc7-6t67j" Oct 09 15:35:24 crc kubenswrapper[4719]: I1009 15:35:24.105080 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ea18a20-4f8e-460c-9625-2aa8333cf8f4-ovsdbserver-sb\") pod \"dnsmasq-dns-5789fb8fc7-6t67j\" (UID: \"0ea18a20-4f8e-460c-9625-2aa8333cf8f4\") " pod="openstack/dnsmasq-dns-5789fb8fc7-6t67j" Oct 09 15:35:24 crc kubenswrapper[4719]: I1009 15:35:24.135607 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4v48\" (UniqueName: \"kubernetes.io/projected/0ea18a20-4f8e-460c-9625-2aa8333cf8f4-kube-api-access-s4v48\") pod \"dnsmasq-dns-5789fb8fc7-6t67j\" (UID: \"0ea18a20-4f8e-460c-9625-2aa8333cf8f4\") " pod="openstack/dnsmasq-dns-5789fb8fc7-6t67j" Oct 09 15:35:24 crc kubenswrapper[4719]: I1009 15:35:24.164748 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-d2888" Oct 09 15:35:24 crc kubenswrapper[4719]: I1009 15:35:24.178630 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-758645fbfc-c4jmx" Oct 09 15:35:24 crc kubenswrapper[4719]: I1009 15:35:24.181847 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-wn6ns"] Oct 09 15:35:24 crc kubenswrapper[4719]: I1009 15:35:24.195526 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5789fb8fc7-6t67j" Oct 09 15:35:24 crc kubenswrapper[4719]: I1009 15:35:24.307600 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 09 15:35:24 crc kubenswrapper[4719]: I1009 15:35:24.353849 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bf7887b65-ql45f"] Oct 09 15:35:24 crc kubenswrapper[4719]: I1009 15:35:24.720113 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-60a1-account-create-8lvrq" Oct 09 15:35:24 crc kubenswrapper[4719]: I1009 15:35:24.830112 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zjrl\" (UniqueName: \"kubernetes.io/projected/fca1d680-ba45-4687-bce1-4dc7d3e029f4-kube-api-access-9zjrl\") pod \"fca1d680-ba45-4687-bce1-4dc7d3e029f4\" (UID: \"fca1d680-ba45-4687-bce1-4dc7d3e029f4\") " Oct 09 15:35:24 crc kubenswrapper[4719]: I1009 15:35:24.855037 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fca1d680-ba45-4687-bce1-4dc7d3e029f4-kube-api-access-9zjrl" (OuterVolumeSpecName: "kube-api-access-9zjrl") pod "fca1d680-ba45-4687-bce1-4dc7d3e029f4" (UID: "fca1d680-ba45-4687-bce1-4dc7d3e029f4"). InnerVolumeSpecName "kube-api-access-9zjrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:35:24 crc kubenswrapper[4719]: I1009 15:35:24.933576 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zjrl\" (UniqueName: \"kubernetes.io/projected/fca1d680-ba45-4687-bce1-4dc7d3e029f4-kube-api-access-9zjrl\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.002017 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.005944 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-2a57-account-create-9ctdx" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.010832 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-d4a7-account-create-fnx7r" Oct 09 15:35:25 crc kubenswrapper[4719]: W1009 15:35:25.030006 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod590f0bbf_4518_4aa6_a71f_1f28b5f4e02a.slice/crio-d8f33df0dcfcb12578c1cfa83bf64cc6d0458f822f348ed5a5aac37ca7cc8ee4 WatchSource:0}: Error finding container d8f33df0dcfcb12578c1cfa83bf64cc6d0458f822f348ed5a5aac37ca7cc8ee4: Status 404 returned error can't find the container with id d8f33df0dcfcb12578c1cfa83bf64cc6d0458f822f348ed5a5aac37ca7cc8ee4 Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.039542 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.059914 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.068887 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-b66884797-vvjz5"] Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.088905 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-d4a7-account-create-fnx7r" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.089315 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d4a7-account-create-fnx7r" event={"ID":"c57b6800-99e7-4fc0-a09e-d963495a39c8","Type":"ContainerDied","Data":"56e61994146bda3684eb8c458313f06a7cd24fcc5765bcb22ec604aa623bdce1"} Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.089341 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56e61994146bda3684eb8c458313f06a7cd24fcc5765bcb22ec604aa623bdce1" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.105775 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a","Type":"ContainerStarted","Data":"d8f33df0dcfcb12578c1cfa83bf64cc6d0458f822f348ed5a5aac37ca7cc8ee4"} Oct 09 15:35:25 crc kubenswrapper[4719]: W1009 15:35:25.110153 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19fb157d_30e1_4432_9d62_0fa70d464148.slice/crio-d1de847ef5d248cfd346e67f64d43dc6b837f858f0e102c551c824b00cec39c7 WatchSource:0}: Error finding container d1de847ef5d248cfd346e67f64d43dc6b837f858f0e102c551c824b00cec39c7: Status 404 returned error can't find the container with id d1de847ef5d248cfd346e67f64d43dc6b837f858f0e102c551c824b00cec39c7 Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.112844 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"938e3d88-39e3-4f8a-8920-0fdfcf98d5e5","Type":"ContainerStarted","Data":"daf7889a300bb009ed82468173dceb3aac9118deef772e6d80c16abbf31384c4"} Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.131649 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2a57-account-create-9ctdx" 
event={"ID":"ccfcbb78-ac73-4cea-a8c3-f676443f187b","Type":"ContainerDied","Data":"bd64ff2096a2bc94c030b7c6b41c4a1380d0b36e22e593db72a9a5599540d3e6"} Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.131688 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd64ff2096a2bc94c030b7c6b41c4a1380d0b36e22e593db72a9a5599540d3e6" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.131742 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2a57-account-create-9ctdx" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.133875 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wn6ns" event={"ID":"f4998573-c12b-4a90-a58b-3af8be611b96","Type":"ContainerStarted","Data":"e5f20b20c42e0d1513484d9a4f5a033569d029d7a21fb90191416fde7f5713c0"} Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.133917 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wn6ns" event={"ID":"f4998573-c12b-4a90-a58b-3af8be611b96","Type":"ContainerStarted","Data":"f15279ddb3769b1f33a00bfcb0b9ed65e2846f47d0482f0a81442e56ed03d9af"} Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.136331 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rrx7\" (UniqueName: \"kubernetes.io/projected/c57b6800-99e7-4fc0-a09e-d963495a39c8-kube-api-access-8rrx7\") pod \"c57b6800-99e7-4fc0-a09e-d963495a39c8\" (UID: \"c57b6800-99e7-4fc0-a09e-d963495a39c8\") " Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.136547 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpf2n\" (UniqueName: \"kubernetes.io/projected/ccfcbb78-ac73-4cea-a8c3-f676443f187b-kube-api-access-lpf2n\") pod \"ccfcbb78-ac73-4cea-a8c3-f676443f187b\" (UID: \"ccfcbb78-ac73-4cea-a8c3-f676443f187b\") " Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.140891 4719 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccfcbb78-ac73-4cea-a8c3-f676443f187b-kube-api-access-lpf2n" (OuterVolumeSpecName: "kube-api-access-lpf2n") pod "ccfcbb78-ac73-4cea-a8c3-f676443f187b" (UID: "ccfcbb78-ac73-4cea-a8c3-f676443f187b"). InnerVolumeSpecName "kube-api-access-lpf2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.142481 4719 generic.go:334] "Generic (PLEG): container finished" podID="196c6929-25a8-4376-a13d-b3bedc31a611" containerID="678a22c228d50b319141f31db32f6e4b75a8bbdd308cc3329f1297c4e034891b" exitCode=0 Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.142773 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bf7887b65-ql45f" event={"ID":"196c6929-25a8-4376-a13d-b3bedc31a611","Type":"ContainerDied","Data":"678a22c228d50b319141f31db32f6e4b75a8bbdd308cc3329f1297c4e034891b"} Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.142805 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bf7887b65-ql45f" event={"ID":"196c6929-25a8-4376-a13d-b3bedc31a611","Type":"ContainerStarted","Data":"a7f1b683c2e90787923f40f0edfdeb90b4d5a2b35c13305ae624d47694175bb9"} Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.143213 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c57b6800-99e7-4fc0-a09e-d963495a39c8-kube-api-access-8rrx7" (OuterVolumeSpecName: "kube-api-access-8rrx7") pod "c57b6800-99e7-4fc0-a09e-d963495a39c8" (UID: "c57b6800-99e7-4fc0-a09e-d963495a39c8"). InnerVolumeSpecName "kube-api-access-8rrx7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.145934 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"8a94c6c0-6e99-4a00-bdde-fe1e7927af5b","Type":"ContainerStarted","Data":"a725c1af11bf08b8d2cab41271fa2cca695c3bd9293f9cf4dc989a771f25c4b2"} Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.169555 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-wn6ns" podStartSLOduration=3.169533335 podStartE2EDuration="3.169533335s" podCreationTimestamp="2025-10-09 15:35:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:35:25.155660634 +0000 UTC m=+1030.665371939" watchObservedRunningTime="2025-10-09 15:35:25.169533335 +0000 UTC m=+1030.679244620" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.177710 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-60a1-account-create-8lvrq" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.212037 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-60a1-account-create-8lvrq" event={"ID":"fca1d680-ba45-4687-bce1-4dc7d3e029f4","Type":"ContainerDied","Data":"3fb8471ea69e4706145636f822454e2ca864f3afc17e44261715b801de8f08a4"} Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.212073 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fb8471ea69e4706145636f822454e2ca864f3afc17e44261715b801de8f08a4" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.216031 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5789fb8fc7-6t67j"] Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.240630 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rrx7\" (UniqueName: \"kubernetes.io/projected/c57b6800-99e7-4fc0-a09e-d963495a39c8-kube-api-access-8rrx7\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.240944 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpf2n\" (UniqueName: \"kubernetes.io/projected/ccfcbb78-ac73-4cea-a8c3-f676443f187b-kube-api-access-lpf2n\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.250279 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-d2888"] Oct 09 15:35:25 crc kubenswrapper[4719]: W1009 15:35:25.357581 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2deaf084_98a3_477d_b5ac_8933b175ef00.slice/crio-711ad45450bafdfdd5e761c86fd0264a12b213d9f76b2b6e42708a225073dcae WatchSource:0}: Error finding container 711ad45450bafdfdd5e761c86fd0264a12b213d9f76b2b6e42708a225073dcae: Status 404 returned error can't find the container with id 
711ad45450bafdfdd5e761c86fd0264a12b213d9f76b2b6e42708a225073dcae Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.393176 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-758645fbfc-c4jmx"] Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.630706 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.665415 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-758645fbfc-c4jmx"] Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.701780 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bf7887b65-ql45f" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.726680 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-574c54d6bf-d7655"] Oct 09 15:35:25 crc kubenswrapper[4719]: E1009 15:35:25.728432 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccfcbb78-ac73-4cea-a8c3-f676443f187b" containerName="mariadb-account-create" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.728460 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccfcbb78-ac73-4cea-a8c3-f676443f187b" containerName="mariadb-account-create" Oct 09 15:35:25 crc kubenswrapper[4719]: E1009 15:35:25.728485 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="196c6929-25a8-4376-a13d-b3bedc31a611" containerName="init" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.728494 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="196c6929-25a8-4376-a13d-b3bedc31a611" containerName="init" Oct 09 15:35:25 crc kubenswrapper[4719]: E1009 15:35:25.728509 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c57b6800-99e7-4fc0-a09e-d963495a39c8" containerName="mariadb-account-create" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.728517 4719 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c57b6800-99e7-4fc0-a09e-d963495a39c8" containerName="mariadb-account-create" Oct 09 15:35:25 crc kubenswrapper[4719]: E1009 15:35:25.728534 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fca1d680-ba45-4687-bce1-4dc7d3e029f4" containerName="mariadb-account-create" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.728542 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="fca1d680-ba45-4687-bce1-4dc7d3e029f4" containerName="mariadb-account-create" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.728756 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="196c6929-25a8-4376-a13d-b3bedc31a611" containerName="init" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.728779 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccfcbb78-ac73-4cea-a8c3-f676443f187b" containerName="mariadb-account-create" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.728790 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="c57b6800-99e7-4fc0-a09e-d963495a39c8" containerName="mariadb-account-create" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.728810 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="fca1d680-ba45-4687-bce1-4dc7d3e029f4" containerName="mariadb-account-create" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.738779 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-574c54d6bf-d7655" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.763758 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8258\" (UniqueName: \"kubernetes.io/projected/196c6929-25a8-4376-a13d-b3bedc31a611-kube-api-access-s8258\") pod \"196c6929-25a8-4376-a13d-b3bedc31a611\" (UID: \"196c6929-25a8-4376-a13d-b3bedc31a611\") " Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.764098 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/196c6929-25a8-4376-a13d-b3bedc31a611-dns-swift-storage-0\") pod \"196c6929-25a8-4376-a13d-b3bedc31a611\" (UID: \"196c6929-25a8-4376-a13d-b3bedc31a611\") " Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.768824 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/196c6929-25a8-4376-a13d-b3bedc31a611-config\") pod \"196c6929-25a8-4376-a13d-b3bedc31a611\" (UID: \"196c6929-25a8-4376-a13d-b3bedc31a611\") " Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.768936 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/196c6929-25a8-4376-a13d-b3bedc31a611-dns-svc\") pod \"196c6929-25a8-4376-a13d-b3bedc31a611\" (UID: \"196c6929-25a8-4376-a13d-b3bedc31a611\") " Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.768973 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/196c6929-25a8-4376-a13d-b3bedc31a611-ovsdbserver-nb\") pod \"196c6929-25a8-4376-a13d-b3bedc31a611\" (UID: \"196c6929-25a8-4376-a13d-b3bedc31a611\") " Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.768999 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/196c6929-25a8-4376-a13d-b3bedc31a611-ovsdbserver-sb\") pod \"196c6929-25a8-4376-a13d-b3bedc31a611\" (UID: \"196c6929-25a8-4376-a13d-b3bedc31a611\") " Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.771630 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/196c6929-25a8-4376-a13d-b3bedc31a611-kube-api-access-s8258" (OuterVolumeSpecName: "kube-api-access-s8258") pod "196c6929-25a8-4376-a13d-b3bedc31a611" (UID: "196c6929-25a8-4376-a13d-b3bedc31a611"). InnerVolumeSpecName "kube-api-access-s8258". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.771691 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-574c54d6bf-d7655"] Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.774018 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8258\" (UniqueName: \"kubernetes.io/projected/196c6929-25a8-4376-a13d-b3bedc31a611-kube-api-access-s8258\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.830938 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.858405 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/196c6929-25a8-4376-a13d-b3bedc31a611-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "196c6929-25a8-4376-a13d-b3bedc31a611" (UID: "196c6929-25a8-4376-a13d-b3bedc31a611"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.864838 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/196c6929-25a8-4376-a13d-b3bedc31a611-config" (OuterVolumeSpecName: "config") pod "196c6929-25a8-4376-a13d-b3bedc31a611" (UID: "196c6929-25a8-4376-a13d-b3bedc31a611"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.877066 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/196c6929-25a8-4376-a13d-b3bedc31a611-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "196c6929-25a8-4376-a13d-b3bedc31a611" (UID: "196c6929-25a8-4376-a13d-b3bedc31a611"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.888883 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jvpw\" (UniqueName: \"kubernetes.io/projected/dbae14e7-c975-42ac-bad6-b5ad764a239b-kube-api-access-7jvpw\") pod \"horizon-574c54d6bf-d7655\" (UID: \"dbae14e7-c975-42ac-bad6-b5ad764a239b\") " pod="openstack/horizon-574c54d6bf-d7655" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.888974 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbae14e7-c975-42ac-bad6-b5ad764a239b-logs\") pod \"horizon-574c54d6bf-d7655\" (UID: \"dbae14e7-c975-42ac-bad6-b5ad764a239b\") " pod="openstack/horizon-574c54d6bf-d7655" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.889000 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dbae14e7-c975-42ac-bad6-b5ad764a239b-config-data\") pod \"horizon-574c54d6bf-d7655\" (UID: 
\"dbae14e7-c975-42ac-bad6-b5ad764a239b\") " pod="openstack/horizon-574c54d6bf-d7655" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.889229 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dbae14e7-c975-42ac-bad6-b5ad764a239b-horizon-secret-key\") pod \"horizon-574c54d6bf-d7655\" (UID: \"dbae14e7-c975-42ac-bad6-b5ad764a239b\") " pod="openstack/horizon-574c54d6bf-d7655" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.889446 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbae14e7-c975-42ac-bad6-b5ad764a239b-scripts\") pod \"horizon-574c54d6bf-d7655\" (UID: \"dbae14e7-c975-42ac-bad6-b5ad764a239b\") " pod="openstack/horizon-574c54d6bf-d7655" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.889590 4719 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/196c6929-25a8-4376-a13d-b3bedc31a611-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.889607 4719 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/196c6929-25a8-4376-a13d-b3bedc31a611-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.889615 4719 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/196c6929-25a8-4376-a13d-b3bedc31a611-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.896779 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/196c6929-25a8-4376-a13d-b3bedc31a611-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "196c6929-25a8-4376-a13d-b3bedc31a611" (UID: 
"196c6929-25a8-4376-a13d-b3bedc31a611"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.899244 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/196c6929-25a8-4376-a13d-b3bedc31a611-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "196c6929-25a8-4376-a13d-b3bedc31a611" (UID: "196c6929-25a8-4376-a13d-b3bedc31a611"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.991137 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbae14e7-c975-42ac-bad6-b5ad764a239b-logs\") pod \"horizon-574c54d6bf-d7655\" (UID: \"dbae14e7-c975-42ac-bad6-b5ad764a239b\") " pod="openstack/horizon-574c54d6bf-d7655" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.991197 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dbae14e7-c975-42ac-bad6-b5ad764a239b-config-data\") pod \"horizon-574c54d6bf-d7655\" (UID: \"dbae14e7-c975-42ac-bad6-b5ad764a239b\") " pod="openstack/horizon-574c54d6bf-d7655" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.991271 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dbae14e7-c975-42ac-bad6-b5ad764a239b-horizon-secret-key\") pod \"horizon-574c54d6bf-d7655\" (UID: \"dbae14e7-c975-42ac-bad6-b5ad764a239b\") " pod="openstack/horizon-574c54d6bf-d7655" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.991367 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbae14e7-c975-42ac-bad6-b5ad764a239b-scripts\") pod \"horizon-574c54d6bf-d7655\" (UID: 
\"dbae14e7-c975-42ac-bad6-b5ad764a239b\") " pod="openstack/horizon-574c54d6bf-d7655" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.991434 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jvpw\" (UniqueName: \"kubernetes.io/projected/dbae14e7-c975-42ac-bad6-b5ad764a239b-kube-api-access-7jvpw\") pod \"horizon-574c54d6bf-d7655\" (UID: \"dbae14e7-c975-42ac-bad6-b5ad764a239b\") " pod="openstack/horizon-574c54d6bf-d7655" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.991490 4719 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/196c6929-25a8-4376-a13d-b3bedc31a611-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.991505 4719 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/196c6929-25a8-4376-a13d-b3bedc31a611-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.992202 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbae14e7-c975-42ac-bad6-b5ad764a239b-logs\") pod \"horizon-574c54d6bf-d7655\" (UID: \"dbae14e7-c975-42ac-bad6-b5ad764a239b\") " pod="openstack/horizon-574c54d6bf-d7655" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.995624 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dbae14e7-c975-42ac-bad6-b5ad764a239b-config-data\") pod \"horizon-574c54d6bf-d7655\" (UID: \"dbae14e7-c975-42ac-bad6-b5ad764a239b\") " pod="openstack/horizon-574c54d6bf-d7655" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.996586 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbae14e7-c975-42ac-bad6-b5ad764a239b-scripts\") pod 
\"horizon-574c54d6bf-d7655\" (UID: \"dbae14e7-c975-42ac-bad6-b5ad764a239b\") " pod="openstack/horizon-574c54d6bf-d7655" Oct 09 15:35:25 crc kubenswrapper[4719]: I1009 15:35:25.998952 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dbae14e7-c975-42ac-bad6-b5ad764a239b-horizon-secret-key\") pod \"horizon-574c54d6bf-d7655\" (UID: \"dbae14e7-c975-42ac-bad6-b5ad764a239b\") " pod="openstack/horizon-574c54d6bf-d7655" Oct 09 15:35:26 crc kubenswrapper[4719]: I1009 15:35:26.012628 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jvpw\" (UniqueName: \"kubernetes.io/projected/dbae14e7-c975-42ac-bad6-b5ad764a239b-kube-api-access-7jvpw\") pod \"horizon-574c54d6bf-d7655\" (UID: \"dbae14e7-c975-42ac-bad6-b5ad764a239b\") " pod="openstack/horizon-574c54d6bf-d7655" Oct 09 15:35:26 crc kubenswrapper[4719]: I1009 15:35:26.028382 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-574c54d6bf-d7655" Oct 09 15:35:26 crc kubenswrapper[4719]: I1009 15:35:26.189749 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"95a97721-d5b6-401a-94ef-751878f64947","Type":"ContainerStarted","Data":"cbaefe5b72a7bc430834f4a54e4b4e8f324a4b84013b4ed6142dd1c0d5bfa12f"} Oct 09 15:35:26 crc kubenswrapper[4719]: I1009 15:35:26.190053 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"95a97721-d5b6-401a-94ef-751878f64947","Type":"ContainerStarted","Data":"e5273bd90474fb888f879987fd880d9eaffd14d206581635c205fa2b8a47f6c4"} Oct 09 15:35:26 crc kubenswrapper[4719]: I1009 15:35:26.191039 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-758645fbfc-c4jmx" event={"ID":"2deaf084-98a3-477d-b5ac-8933b175ef00","Type":"ContainerStarted","Data":"711ad45450bafdfdd5e761c86fd0264a12b213d9f76b2b6e42708a225073dcae"} Oct 09 15:35:26 crc 
kubenswrapper[4719]: I1009 15:35:26.193189 4719 generic.go:334] "Generic (PLEG): container finished" podID="0ea18a20-4f8e-460c-9625-2aa8333cf8f4" containerID="a5c8dac2514cab0b485142178177d06313e89059bbdb3f3c6f597c3b82c27c38" exitCode=0 Oct 09 15:35:26 crc kubenswrapper[4719]: I1009 15:35:26.193276 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5789fb8fc7-6t67j" event={"ID":"0ea18a20-4f8e-460c-9625-2aa8333cf8f4","Type":"ContainerDied","Data":"a5c8dac2514cab0b485142178177d06313e89059bbdb3f3c6f597c3b82c27c38"} Oct 09 15:35:26 crc kubenswrapper[4719]: I1009 15:35:26.193304 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5789fb8fc7-6t67j" event={"ID":"0ea18a20-4f8e-460c-9625-2aa8333cf8f4","Type":"ContainerStarted","Data":"33699d94693f6b3ff18f373d39bdeb8f9dd064635c60dff0f5a66d7540c8a03f"} Oct 09 15:35:26 crc kubenswrapper[4719]: I1009 15:35:26.197255 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bf7887b65-ql45f" event={"ID":"196c6929-25a8-4376-a13d-b3bedc31a611","Type":"ContainerDied","Data":"a7f1b683c2e90787923f40f0edfdeb90b4d5a2b35c13305ae624d47694175bb9"} Oct 09 15:35:26 crc kubenswrapper[4719]: I1009 15:35:26.197302 4719 scope.go:117] "RemoveContainer" containerID="678a22c228d50b319141f31db32f6e4b75a8bbdd308cc3329f1297c4e034891b" Oct 09 15:35:26 crc kubenswrapper[4719]: I1009 15:35:26.197440 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bf7887b65-ql45f" Oct 09 15:35:26 crc kubenswrapper[4719]: I1009 15:35:26.199971 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-d2888" event={"ID":"19cf902a-77e9-4e57-89d0-36765e27f361","Type":"ContainerStarted","Data":"f16c25aed36432b57fdabcd9aa11f093d6f55ebfe8d756cc9b0c8ddd5d3e6659"} Oct 09 15:35:26 crc kubenswrapper[4719]: I1009 15:35:26.205390 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b66884797-vvjz5" event={"ID":"19fb157d-30e1-4432-9d62-0fa70d464148","Type":"ContainerStarted","Data":"d1de847ef5d248cfd346e67f64d43dc6b837f858f0e102c551c824b00cec39c7"} Oct 09 15:35:26 crc kubenswrapper[4719]: I1009 15:35:26.284389 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bf7887b65-ql45f"] Oct 09 15:35:26 crc kubenswrapper[4719]: I1009 15:35:26.307968 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bf7887b65-ql45f"] Oct 09 15:35:26 crc kubenswrapper[4719]: I1009 15:35:26.694187 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-574c54d6bf-d7655"] Oct 09 15:35:27 crc kubenswrapper[4719]: I1009 15:35:27.174216 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="196c6929-25a8-4376-a13d-b3bedc31a611" path="/var/lib/kubelet/pods/196c6929-25a8-4376-a13d-b3bedc31a611/volumes" Oct 09 15:35:27 crc kubenswrapper[4719]: I1009 15:35:27.227258 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"95a97721-d5b6-401a-94ef-751878f64947","Type":"ContainerStarted","Data":"d09445c545bb0098a87a6e6741fec46a9ba9766227457d3ead82257c0463520d"} Oct 09 15:35:27 crc kubenswrapper[4719]: I1009 15:35:27.227480 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="95a97721-d5b6-401a-94ef-751878f64947" containerName="watcher-api-log" 
containerID="cri-o://cbaefe5b72a7bc430834f4a54e4b4e8f324a4b84013b4ed6142dd1c0d5bfa12f" gracePeriod=30 Oct 09 15:35:27 crc kubenswrapper[4719]: I1009 15:35:27.229598 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="95a97721-d5b6-401a-94ef-751878f64947" containerName="watcher-api" containerID="cri-o://d09445c545bb0098a87a6e6741fec46a9ba9766227457d3ead82257c0463520d" gracePeriod=30 Oct 09 15:35:27 crc kubenswrapper[4719]: I1009 15:35:27.229731 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 09 15:35:27 crc kubenswrapper[4719]: I1009 15:35:27.246070 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="95a97721-d5b6-401a-94ef-751878f64947" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.153:9322/\": EOF" Oct 09 15:35:27 crc kubenswrapper[4719]: I1009 15:35:27.254189 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=4.254168294 podStartE2EDuration="4.254168294s" podCreationTimestamp="2025-10-09 15:35:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:35:27.24680703 +0000 UTC m=+1032.756518315" watchObservedRunningTime="2025-10-09 15:35:27.254168294 +0000 UTC m=+1032.763879579" Oct 09 15:35:27 crc kubenswrapper[4719]: W1009 15:35:27.471456 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbae14e7_c975_42ac_bad6_b5ad764a239b.slice/crio-c713e76362374e7e2fb5691e5ab537368f9a1cf55c6ca3c504852415758d6a92 WatchSource:0}: Error finding container c713e76362374e7e2fb5691e5ab537368f9a1cf55c6ca3c504852415758d6a92: Status 404 returned error can't find the container with id c713e76362374e7e2fb5691e5ab537368f9a1cf55c6ca3c504852415758d6a92 
Oct 09 15:35:28 crc kubenswrapper[4719]: I1009 15:35:28.242473 4719 generic.go:334] "Generic (PLEG): container finished" podID="95a97721-d5b6-401a-94ef-751878f64947" containerID="cbaefe5b72a7bc430834f4a54e4b4e8f324a4b84013b4ed6142dd1c0d5bfa12f" exitCode=143 Oct 09 15:35:28 crc kubenswrapper[4719]: I1009 15:35:28.242776 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"95a97721-d5b6-401a-94ef-751878f64947","Type":"ContainerDied","Data":"cbaefe5b72a7bc430834f4a54e4b4e8f324a4b84013b4ed6142dd1c0d5bfa12f"} Oct 09 15:35:28 crc kubenswrapper[4719]: I1009 15:35:28.245457 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-574c54d6bf-d7655" event={"ID":"dbae14e7-c975-42ac-bad6-b5ad764a239b","Type":"ContainerStarted","Data":"c713e76362374e7e2fb5691e5ab537368f9a1cf55c6ca3c504852415758d6a92"} Oct 09 15:35:28 crc kubenswrapper[4719]: I1009 15:35:28.851338 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.079691 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-gfc75"] Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.081552 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-gfc75" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.084802 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-75wcg" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.085489 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.104082 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-gfc75"] Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.158238 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08e04378-245e-4a13-b1de-f11cf96579ef-combined-ca-bundle\") pod \"barbican-db-sync-gfc75\" (UID: \"08e04378-245e-4a13-b1de-f11cf96579ef\") " pod="openstack/barbican-db-sync-gfc75" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.158595 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/08e04378-245e-4a13-b1de-f11cf96579ef-db-sync-config-data\") pod \"barbican-db-sync-gfc75\" (UID: \"08e04378-245e-4a13-b1de-f11cf96579ef\") " pod="openstack/barbican-db-sync-gfc75" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.158658 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx4ht\" (UniqueName: \"kubernetes.io/projected/08e04378-245e-4a13-b1de-f11cf96579ef-kube-api-access-nx4ht\") pod \"barbican-db-sync-gfc75\" (UID: \"08e04378-245e-4a13-b1de-f11cf96579ef\") " pod="openstack/barbican-db-sync-gfc75" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.181363 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-ztgbm"] Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.182544 4719 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-ztgbm" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.184856 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.185035 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.185106 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-kjw8r" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.195491 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-ztgbm"] Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.260287 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e899b0de-03a2-44a5-a165-25c988e8489d-db-sync-config-data\") pod \"cinder-db-sync-ztgbm\" (UID: \"e899b0de-03a2-44a5-a165-25c988e8489d\") " pod="openstack/cinder-db-sync-ztgbm" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.260420 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nx4ht\" (UniqueName: \"kubernetes.io/projected/08e04378-245e-4a13-b1de-f11cf96579ef-kube-api-access-nx4ht\") pod \"barbican-db-sync-gfc75\" (UID: \"08e04378-245e-4a13-b1de-f11cf96579ef\") " pod="openstack/barbican-db-sync-gfc75" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.260458 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e899b0de-03a2-44a5-a165-25c988e8489d-etc-machine-id\") pod \"cinder-db-sync-ztgbm\" (UID: \"e899b0de-03a2-44a5-a165-25c988e8489d\") " pod="openstack/cinder-db-sync-ztgbm" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 
15:35:29.260531 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e899b0de-03a2-44a5-a165-25c988e8489d-config-data\") pod \"cinder-db-sync-ztgbm\" (UID: \"e899b0de-03a2-44a5-a165-25c988e8489d\") " pod="openstack/cinder-db-sync-ztgbm" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.260651 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08e04378-245e-4a13-b1de-f11cf96579ef-combined-ca-bundle\") pod \"barbican-db-sync-gfc75\" (UID: \"08e04378-245e-4a13-b1de-f11cf96579ef\") " pod="openstack/barbican-db-sync-gfc75" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.260700 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e899b0de-03a2-44a5-a165-25c988e8489d-scripts\") pod \"cinder-db-sync-ztgbm\" (UID: \"e899b0de-03a2-44a5-a165-25c988e8489d\") " pod="openstack/cinder-db-sync-ztgbm" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.260725 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e899b0de-03a2-44a5-a165-25c988e8489d-combined-ca-bundle\") pod \"cinder-db-sync-ztgbm\" (UID: \"e899b0de-03a2-44a5-a165-25c988e8489d\") " pod="openstack/cinder-db-sync-ztgbm" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.260749 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/08e04378-245e-4a13-b1de-f11cf96579ef-db-sync-config-data\") pod \"barbican-db-sync-gfc75\" (UID: \"08e04378-245e-4a13-b1de-f11cf96579ef\") " pod="openstack/barbican-db-sync-gfc75" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.260798 4719 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnxsf\" (UniqueName: \"kubernetes.io/projected/e899b0de-03a2-44a5-a165-25c988e8489d-kube-api-access-qnxsf\") pod \"cinder-db-sync-ztgbm\" (UID: \"e899b0de-03a2-44a5-a165-25c988e8489d\") " pod="openstack/cinder-db-sync-ztgbm" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.277954 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08e04378-245e-4a13-b1de-f11cf96579ef-combined-ca-bundle\") pod \"barbican-db-sync-gfc75\" (UID: \"08e04378-245e-4a13-b1de-f11cf96579ef\") " pod="openstack/barbican-db-sync-gfc75" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.279303 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/08e04378-245e-4a13-b1de-f11cf96579ef-db-sync-config-data\") pod \"barbican-db-sync-gfc75\" (UID: \"08e04378-245e-4a13-b1de-f11cf96579ef\") " pod="openstack/barbican-db-sync-gfc75" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.279923 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx4ht\" (UniqueName: \"kubernetes.io/projected/08e04378-245e-4a13-b1de-f11cf96579ef-kube-api-access-nx4ht\") pod \"barbican-db-sync-gfc75\" (UID: \"08e04378-245e-4a13-b1de-f11cf96579ef\") " pod="openstack/barbican-db-sync-gfc75" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.362535 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e899b0de-03a2-44a5-a165-25c988e8489d-scripts\") pod \"cinder-db-sync-ztgbm\" (UID: \"e899b0de-03a2-44a5-a165-25c988e8489d\") " pod="openstack/cinder-db-sync-ztgbm" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.362573 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e899b0de-03a2-44a5-a165-25c988e8489d-combined-ca-bundle\") pod \"cinder-db-sync-ztgbm\" (UID: \"e899b0de-03a2-44a5-a165-25c988e8489d\") " pod="openstack/cinder-db-sync-ztgbm" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.362612 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnxsf\" (UniqueName: \"kubernetes.io/projected/e899b0de-03a2-44a5-a165-25c988e8489d-kube-api-access-qnxsf\") pod \"cinder-db-sync-ztgbm\" (UID: \"e899b0de-03a2-44a5-a165-25c988e8489d\") " pod="openstack/cinder-db-sync-ztgbm" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.362648 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e899b0de-03a2-44a5-a165-25c988e8489d-db-sync-config-data\") pod \"cinder-db-sync-ztgbm\" (UID: \"e899b0de-03a2-44a5-a165-25c988e8489d\") " pod="openstack/cinder-db-sync-ztgbm" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.362673 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e899b0de-03a2-44a5-a165-25c988e8489d-etc-machine-id\") pod \"cinder-db-sync-ztgbm\" (UID: \"e899b0de-03a2-44a5-a165-25c988e8489d\") " pod="openstack/cinder-db-sync-ztgbm" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.362713 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e899b0de-03a2-44a5-a165-25c988e8489d-config-data\") pod \"cinder-db-sync-ztgbm\" (UID: \"e899b0de-03a2-44a5-a165-25c988e8489d\") " pod="openstack/cinder-db-sync-ztgbm" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.365015 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e899b0de-03a2-44a5-a165-25c988e8489d-etc-machine-id\") pod \"cinder-db-sync-ztgbm\" (UID: 
\"e899b0de-03a2-44a5-a165-25c988e8489d\") " pod="openstack/cinder-db-sync-ztgbm" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.370903 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e899b0de-03a2-44a5-a165-25c988e8489d-db-sync-config-data\") pod \"cinder-db-sync-ztgbm\" (UID: \"e899b0de-03a2-44a5-a165-25c988e8489d\") " pod="openstack/cinder-db-sync-ztgbm" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.371327 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e899b0de-03a2-44a5-a165-25c988e8489d-combined-ca-bundle\") pod \"cinder-db-sync-ztgbm\" (UID: \"e899b0de-03a2-44a5-a165-25c988e8489d\") " pod="openstack/cinder-db-sync-ztgbm" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.378768 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e899b0de-03a2-44a5-a165-25c988e8489d-scripts\") pod \"cinder-db-sync-ztgbm\" (UID: \"e899b0de-03a2-44a5-a165-25c988e8489d\") " pod="openstack/cinder-db-sync-ztgbm" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.388936 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-jk6nr"] Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.389048 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e899b0de-03a2-44a5-a165-25c988e8489d-config-data\") pod \"cinder-db-sync-ztgbm\" (UID: \"e899b0de-03a2-44a5-a165-25c988e8489d\") " pod="openstack/cinder-db-sync-ztgbm" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.390718 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-jk6nr" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.391944 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnxsf\" (UniqueName: \"kubernetes.io/projected/e899b0de-03a2-44a5-a165-25c988e8489d-kube-api-access-qnxsf\") pod \"cinder-db-sync-ztgbm\" (UID: \"e899b0de-03a2-44a5-a165-25c988e8489d\") " pod="openstack/cinder-db-sync-ztgbm" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.395555 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.395615 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-vth95" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.395643 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.403338 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-jk6nr"] Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.407478 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-gfc75" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.464813 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc4cj\" (UniqueName: \"kubernetes.io/projected/a2205fae-acbe-4123-936d-ad78cd542565-kube-api-access-cc4cj\") pod \"neutron-db-sync-jk6nr\" (UID: \"a2205fae-acbe-4123-936d-ad78cd542565\") " pod="openstack/neutron-db-sync-jk6nr" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.464894 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2205fae-acbe-4123-936d-ad78cd542565-combined-ca-bundle\") pod \"neutron-db-sync-jk6nr\" (UID: \"a2205fae-acbe-4123-936d-ad78cd542565\") " pod="openstack/neutron-db-sync-jk6nr" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.465170 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a2205fae-acbe-4123-936d-ad78cd542565-config\") pod \"neutron-db-sync-jk6nr\" (UID: \"a2205fae-acbe-4123-936d-ad78cd542565\") " pod="openstack/neutron-db-sync-jk6nr" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.497901 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-ztgbm" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.566835 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a2205fae-acbe-4123-936d-ad78cd542565-config\") pod \"neutron-db-sync-jk6nr\" (UID: \"a2205fae-acbe-4123-936d-ad78cd542565\") " pod="openstack/neutron-db-sync-jk6nr" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.566953 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc4cj\" (UniqueName: \"kubernetes.io/projected/a2205fae-acbe-4123-936d-ad78cd542565-kube-api-access-cc4cj\") pod \"neutron-db-sync-jk6nr\" (UID: \"a2205fae-acbe-4123-936d-ad78cd542565\") " pod="openstack/neutron-db-sync-jk6nr" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.566994 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2205fae-acbe-4123-936d-ad78cd542565-combined-ca-bundle\") pod \"neutron-db-sync-jk6nr\" (UID: \"a2205fae-acbe-4123-936d-ad78cd542565\") " pod="openstack/neutron-db-sync-jk6nr" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.570446 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2205fae-acbe-4123-936d-ad78cd542565-combined-ca-bundle\") pod \"neutron-db-sync-jk6nr\" (UID: \"a2205fae-acbe-4123-936d-ad78cd542565\") " pod="openstack/neutron-db-sync-jk6nr" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.581120 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a2205fae-acbe-4123-936d-ad78cd542565-config\") pod \"neutron-db-sync-jk6nr\" (UID: \"a2205fae-acbe-4123-936d-ad78cd542565\") " pod="openstack/neutron-db-sync-jk6nr" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.592372 4719 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc4cj\" (UniqueName: \"kubernetes.io/projected/a2205fae-acbe-4123-936d-ad78cd542565-kube-api-access-cc4cj\") pod \"neutron-db-sync-jk6nr\" (UID: \"a2205fae-acbe-4123-936d-ad78cd542565\") " pod="openstack/neutron-db-sync-jk6nr" Oct 09 15:35:29 crc kubenswrapper[4719]: I1009 15:35:29.771908 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-jk6nr" Oct 09 15:35:30 crc kubenswrapper[4719]: I1009 15:35:30.981807 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="95a97721-d5b6-401a-94ef-751878f64947" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.153:9322/\": read tcp 10.217.0.2:39498->10.217.0.153:9322: read: connection reset by peer" Oct 09 15:35:31 crc kubenswrapper[4719]: I1009 15:35:31.288636 4719 generic.go:334] "Generic (PLEG): container finished" podID="95a97721-d5b6-401a-94ef-751878f64947" containerID="d09445c545bb0098a87a6e6741fec46a9ba9766227457d3ead82257c0463520d" exitCode=0 Oct 09 15:35:31 crc kubenswrapper[4719]: I1009 15:35:31.288753 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"95a97721-d5b6-401a-94ef-751878f64947","Type":"ContainerDied","Data":"d09445c545bb0098a87a6e6741fec46a9ba9766227457d3ead82257c0463520d"} Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.011910 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-b66884797-vvjz5"] Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.042100 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7457564986-k28cv"] Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.043829 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7457564986-k28cv" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.053192 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.061417 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7457564986-k28cv"] Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.124565 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d16f8bb5-9ca5-4042-ae67-756c79d00217-config-data\") pod \"horizon-7457564986-k28cv\" (UID: \"d16f8bb5-9ca5-4042-ae67-756c79d00217\") " pod="openstack/horizon-7457564986-k28cv" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.125020 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d16f8bb5-9ca5-4042-ae67-756c79d00217-logs\") pod \"horizon-7457564986-k28cv\" (UID: \"d16f8bb5-9ca5-4042-ae67-756c79d00217\") " pod="openstack/horizon-7457564986-k28cv" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.125046 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d16f8bb5-9ca5-4042-ae67-756c79d00217-horizon-tls-certs\") pod \"horizon-7457564986-k28cv\" (UID: \"d16f8bb5-9ca5-4042-ae67-756c79d00217\") " pod="openstack/horizon-7457564986-k28cv" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.125085 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d16f8bb5-9ca5-4042-ae67-756c79d00217-combined-ca-bundle\") pod \"horizon-7457564986-k28cv\" (UID: \"d16f8bb5-9ca5-4042-ae67-756c79d00217\") " pod="openstack/horizon-7457564986-k28cv" Oct 09 15:35:32 crc 
kubenswrapper[4719]: I1009 15:35:32.125135 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d16f8bb5-9ca5-4042-ae67-756c79d00217-horizon-secret-key\") pod \"horizon-7457564986-k28cv\" (UID: \"d16f8bb5-9ca5-4042-ae67-756c79d00217\") " pod="openstack/horizon-7457564986-k28cv" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.125170 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d16f8bb5-9ca5-4042-ae67-756c79d00217-scripts\") pod \"horizon-7457564986-k28cv\" (UID: \"d16f8bb5-9ca5-4042-ae67-756c79d00217\") " pod="openstack/horizon-7457564986-k28cv" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.125206 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvf64\" (UniqueName: \"kubernetes.io/projected/d16f8bb5-9ca5-4042-ae67-756c79d00217-kube-api-access-cvf64\") pod \"horizon-7457564986-k28cv\" (UID: \"d16f8bb5-9ca5-4042-ae67-756c79d00217\") " pod="openstack/horizon-7457564986-k28cv" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.155407 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-574c54d6bf-d7655"] Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.173687 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6f5bc696cd-sgqb2"] Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.175884 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6f5bc696cd-sgqb2" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.187995 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f5bc696cd-sgqb2"] Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.226704 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwpx2\" (UniqueName: \"kubernetes.io/projected/a66fd9c2-b3cc-43db-b520-6972ce53871f-kube-api-access-rwpx2\") pod \"horizon-6f5bc696cd-sgqb2\" (UID: \"a66fd9c2-b3cc-43db-b520-6972ce53871f\") " pod="openstack/horizon-6f5bc696cd-sgqb2" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.226758 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d16f8bb5-9ca5-4042-ae67-756c79d00217-combined-ca-bundle\") pod \"horizon-7457564986-k28cv\" (UID: \"d16f8bb5-9ca5-4042-ae67-756c79d00217\") " pod="openstack/horizon-7457564986-k28cv" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.226796 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a66fd9c2-b3cc-43db-b520-6972ce53871f-logs\") pod \"horizon-6f5bc696cd-sgqb2\" (UID: \"a66fd9c2-b3cc-43db-b520-6972ce53871f\") " pod="openstack/horizon-6f5bc696cd-sgqb2" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.226874 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d16f8bb5-9ca5-4042-ae67-756c79d00217-horizon-secret-key\") pod \"horizon-7457564986-k28cv\" (UID: \"d16f8bb5-9ca5-4042-ae67-756c79d00217\") " pod="openstack/horizon-7457564986-k28cv" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.226984 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/d16f8bb5-9ca5-4042-ae67-756c79d00217-scripts\") pod \"horizon-7457564986-k28cv\" (UID: \"d16f8bb5-9ca5-4042-ae67-756c79d00217\") " pod="openstack/horizon-7457564986-k28cv" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.227012 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a66fd9c2-b3cc-43db-b520-6972ce53871f-horizon-tls-certs\") pod \"horizon-6f5bc696cd-sgqb2\" (UID: \"a66fd9c2-b3cc-43db-b520-6972ce53871f\") " pod="openstack/horizon-6f5bc696cd-sgqb2" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.227047 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvf64\" (UniqueName: \"kubernetes.io/projected/d16f8bb5-9ca5-4042-ae67-756c79d00217-kube-api-access-cvf64\") pod \"horizon-7457564986-k28cv\" (UID: \"d16f8bb5-9ca5-4042-ae67-756c79d00217\") " pod="openstack/horizon-7457564986-k28cv" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.227090 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a66fd9c2-b3cc-43db-b520-6972ce53871f-horizon-secret-key\") pod \"horizon-6f5bc696cd-sgqb2\" (UID: \"a66fd9c2-b3cc-43db-b520-6972ce53871f\") " pod="openstack/horizon-6f5bc696cd-sgqb2" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.227107 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a66fd9c2-b3cc-43db-b520-6972ce53871f-scripts\") pod \"horizon-6f5bc696cd-sgqb2\" (UID: \"a66fd9c2-b3cc-43db-b520-6972ce53871f\") " pod="openstack/horizon-6f5bc696cd-sgqb2" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.227128 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/d16f8bb5-9ca5-4042-ae67-756c79d00217-config-data\") pod \"horizon-7457564986-k28cv\" (UID: \"d16f8bb5-9ca5-4042-ae67-756c79d00217\") " pod="openstack/horizon-7457564986-k28cv" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.227168 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a66fd9c2-b3cc-43db-b520-6972ce53871f-config-data\") pod \"horizon-6f5bc696cd-sgqb2\" (UID: \"a66fd9c2-b3cc-43db-b520-6972ce53871f\") " pod="openstack/horizon-6f5bc696cd-sgqb2" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.227196 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a66fd9c2-b3cc-43db-b520-6972ce53871f-combined-ca-bundle\") pod \"horizon-6f5bc696cd-sgqb2\" (UID: \"a66fd9c2-b3cc-43db-b520-6972ce53871f\") " pod="openstack/horizon-6f5bc696cd-sgqb2" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.227229 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d16f8bb5-9ca5-4042-ae67-756c79d00217-logs\") pod \"horizon-7457564986-k28cv\" (UID: \"d16f8bb5-9ca5-4042-ae67-756c79d00217\") " pod="openstack/horizon-7457564986-k28cv" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.227245 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d16f8bb5-9ca5-4042-ae67-756c79d00217-horizon-tls-certs\") pod \"horizon-7457564986-k28cv\" (UID: \"d16f8bb5-9ca5-4042-ae67-756c79d00217\") " pod="openstack/horizon-7457564986-k28cv" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.228386 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d16f8bb5-9ca5-4042-ae67-756c79d00217-logs\") pod 
\"horizon-7457564986-k28cv\" (UID: \"d16f8bb5-9ca5-4042-ae67-756c79d00217\") " pod="openstack/horizon-7457564986-k28cv" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.229033 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d16f8bb5-9ca5-4042-ae67-756c79d00217-scripts\") pod \"horizon-7457564986-k28cv\" (UID: \"d16f8bb5-9ca5-4042-ae67-756c79d00217\") " pod="openstack/horizon-7457564986-k28cv" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.229257 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d16f8bb5-9ca5-4042-ae67-756c79d00217-config-data\") pod \"horizon-7457564986-k28cv\" (UID: \"d16f8bb5-9ca5-4042-ae67-756c79d00217\") " pod="openstack/horizon-7457564986-k28cv" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.233444 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d16f8bb5-9ca5-4042-ae67-756c79d00217-horizon-secret-key\") pod \"horizon-7457564986-k28cv\" (UID: \"d16f8bb5-9ca5-4042-ae67-756c79d00217\") " pod="openstack/horizon-7457564986-k28cv" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.234743 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d16f8bb5-9ca5-4042-ae67-756c79d00217-horizon-tls-certs\") pod \"horizon-7457564986-k28cv\" (UID: \"d16f8bb5-9ca5-4042-ae67-756c79d00217\") " pod="openstack/horizon-7457564986-k28cv" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.249277 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d16f8bb5-9ca5-4042-ae67-756c79d00217-combined-ca-bundle\") pod \"horizon-7457564986-k28cv\" (UID: \"d16f8bb5-9ca5-4042-ae67-756c79d00217\") " pod="openstack/horizon-7457564986-k28cv" Oct 09 15:35:32 crc 
kubenswrapper[4719]: I1009 15:35:32.257487 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvf64\" (UniqueName: \"kubernetes.io/projected/d16f8bb5-9ca5-4042-ae67-756c79d00217-kube-api-access-cvf64\") pod \"horizon-7457564986-k28cv\" (UID: \"d16f8bb5-9ca5-4042-ae67-756c79d00217\") " pod="openstack/horizon-7457564986-k28cv" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.304462 4719 generic.go:334] "Generic (PLEG): container finished" podID="f4998573-c12b-4a90-a58b-3af8be611b96" containerID="e5f20b20c42e0d1513484d9a4f5a033569d029d7a21fb90191416fde7f5713c0" exitCode=0 Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.304509 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wn6ns" event={"ID":"f4998573-c12b-4a90-a58b-3af8be611b96","Type":"ContainerDied","Data":"e5f20b20c42e0d1513484d9a4f5a033569d029d7a21fb90191416fde7f5713c0"} Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.328594 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a66fd9c2-b3cc-43db-b520-6972ce53871f-logs\") pod \"horizon-6f5bc696cd-sgqb2\" (UID: \"a66fd9c2-b3cc-43db-b520-6972ce53871f\") " pod="openstack/horizon-6f5bc696cd-sgqb2" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.328691 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a66fd9c2-b3cc-43db-b520-6972ce53871f-horizon-tls-certs\") pod \"horizon-6f5bc696cd-sgqb2\" (UID: \"a66fd9c2-b3cc-43db-b520-6972ce53871f\") " pod="openstack/horizon-6f5bc696cd-sgqb2" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.328748 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a66fd9c2-b3cc-43db-b520-6972ce53871f-horizon-secret-key\") pod \"horizon-6f5bc696cd-sgqb2\" (UID: 
\"a66fd9c2-b3cc-43db-b520-6972ce53871f\") " pod="openstack/horizon-6f5bc696cd-sgqb2" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.328770 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a66fd9c2-b3cc-43db-b520-6972ce53871f-scripts\") pod \"horizon-6f5bc696cd-sgqb2\" (UID: \"a66fd9c2-b3cc-43db-b520-6972ce53871f\") " pod="openstack/horizon-6f5bc696cd-sgqb2" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.328832 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a66fd9c2-b3cc-43db-b520-6972ce53871f-config-data\") pod \"horizon-6f5bc696cd-sgqb2\" (UID: \"a66fd9c2-b3cc-43db-b520-6972ce53871f\") " pod="openstack/horizon-6f5bc696cd-sgqb2" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.328870 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a66fd9c2-b3cc-43db-b520-6972ce53871f-combined-ca-bundle\") pod \"horizon-6f5bc696cd-sgqb2\" (UID: \"a66fd9c2-b3cc-43db-b520-6972ce53871f\") " pod="openstack/horizon-6f5bc696cd-sgqb2" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.328931 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwpx2\" (UniqueName: \"kubernetes.io/projected/a66fd9c2-b3cc-43db-b520-6972ce53871f-kube-api-access-rwpx2\") pod \"horizon-6f5bc696cd-sgqb2\" (UID: \"a66fd9c2-b3cc-43db-b520-6972ce53871f\") " pod="openstack/horizon-6f5bc696cd-sgqb2" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.329067 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a66fd9c2-b3cc-43db-b520-6972ce53871f-logs\") pod \"horizon-6f5bc696cd-sgqb2\" (UID: \"a66fd9c2-b3cc-43db-b520-6972ce53871f\") " pod="openstack/horizon-6f5bc696cd-sgqb2" Oct 09 15:35:32 crc 
kubenswrapper[4719]: I1009 15:35:32.329782 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a66fd9c2-b3cc-43db-b520-6972ce53871f-scripts\") pod \"horizon-6f5bc696cd-sgqb2\" (UID: \"a66fd9c2-b3cc-43db-b520-6972ce53871f\") " pod="openstack/horizon-6f5bc696cd-sgqb2" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.330558 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a66fd9c2-b3cc-43db-b520-6972ce53871f-config-data\") pod \"horizon-6f5bc696cd-sgqb2\" (UID: \"a66fd9c2-b3cc-43db-b520-6972ce53871f\") " pod="openstack/horizon-6f5bc696cd-sgqb2" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.335989 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a66fd9c2-b3cc-43db-b520-6972ce53871f-combined-ca-bundle\") pod \"horizon-6f5bc696cd-sgqb2\" (UID: \"a66fd9c2-b3cc-43db-b520-6972ce53871f\") " pod="openstack/horizon-6f5bc696cd-sgqb2" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.339850 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a66fd9c2-b3cc-43db-b520-6972ce53871f-horizon-secret-key\") pod \"horizon-6f5bc696cd-sgqb2\" (UID: \"a66fd9c2-b3cc-43db-b520-6972ce53871f\") " pod="openstack/horizon-6f5bc696cd-sgqb2" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.348848 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a66fd9c2-b3cc-43db-b520-6972ce53871f-horizon-tls-certs\") pod \"horizon-6f5bc696cd-sgqb2\" (UID: \"a66fd9c2-b3cc-43db-b520-6972ce53871f\") " pod="openstack/horizon-6f5bc696cd-sgqb2" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.379086 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwpx2\" 
(UniqueName: \"kubernetes.io/projected/a66fd9c2-b3cc-43db-b520-6972ce53871f-kube-api-access-rwpx2\") pod \"horizon-6f5bc696cd-sgqb2\" (UID: \"a66fd9c2-b3cc-43db-b520-6972ce53871f\") " pod="openstack/horizon-6f5bc696cd-sgqb2" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.380945 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7457564986-k28cv" Oct 09 15:35:32 crc kubenswrapper[4719]: I1009 15:35:32.498814 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f5bc696cd-sgqb2" Oct 09 15:35:33 crc kubenswrapper[4719]: I1009 15:35:33.852138 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="95a97721-d5b6-401a-94ef-751878f64947" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.153:9322/\": dial tcp 10.217.0.153:9322: connect: connection refused" Oct 09 15:35:36 crc kubenswrapper[4719]: I1009 15:35:36.976629 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 15:35:36 crc kubenswrapper[4719]: I1009 15:35:36.977225 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 15:35:36 crc kubenswrapper[4719]: I1009 15:35:36.977267 4719 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" Oct 09 15:35:36 crc kubenswrapper[4719]: I1009 15:35:36.977989 4719 kuberuntime_manager.go:1027] "Message for 
Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b3d6b70762c7cbd23b776b68a38ad2cc0ec2c06605dcc7efb9d87a0020d07dde"} pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 15:35:36 crc kubenswrapper[4719]: I1009 15:35:36.978036 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" containerID="cri-o://b3d6b70762c7cbd23b776b68a38ad2cc0ec2c06605dcc7efb9d87a0020d07dde" gracePeriod=600 Oct 09 15:35:37 crc kubenswrapper[4719]: I1009 15:35:37.373584 4719 generic.go:334] "Generic (PLEG): container finished" podID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerID="b3d6b70762c7cbd23b776b68a38ad2cc0ec2c06605dcc7efb9d87a0020d07dde" exitCode=0 Oct 09 15:35:37 crc kubenswrapper[4719]: I1009 15:35:37.373637 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerDied","Data":"b3d6b70762c7cbd23b776b68a38ad2cc0ec2c06605dcc7efb9d87a0020d07dde"} Oct 09 15:35:37 crc kubenswrapper[4719]: I1009 15:35:37.373675 4719 scope.go:117] "RemoveContainer" containerID="68d8ab72b367a09fd501bf52a95e52e96b2dd8454309c2056f29b2264d60dcdd" Oct 09 15:35:39 crc kubenswrapper[4719]: E1009 15:35:39.951468 4719 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.66:5001/podified-master-centos10/openstack-horizon:watcher_latest" Oct 09 15:35:39 crc kubenswrapper[4719]: E1009 15:35:39.952544 4719 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="38.102.83.66:5001/podified-master-centos10/openstack-horizon:watcher_latest" Oct 09 15:35:39 crc kubenswrapper[4719]: E1009 15:35:39.952815 4719 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:38.102.83.66:5001/podified-master-centos10/openstack-horizon:watcher_latest,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n56fhc7hbdh644h55fh64bh5b5h57bhd9hd6h5d6h96h5b4h564hc9hb4h5b6hd4h65fh569h6fh67bhd8hb4h66fh5bch546hdhf6h588h674h5f6q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rmxzx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File
,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-758645fbfc-c4jmx_openstack(2deaf084-98a3-477d-b5ac-8933b175ef00): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 09 15:35:39 crc kubenswrapper[4719]: E1009 15:35:39.962395 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.66:5001/podified-master-centos10/openstack-horizon:watcher_latest\\\"\"]" pod="openstack/horizon-758645fbfc-c4jmx" podUID="2deaf084-98a3-477d-b5ac-8933b175ef00" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.103117 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wn6ns" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.109645 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.170546 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95a97721-d5b6-401a-94ef-751878f64947-combined-ca-bundle\") pod \"95a97721-d5b6-401a-94ef-751878f64947\" (UID: \"95a97721-d5b6-401a-94ef-751878f64947\") " Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.170852 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4998573-c12b-4a90-a58b-3af8be611b96-scripts\") pod \"f4998573-c12b-4a90-a58b-3af8be611b96\" (UID: \"f4998573-c12b-4a90-a58b-3af8be611b96\") " Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.171022 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95a97721-d5b6-401a-94ef-751878f64947-logs\") pod \"95a97721-d5b6-401a-94ef-751878f64947\" (UID: \"95a97721-d5b6-401a-94ef-751878f64947\") " Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.171146 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj2t7\" (UniqueName: \"kubernetes.io/projected/f4998573-c12b-4a90-a58b-3af8be611b96-kube-api-access-pj2t7\") pod \"f4998573-c12b-4a90-a58b-3af8be611b96\" (UID: \"f4998573-c12b-4a90-a58b-3af8be611b96\") " Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.171304 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f4998573-c12b-4a90-a58b-3af8be611b96-credential-keys\") pod \"f4998573-c12b-4a90-a58b-3af8be611b96\" (UID: \"f4998573-c12b-4a90-a58b-3af8be611b96\") " Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.171493 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: 
\"kubernetes.io/secret/95a97721-d5b6-401a-94ef-751878f64947-custom-prometheus-ca\") pod \"95a97721-d5b6-401a-94ef-751878f64947\" (UID: \"95a97721-d5b6-401a-94ef-751878f64947\") " Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.171605 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4998573-c12b-4a90-a58b-3af8be611b96-config-data\") pod \"f4998573-c12b-4a90-a58b-3af8be611b96\" (UID: \"f4998573-c12b-4a90-a58b-3af8be611b96\") " Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.171703 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-527lf\" (UniqueName: \"kubernetes.io/projected/95a97721-d5b6-401a-94ef-751878f64947-kube-api-access-527lf\") pod \"95a97721-d5b6-401a-94ef-751878f64947\" (UID: \"95a97721-d5b6-401a-94ef-751878f64947\") " Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.171842 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95a97721-d5b6-401a-94ef-751878f64947-config-data\") pod \"95a97721-d5b6-401a-94ef-751878f64947\" (UID: \"95a97721-d5b6-401a-94ef-751878f64947\") " Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.172182 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f4998573-c12b-4a90-a58b-3af8be611b96-fernet-keys\") pod \"f4998573-c12b-4a90-a58b-3af8be611b96\" (UID: \"f4998573-c12b-4a90-a58b-3af8be611b96\") " Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.172305 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4998573-c12b-4a90-a58b-3af8be611b96-combined-ca-bundle\") pod \"f4998573-c12b-4a90-a58b-3af8be611b96\" (UID: \"f4998573-c12b-4a90-a58b-3af8be611b96\") " Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 
15:35:40.171732 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95a97721-d5b6-401a-94ef-751878f64947-logs" (OuterVolumeSpecName: "logs") pod "95a97721-d5b6-401a-94ef-751878f64947" (UID: "95a97721-d5b6-401a-94ef-751878f64947"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.173004 4719 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95a97721-d5b6-401a-94ef-751878f64947-logs\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.176891 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4998573-c12b-4a90-a58b-3af8be611b96-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "f4998573-c12b-4a90-a58b-3af8be611b96" (UID: "f4998573-c12b-4a90-a58b-3af8be611b96"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.177464 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4998573-c12b-4a90-a58b-3af8be611b96-kube-api-access-pj2t7" (OuterVolumeSpecName: "kube-api-access-pj2t7") pod "f4998573-c12b-4a90-a58b-3af8be611b96" (UID: "f4998573-c12b-4a90-a58b-3af8be611b96"). InnerVolumeSpecName "kube-api-access-pj2t7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.177934 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4998573-c12b-4a90-a58b-3af8be611b96-scripts" (OuterVolumeSpecName: "scripts") pod "f4998573-c12b-4a90-a58b-3af8be611b96" (UID: "f4998573-c12b-4a90-a58b-3af8be611b96"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.179222 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95a97721-d5b6-401a-94ef-751878f64947-kube-api-access-527lf" (OuterVolumeSpecName: "kube-api-access-527lf") pod "95a97721-d5b6-401a-94ef-751878f64947" (UID: "95a97721-d5b6-401a-94ef-751878f64947"). InnerVolumeSpecName "kube-api-access-527lf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.180651 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4998573-c12b-4a90-a58b-3af8be611b96-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f4998573-c12b-4a90-a58b-3af8be611b96" (UID: "f4998573-c12b-4a90-a58b-3af8be611b96"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.207646 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95a97721-d5b6-401a-94ef-751878f64947-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95a97721-d5b6-401a-94ef-751878f64947" (UID: "95a97721-d5b6-401a-94ef-751878f64947"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.214671 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95a97721-d5b6-401a-94ef-751878f64947-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "95a97721-d5b6-401a-94ef-751878f64947" (UID: "95a97721-d5b6-401a-94ef-751878f64947"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.224854 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4998573-c12b-4a90-a58b-3af8be611b96-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4998573-c12b-4a90-a58b-3af8be611b96" (UID: "f4998573-c12b-4a90-a58b-3af8be611b96"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.233916 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95a97721-d5b6-401a-94ef-751878f64947-config-data" (OuterVolumeSpecName: "config-data") pod "95a97721-d5b6-401a-94ef-751878f64947" (UID: "95a97721-d5b6-401a-94ef-751878f64947"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.248209 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4998573-c12b-4a90-a58b-3af8be611b96-config-data" (OuterVolumeSpecName: "config-data") pod "f4998573-c12b-4a90-a58b-3af8be611b96" (UID: "f4998573-c12b-4a90-a58b-3af8be611b96"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.277214 4719 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4998573-c12b-4a90-a58b-3af8be611b96-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.277932 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj2t7\" (UniqueName: \"kubernetes.io/projected/f4998573-c12b-4a90-a58b-3af8be611b96-kube-api-access-pj2t7\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.278025 4719 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f4998573-c12b-4a90-a58b-3af8be611b96-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.278105 4719 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/95a97721-d5b6-401a-94ef-751878f64947-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.278162 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4998573-c12b-4a90-a58b-3af8be611b96-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.278214 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-527lf\" (UniqueName: \"kubernetes.io/projected/95a97721-d5b6-401a-94ef-751878f64947-kube-api-access-527lf\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.278265 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95a97721-d5b6-401a-94ef-751878f64947-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.278312 4719 
reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f4998573-c12b-4a90-a58b-3af8be611b96-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.278452 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4998573-c12b-4a90-a58b-3af8be611b96-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.278534 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95a97721-d5b6-401a-94ef-751878f64947-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.425278 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wn6ns" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.425370 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wn6ns" event={"ID":"f4998573-c12b-4a90-a58b-3af8be611b96","Type":"ContainerDied","Data":"f15279ddb3769b1f33a00bfcb0b9ed65e2846f47d0482f0a81442e56ed03d9af"} Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.425484 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f15279ddb3769b1f33a00bfcb0b9ed65e2846f47d0482f0a81442e56ed03d9af" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.428573 4719 generic.go:334] "Generic (PLEG): container finished" podID="52932375-ade4-4056-a4f8-6758db0df52f" containerID="9fd313329586b35b942fe6233e6f3512d120cb80c30bd770fb6e47f079d5fa27" exitCode=0 Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.428670 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xs6f4" 
event={"ID":"52932375-ade4-4056-a4f8-6758db0df52f","Type":"ContainerDied","Data":"9fd313329586b35b942fe6233e6f3512d120cb80c30bd770fb6e47f079d5fa27"} Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.431764 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"95a97721-d5b6-401a-94ef-751878f64947","Type":"ContainerDied","Data":"e5273bd90474fb888f879987fd880d9eaffd14d206581635c205fa2b8a47f6c4"} Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.431785 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.453538 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-gfc75"] Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.516275 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.534196 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.554567 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Oct 09 15:35:40 crc kubenswrapper[4719]: E1009 15:35:40.554965 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95a97721-d5b6-401a-94ef-751878f64947" containerName="watcher-api-log" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.555172 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="95a97721-d5b6-401a-94ef-751878f64947" containerName="watcher-api-log" Oct 09 15:35:40 crc kubenswrapper[4719]: E1009 15:35:40.555182 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4998573-c12b-4a90-a58b-3af8be611b96" containerName="keystone-bootstrap" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.555188 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4998573-c12b-4a90-a58b-3af8be611b96" 
containerName="keystone-bootstrap" Oct 09 15:35:40 crc kubenswrapper[4719]: E1009 15:35:40.555210 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95a97721-d5b6-401a-94ef-751878f64947" containerName="watcher-api" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.555216 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="95a97721-d5b6-401a-94ef-751878f64947" containerName="watcher-api" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.555388 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="95a97721-d5b6-401a-94ef-751878f64947" containerName="watcher-api" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.555406 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4998573-c12b-4a90-a58b-3af8be611b96" containerName="keystone-bootstrap" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.555419 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="95a97721-d5b6-401a-94ef-751878f64947" containerName="watcher-api-log" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.562536 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.569101 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.575613 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 09 15:35:40 crc kubenswrapper[4719]: E1009 15:35:40.630336 4719 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.66:5001/podified-master-centos10/openstack-ceilometer-central:watcher_latest" Oct 09 15:35:40 crc kubenswrapper[4719]: E1009 15:35:40.630429 4719 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.66:5001/podified-master-centos10/openstack-ceilometer-central:watcher_latest" Oct 09 15:35:40 crc kubenswrapper[4719]: E1009 15:35:40.630625 4719 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:38.102.83.66:5001/podified-master-centos10/openstack-ceilometer-central:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n58dh674h5dbhch548hdbh58dh57fh59dh7h5cbh576h64chfhf6h5d6h576h84hd9h547h669hc7h5d4h654h57ch694h5b6hb9h4h646h5fh57q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kj48h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(590f0bbf-4518-4aa6-a71f-1f28b5f4e02a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 09 15:35:40 crc kubenswrapper[4719]: W1009 15:35:40.664322 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08e04378_245e_4a13_b1de_f11cf96579ef.slice/crio-06be8d9bc5f128936eec343d64db74252ddf326eecc6c46a776759ecc3f0fb30 WatchSource:0}: Error finding container 06be8d9bc5f128936eec343d64db74252ddf326eecc6c46a776759ecc3f0fb30: Status 404 returned error can't find the container with id 06be8d9bc5f128936eec343d64db74252ddf326eecc6c46a776759ecc3f0fb30 Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.688465 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c-config-data\") pod \"watcher-api-0\" (UID: \"9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c\") " pod="openstack/watcher-api-0" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.688814 4719 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c\") " pod="openstack/watcher-api-0" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.688971 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95gvk\" (UniqueName: \"kubernetes.io/projected/9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c-kube-api-access-95gvk\") pod \"watcher-api-0\" (UID: \"9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c\") " pod="openstack/watcher-api-0" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.689297 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c-logs\") pod \"watcher-api-0\" (UID: \"9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c\") " pod="openstack/watcher-api-0" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.689883 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c\") " pod="openstack/watcher-api-0" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.792077 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c\") " pod="openstack/watcher-api-0" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.792470 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c-config-data\") pod \"watcher-api-0\" (UID: \"9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c\") " pod="openstack/watcher-api-0" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.792496 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c\") " pod="openstack/watcher-api-0" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.792525 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95gvk\" (UniqueName: \"kubernetes.io/projected/9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c-kube-api-access-95gvk\") pod \"watcher-api-0\" (UID: \"9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c\") " pod="openstack/watcher-api-0" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.792553 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c-logs\") pod \"watcher-api-0\" (UID: \"9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c\") " pod="openstack/watcher-api-0" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.793185 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c-logs\") pod \"watcher-api-0\" (UID: \"9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c\") " pod="openstack/watcher-api-0" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.799999 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c\") " pod="openstack/watcher-api-0" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 
15:35:40.800659 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c-config-data\") pod \"watcher-api-0\" (UID: \"9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c\") " pod="openstack/watcher-api-0" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.800865 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c\") " pod="openstack/watcher-api-0" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.811960 4719 scope.go:117] "RemoveContainer" containerID="d09445c545bb0098a87a6e6741fec46a9ba9766227457d3ead82257c0463520d" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.820953 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95gvk\" (UniqueName: \"kubernetes.io/projected/9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c-kube-api-access-95gvk\") pod \"watcher-api-0\" (UID: \"9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c\") " pod="openstack/watcher-api-0" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.854595 4719 scope.go:117] "RemoveContainer" containerID="cbaefe5b72a7bc430834f4a54e4b4e8f324a4b84013b4ed6142dd1c0d5bfa12f" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.886937 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.913494 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-758645fbfc-c4jmx" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.997121 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2deaf084-98a3-477d-b5ac-8933b175ef00-horizon-secret-key\") pod \"2deaf084-98a3-477d-b5ac-8933b175ef00\" (UID: \"2deaf084-98a3-477d-b5ac-8933b175ef00\") " Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.997168 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2deaf084-98a3-477d-b5ac-8933b175ef00-logs\") pod \"2deaf084-98a3-477d-b5ac-8933b175ef00\" (UID: \"2deaf084-98a3-477d-b5ac-8933b175ef00\") " Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.997259 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2deaf084-98a3-477d-b5ac-8933b175ef00-scripts\") pod \"2deaf084-98a3-477d-b5ac-8933b175ef00\" (UID: \"2deaf084-98a3-477d-b5ac-8933b175ef00\") " Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.997372 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmxzx\" (UniqueName: \"kubernetes.io/projected/2deaf084-98a3-477d-b5ac-8933b175ef00-kube-api-access-rmxzx\") pod \"2deaf084-98a3-477d-b5ac-8933b175ef00\" (UID: \"2deaf084-98a3-477d-b5ac-8933b175ef00\") " Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.997436 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2deaf084-98a3-477d-b5ac-8933b175ef00-config-data\") pod \"2deaf084-98a3-477d-b5ac-8933b175ef00\" (UID: \"2deaf084-98a3-477d-b5ac-8933b175ef00\") " Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.998486 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/2deaf084-98a3-477d-b5ac-8933b175ef00-logs" (OuterVolumeSpecName: "logs") pod "2deaf084-98a3-477d-b5ac-8933b175ef00" (UID: "2deaf084-98a3-477d-b5ac-8933b175ef00"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.998865 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2deaf084-98a3-477d-b5ac-8933b175ef00-scripts" (OuterVolumeSpecName: "scripts") pod "2deaf084-98a3-477d-b5ac-8933b175ef00" (UID: "2deaf084-98a3-477d-b5ac-8933b175ef00"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:35:40 crc kubenswrapper[4719]: I1009 15:35:40.999253 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2deaf084-98a3-477d-b5ac-8933b175ef00-config-data" (OuterVolumeSpecName: "config-data") pod "2deaf084-98a3-477d-b5ac-8933b175ef00" (UID: "2deaf084-98a3-477d-b5ac-8933b175ef00"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.009586 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2deaf084-98a3-477d-b5ac-8933b175ef00-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "2deaf084-98a3-477d-b5ac-8933b175ef00" (UID: "2deaf084-98a3-477d-b5ac-8933b175ef00"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.019538 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2deaf084-98a3-477d-b5ac-8933b175ef00-kube-api-access-rmxzx" (OuterVolumeSpecName: "kube-api-access-rmxzx") pod "2deaf084-98a3-477d-b5ac-8933b175ef00" (UID: "2deaf084-98a3-477d-b5ac-8933b175ef00"). InnerVolumeSpecName "kube-api-access-rmxzx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.099637 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2deaf084-98a3-477d-b5ac-8933b175ef00-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.099678 4719 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2deaf084-98a3-477d-b5ac-8933b175ef00-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.099694 4719 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2deaf084-98a3-477d-b5ac-8933b175ef00-logs\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.099705 4719 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2deaf084-98a3-477d-b5ac-8933b175ef00-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.099718 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmxzx\" (UniqueName: \"kubernetes.io/projected/2deaf084-98a3-477d-b5ac-8933b175ef00-kube-api-access-rmxzx\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.181700 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95a97721-d5b6-401a-94ef-751878f64947" path="/var/lib/kubelet/pods/95a97721-d5b6-401a-94ef-751878f64947/volumes" Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.217721 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-wn6ns"] Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.227143 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-wn6ns"] Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 
15:35:41.295282 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-cqxgt"] Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.296421 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-cqxgt" Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.305080 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.305279 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.305521 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.319447 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-2vsx5" Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.327009 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-cqxgt"] Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.407050 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e76eab7-abf1-4d15-9f62-52aefceaf1cd-config-data\") pod \"keystone-bootstrap-cqxgt\" (UID: \"3e76eab7-abf1-4d15-9f62-52aefceaf1cd\") " pod="openstack/keystone-bootstrap-cqxgt" Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.407210 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vwxp\" (UniqueName: \"kubernetes.io/projected/3e76eab7-abf1-4d15-9f62-52aefceaf1cd-kube-api-access-6vwxp\") pod \"keystone-bootstrap-cqxgt\" (UID: \"3e76eab7-abf1-4d15-9f62-52aefceaf1cd\") " pod="openstack/keystone-bootstrap-cqxgt" Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.407257 4719 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3e76eab7-abf1-4d15-9f62-52aefceaf1cd-fernet-keys\") pod \"keystone-bootstrap-cqxgt\" (UID: \"3e76eab7-abf1-4d15-9f62-52aefceaf1cd\") " pod="openstack/keystone-bootstrap-cqxgt" Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.407282 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e76eab7-abf1-4d15-9f62-52aefceaf1cd-scripts\") pod \"keystone-bootstrap-cqxgt\" (UID: \"3e76eab7-abf1-4d15-9f62-52aefceaf1cd\") " pod="openstack/keystone-bootstrap-cqxgt" Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.407304 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e76eab7-abf1-4d15-9f62-52aefceaf1cd-combined-ca-bundle\") pod \"keystone-bootstrap-cqxgt\" (UID: \"3e76eab7-abf1-4d15-9f62-52aefceaf1cd\") " pod="openstack/keystone-bootstrap-cqxgt" Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.407337 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3e76eab7-abf1-4d15-9f62-52aefceaf1cd-credential-keys\") pod \"keystone-bootstrap-cqxgt\" (UID: \"3e76eab7-abf1-4d15-9f62-52aefceaf1cd\") " pod="openstack/keystone-bootstrap-cqxgt" Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.447894 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"8a94c6c0-6e99-4a00-bdde-fe1e7927af5b","Type":"ContainerStarted","Data":"496846d3595546481753a9c36a3030f6a40d619da32e06e9365b318c4066277d"} Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.450723 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b66884797-vvjz5" 
event={"ID":"19fb157d-30e1-4432-9d62-0fa70d464148","Type":"ContainerStarted","Data":"1105f4f36eebf376e3165896576deef6174007b5847a89c5cf52e43cc44a1749"} Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.457523 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5789fb8fc7-6t67j" event={"ID":"0ea18a20-4f8e-460c-9625-2aa8333cf8f4","Type":"ContainerStarted","Data":"c3420b29f4ae8cc02f43ccb61d99cc037e7b1d7ff80f91930f17fc72077bfbb1"} Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.457695 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5789fb8fc7-6t67j" Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.467554 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.84524738 podStartE2EDuration="18.467537079s" podCreationTimestamp="2025-10-09 15:35:23 +0000 UTC" firstStartedPulling="2025-10-09 15:35:24.317461061 +0000 UTC m=+1029.827172356" lastFinishedPulling="2025-10-09 15:35:39.93975076 +0000 UTC m=+1045.449462055" observedRunningTime="2025-10-09 15:35:41.466383762 +0000 UTC m=+1046.976095047" watchObservedRunningTime="2025-10-09 15:35:41.467537079 +0000 UTC m=+1046.977248374" Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.473634 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"938e3d88-39e3-4f8a-8920-0fdfcf98d5e5","Type":"ContainerStarted","Data":"556eab4d055c4943d1b366e6789b5af6178230f8f679f669e5fb1b5b4592c85b"} Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.476166 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-758645fbfc-c4jmx" event={"ID":"2deaf084-98a3-477d-b5ac-8933b175ef00","Type":"ContainerDied","Data":"711ad45450bafdfdd5e761c86fd0264a12b213d9f76b2b6e42708a225073dcae"} Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.476224 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-758645fbfc-c4jmx" Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.496869 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerStarted","Data":"3f9a20c39c2315beb69542aebd5bd73add66f7a319edb9051e38f9e594c365d9"} Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.499676 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5789fb8fc7-6t67j" podStartSLOduration=18.499653698 podStartE2EDuration="18.499653698s" podCreationTimestamp="2025-10-09 15:35:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:35:41.489076833 +0000 UTC m=+1046.998788118" watchObservedRunningTime="2025-10-09 15:35:41.499653698 +0000 UTC m=+1047.009364983" Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.505416 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gfc75" event={"ID":"08e04378-245e-4a13-b1de-f11cf96579ef","Type":"ContainerStarted","Data":"06be8d9bc5f128936eec343d64db74252ddf326eecc6c46a776759ecc3f0fb30"} Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.508855 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e76eab7-abf1-4d15-9f62-52aefceaf1cd-combined-ca-bundle\") pod \"keystone-bootstrap-cqxgt\" (UID: \"3e76eab7-abf1-4d15-9f62-52aefceaf1cd\") " pod="openstack/keystone-bootstrap-cqxgt" Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.508912 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3e76eab7-abf1-4d15-9f62-52aefceaf1cd-credential-keys\") pod \"keystone-bootstrap-cqxgt\" (UID: 
\"3e76eab7-abf1-4d15-9f62-52aefceaf1cd\") " pod="openstack/keystone-bootstrap-cqxgt" Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.509043 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e76eab7-abf1-4d15-9f62-52aefceaf1cd-config-data\") pod \"keystone-bootstrap-cqxgt\" (UID: \"3e76eab7-abf1-4d15-9f62-52aefceaf1cd\") " pod="openstack/keystone-bootstrap-cqxgt" Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.509089 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vwxp\" (UniqueName: \"kubernetes.io/projected/3e76eab7-abf1-4d15-9f62-52aefceaf1cd-kube-api-access-6vwxp\") pod \"keystone-bootstrap-cqxgt\" (UID: \"3e76eab7-abf1-4d15-9f62-52aefceaf1cd\") " pod="openstack/keystone-bootstrap-cqxgt" Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.509157 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3e76eab7-abf1-4d15-9f62-52aefceaf1cd-fernet-keys\") pod \"keystone-bootstrap-cqxgt\" (UID: \"3e76eab7-abf1-4d15-9f62-52aefceaf1cd\") " pod="openstack/keystone-bootstrap-cqxgt" Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.509180 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e76eab7-abf1-4d15-9f62-52aefceaf1cd-scripts\") pod \"keystone-bootstrap-cqxgt\" (UID: \"3e76eab7-abf1-4d15-9f62-52aefceaf1cd\") " pod="openstack/keystone-bootstrap-cqxgt" Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.520192 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e76eab7-abf1-4d15-9f62-52aefceaf1cd-combined-ca-bundle\") pod \"keystone-bootstrap-cqxgt\" (UID: \"3e76eab7-abf1-4d15-9f62-52aefceaf1cd\") " pod="openstack/keystone-bootstrap-cqxgt" Oct 09 15:35:41 crc 
kubenswrapper[4719]: I1009 15:35:41.524655 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e76eab7-abf1-4d15-9f62-52aefceaf1cd-scripts\") pod \"keystone-bootstrap-cqxgt\" (UID: \"3e76eab7-abf1-4d15-9f62-52aefceaf1cd\") " pod="openstack/keystone-bootstrap-cqxgt" Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.528667 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e76eab7-abf1-4d15-9f62-52aefceaf1cd-config-data\") pod \"keystone-bootstrap-cqxgt\" (UID: \"3e76eab7-abf1-4d15-9f62-52aefceaf1cd\") " pod="openstack/keystone-bootstrap-cqxgt" Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.529806 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3e76eab7-abf1-4d15-9f62-52aefceaf1cd-credential-keys\") pod \"keystone-bootstrap-cqxgt\" (UID: \"3e76eab7-abf1-4d15-9f62-52aefceaf1cd\") " pod="openstack/keystone-bootstrap-cqxgt" Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.537469 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3e76eab7-abf1-4d15-9f62-52aefceaf1cd-fernet-keys\") pod \"keystone-bootstrap-cqxgt\" (UID: \"3e76eab7-abf1-4d15-9f62-52aefceaf1cd\") " pod="openstack/keystone-bootstrap-cqxgt" Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.538133 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vwxp\" (UniqueName: \"kubernetes.io/projected/3e76eab7-abf1-4d15-9f62-52aefceaf1cd-kube-api-access-6vwxp\") pod \"keystone-bootstrap-cqxgt\" (UID: \"3e76eab7-abf1-4d15-9f62-52aefceaf1cd\") " pod="openstack/keystone-bootstrap-cqxgt" Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.543402 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7457564986-k28cv"] Oct 09 15:35:41 
crc kubenswrapper[4719]: I1009 15:35:41.565301 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=3.658307455 podStartE2EDuration="18.565285273s" podCreationTimestamp="2025-10-09 15:35:23 +0000 UTC" firstStartedPulling="2025-10-09 15:35:25.059420969 +0000 UTC m=+1030.569132254" lastFinishedPulling="2025-10-09 15:35:39.966398787 +0000 UTC m=+1045.476110072" observedRunningTime="2025-10-09 15:35:41.52711319 +0000 UTC m=+1047.036824465" watchObservedRunningTime="2025-10-09 15:35:41.565285273 +0000 UTC m=+1047.074996558" Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.566631 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f5bc696cd-sgqb2"] Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.607375 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-ztgbm"] Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.631749 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-758645fbfc-c4jmx"] Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.632161 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-cqxgt" Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.648267 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-758645fbfc-c4jmx"] Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.656156 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-jk6nr"] Oct 09 15:35:41 crc kubenswrapper[4719]: I1009 15:35:41.664897 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 09 15:35:41 crc kubenswrapper[4719]: W1009 15:35:41.939458 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2205fae_acbe_4123_936d_ad78cd542565.slice/crio-dbee069014d65bc419765e74aa622ff8ad46a071f900e757d0c6e881d80a2f46 WatchSource:0}: Error finding container dbee069014d65bc419765e74aa622ff8ad46a071f900e757d0c6e881d80a2f46: Status 404 returned error can't find the container with id dbee069014d65bc419765e74aa622ff8ad46a071f900e757d0c6e881d80a2f46 Oct 09 15:35:41 crc kubenswrapper[4719]: W1009 15:35:41.947554 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda66fd9c2_b3cc_43db_b520_6972ce53871f.slice/crio-950a53990da97b950dd63a52ab82a989f41e17f46fdd49cf2d9fc83861a5bc30 WatchSource:0}: Error finding container 950a53990da97b950dd63a52ab82a989f41e17f46fdd49cf2d9fc83861a5bc30: Status 404 returned error can't find the container with id 950a53990da97b950dd63a52ab82a989f41e17f46fdd49cf2d9fc83861a5bc30 Oct 09 15:35:41 crc kubenswrapper[4719]: W1009 15:35:41.951655 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9443b3b8_aa2e_45b2_bea6_b5a8eea89c0c.slice/crio-72ccaabe8a643cbab7f52394e3f00e5797a72bcf8a1011a48c7798c4e7ae0ee6 WatchSource:0}: Error finding container 
72ccaabe8a643cbab7f52394e3f00e5797a72bcf8a1011a48c7798c4e7ae0ee6: Status 404 returned error can't find the container with id 72ccaabe8a643cbab7f52394e3f00e5797a72bcf8a1011a48c7798c4e7ae0ee6 Oct 09 15:35:41 crc kubenswrapper[4719]: W1009 15:35:41.956101 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode899b0de_03a2_44a5_a165_25c988e8489d.slice/crio-8e9fd52b958b37cc0d5f9bf89a297dd1eaa3e2a34c04dea47a43b7ff8ed5b5d3 WatchSource:0}: Error finding container 8e9fd52b958b37cc0d5f9bf89a297dd1eaa3e2a34c04dea47a43b7ff8ed5b5d3: Status 404 returned error can't find the container with id 8e9fd52b958b37cc0d5f9bf89a297dd1eaa3e2a34c04dea47a43b7ff8ed5b5d3 Oct 09 15:35:42 crc kubenswrapper[4719]: I1009 15:35:42.376336 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-xs6f4" Oct 09 15:35:42 crc kubenswrapper[4719]: I1009 15:35:42.428433 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/52932375-ade4-4056-a4f8-6758db0df52f-db-sync-config-data\") pod \"52932375-ade4-4056-a4f8-6758db0df52f\" (UID: \"52932375-ade4-4056-a4f8-6758db0df52f\") " Oct 09 15:35:42 crc kubenswrapper[4719]: I1009 15:35:42.428485 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wxvq\" (UniqueName: \"kubernetes.io/projected/52932375-ade4-4056-a4f8-6758db0df52f-kube-api-access-8wxvq\") pod \"52932375-ade4-4056-a4f8-6758db0df52f\" (UID: \"52932375-ade4-4056-a4f8-6758db0df52f\") " Oct 09 15:35:42 crc kubenswrapper[4719]: I1009 15:35:42.428508 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52932375-ade4-4056-a4f8-6758db0df52f-combined-ca-bundle\") pod \"52932375-ade4-4056-a4f8-6758db0df52f\" (UID: \"52932375-ade4-4056-a4f8-6758db0df52f\") 
" Oct 09 15:35:42 crc kubenswrapper[4719]: I1009 15:35:42.428611 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52932375-ade4-4056-a4f8-6758db0df52f-config-data\") pod \"52932375-ade4-4056-a4f8-6758db0df52f\" (UID: \"52932375-ade4-4056-a4f8-6758db0df52f\") " Oct 09 15:35:42 crc kubenswrapper[4719]: I1009 15:35:42.442460 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52932375-ade4-4056-a4f8-6758db0df52f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "52932375-ade4-4056-a4f8-6758db0df52f" (UID: "52932375-ade4-4056-a4f8-6758db0df52f"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:35:42 crc kubenswrapper[4719]: I1009 15:35:42.444025 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52932375-ade4-4056-a4f8-6758db0df52f-kube-api-access-8wxvq" (OuterVolumeSpecName: "kube-api-access-8wxvq") pod "52932375-ade4-4056-a4f8-6758db0df52f" (UID: "52932375-ade4-4056-a4f8-6758db0df52f"). InnerVolumeSpecName "kube-api-access-8wxvq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:35:42 crc kubenswrapper[4719]: I1009 15:35:42.530721 4719 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/52932375-ade4-4056-a4f8-6758db0df52f-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:42 crc kubenswrapper[4719]: I1009 15:35:42.530918 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wxvq\" (UniqueName: \"kubernetes.io/projected/52932375-ade4-4056-a4f8-6758db0df52f-kube-api-access-8wxvq\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:42 crc kubenswrapper[4719]: I1009 15:35:42.531314 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f5bc696cd-sgqb2" event={"ID":"a66fd9c2-b3cc-43db-b520-6972ce53871f","Type":"ContainerStarted","Data":"950a53990da97b950dd63a52ab82a989f41e17f46fdd49cf2d9fc83861a5bc30"} Oct 09 15:35:42 crc kubenswrapper[4719]: I1009 15:35:42.538965 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jk6nr" event={"ID":"a2205fae-acbe-4123-936d-ad78cd542565","Type":"ContainerStarted","Data":"dbee069014d65bc419765e74aa622ff8ad46a071f900e757d0c6e881d80a2f46"} Oct 09 15:35:42 crc kubenswrapper[4719]: I1009 15:35:42.540190 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c","Type":"ContainerStarted","Data":"72ccaabe8a643cbab7f52394e3f00e5797a72bcf8a1011a48c7798c4e7ae0ee6"} Oct 09 15:35:42 crc kubenswrapper[4719]: I1009 15:35:42.544377 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-574c54d6bf-d7655" event={"ID":"dbae14e7-c975-42ac-bad6-b5ad764a239b","Type":"ContainerStarted","Data":"cce4edabb2875d37201fab578ddc34b5432828ea0f2793e131b78bd6c8d8a05f"} Oct 09 15:35:42 crc kubenswrapper[4719]: I1009 15:35:42.545531 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-7457564986-k28cv" event={"ID":"d16f8bb5-9ca5-4042-ae67-756c79d00217","Type":"ContainerStarted","Data":"806e4e8aa65d1fa1409cea50489da243031e292e11d4828f43fa4665990bd000"} Oct 09 15:35:42 crc kubenswrapper[4719]: I1009 15:35:42.547484 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xs6f4" event={"ID":"52932375-ade4-4056-a4f8-6758db0df52f","Type":"ContainerDied","Data":"4ebff1e76ee1492af03d71cccd002188c39c10db18fe53076c0c8688abf75a5b"} Oct 09 15:35:42 crc kubenswrapper[4719]: I1009 15:35:42.547523 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ebff1e76ee1492af03d71cccd002188c39c10db18fe53076c0c8688abf75a5b" Oct 09 15:35:42 crc kubenswrapper[4719]: I1009 15:35:42.547596 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-xs6f4" Oct 09 15:35:42 crc kubenswrapper[4719]: I1009 15:35:42.557678 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-d2888" event={"ID":"19cf902a-77e9-4e57-89d0-36765e27f361","Type":"ContainerStarted","Data":"a78ab41081ba1e425676665ce30a819dc993b463bc2c9e718253b4f57da6b9c4"} Oct 09 15:35:42 crc kubenswrapper[4719]: I1009 15:35:42.564000 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-ztgbm" event={"ID":"e899b0de-03a2-44a5-a165-25c988e8489d","Type":"ContainerStarted","Data":"8e9fd52b958b37cc0d5f9bf89a297dd1eaa3e2a34c04dea47a43b7ff8ed5b5d3"} Oct 09 15:35:42 crc kubenswrapper[4719]: I1009 15:35:42.572402 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-d2888" podStartSLOduration=4.190280535 podStartE2EDuration="19.572381929s" podCreationTimestamp="2025-10-09 15:35:23 +0000 UTC" firstStartedPulling="2025-10-09 15:35:25.284198875 +0000 UTC m=+1030.793910150" lastFinishedPulling="2025-10-09 15:35:40.666300259 +0000 UTC m=+1046.176011544" 
observedRunningTime="2025-10-09 15:35:42.571333375 +0000 UTC m=+1048.081044660" watchObservedRunningTime="2025-10-09 15:35:42.572381929 +0000 UTC m=+1048.082093214" Oct 09 15:35:42 crc kubenswrapper[4719]: I1009 15:35:42.631806 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-cqxgt"] Oct 09 15:35:42 crc kubenswrapper[4719]: W1009 15:35:42.689980 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e76eab7_abf1_4d15_9f62_52aefceaf1cd.slice/crio-7943ecec6e6d6e6ec2464135e26281f1bbd157abcf49a195c63f08bd0e9d8586 WatchSource:0}: Error finding container 7943ecec6e6d6e6ec2464135e26281f1bbd157abcf49a195c63f08bd0e9d8586: Status 404 returned error can't find the container with id 7943ecec6e6d6e6ec2464135e26281f1bbd157abcf49a195c63f08bd0e9d8586 Oct 09 15:35:42 crc kubenswrapper[4719]: I1009 15:35:42.830496 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5789fb8fc7-6t67j"] Oct 09 15:35:42 crc kubenswrapper[4719]: I1009 15:35:42.896431 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d4b4465c7-xmnmh"] Oct 09 15:35:42 crc kubenswrapper[4719]: E1009 15:35:42.897120 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52932375-ade4-4056-a4f8-6758db0df52f" containerName="glance-db-sync" Oct 09 15:35:42 crc kubenswrapper[4719]: I1009 15:35:42.897139 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="52932375-ade4-4056-a4f8-6758db0df52f" containerName="glance-db-sync" Oct 09 15:35:42 crc kubenswrapper[4719]: I1009 15:35:42.897326 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="52932375-ade4-4056-a4f8-6758db0df52f" containerName="glance-db-sync" Oct 09 15:35:42 crc kubenswrapper[4719]: I1009 15:35:42.898409 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d4b4465c7-xmnmh" Oct 09 15:35:42 crc kubenswrapper[4719]: I1009 15:35:42.948577 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0-dns-swift-storage-0\") pod \"dnsmasq-dns-5d4b4465c7-xmnmh\" (UID: \"6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0\") " pod="openstack/dnsmasq-dns-5d4b4465c7-xmnmh" Oct 09 15:35:42 crc kubenswrapper[4719]: I1009 15:35:42.948655 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0-ovsdbserver-nb\") pod \"dnsmasq-dns-5d4b4465c7-xmnmh\" (UID: \"6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0\") " pod="openstack/dnsmasq-dns-5d4b4465c7-xmnmh" Oct 09 15:35:42 crc kubenswrapper[4719]: I1009 15:35:42.948674 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0-dns-svc\") pod \"dnsmasq-dns-5d4b4465c7-xmnmh\" (UID: \"6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0\") " pod="openstack/dnsmasq-dns-5d4b4465c7-xmnmh" Oct 09 15:35:42 crc kubenswrapper[4719]: I1009 15:35:42.948698 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0-config\") pod \"dnsmasq-dns-5d4b4465c7-xmnmh\" (UID: \"6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0\") " pod="openstack/dnsmasq-dns-5d4b4465c7-xmnmh" Oct 09 15:35:42 crc kubenswrapper[4719]: I1009 15:35:42.948719 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0-ovsdbserver-sb\") pod \"dnsmasq-dns-5d4b4465c7-xmnmh\" 
(UID: \"6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0\") " pod="openstack/dnsmasq-dns-5d4b4465c7-xmnmh" Oct 09 15:35:42 crc kubenswrapper[4719]: I1009 15:35:42.948740 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm9wb\" (UniqueName: \"kubernetes.io/projected/6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0-kube-api-access-vm9wb\") pod \"dnsmasq-dns-5d4b4465c7-xmnmh\" (UID: \"6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0\") " pod="openstack/dnsmasq-dns-5d4b4465c7-xmnmh" Oct 09 15:35:42 crc kubenswrapper[4719]: I1009 15:35:42.953033 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d4b4465c7-xmnmh"] Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.052072 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0-dns-swift-storage-0\") pod \"dnsmasq-dns-5d4b4465c7-xmnmh\" (UID: \"6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0\") " pod="openstack/dnsmasq-dns-5d4b4465c7-xmnmh" Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.052154 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0-ovsdbserver-nb\") pod \"dnsmasq-dns-5d4b4465c7-xmnmh\" (UID: \"6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0\") " pod="openstack/dnsmasq-dns-5d4b4465c7-xmnmh" Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.052174 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0-dns-svc\") pod \"dnsmasq-dns-5d4b4465c7-xmnmh\" (UID: \"6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0\") " pod="openstack/dnsmasq-dns-5d4b4465c7-xmnmh" Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.052190 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0-config\") pod \"dnsmasq-dns-5d4b4465c7-xmnmh\" (UID: \"6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0\") " pod="openstack/dnsmasq-dns-5d4b4465c7-xmnmh" Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.052214 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0-ovsdbserver-sb\") pod \"dnsmasq-dns-5d4b4465c7-xmnmh\" (UID: \"6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0\") " pod="openstack/dnsmasq-dns-5d4b4465c7-xmnmh" Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.052236 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm9wb\" (UniqueName: \"kubernetes.io/projected/6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0-kube-api-access-vm9wb\") pod \"dnsmasq-dns-5d4b4465c7-xmnmh\" (UID: \"6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0\") " pod="openstack/dnsmasq-dns-5d4b4465c7-xmnmh" Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.054157 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0-dns-svc\") pod \"dnsmasq-dns-5d4b4465c7-xmnmh\" (UID: \"6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0\") " pod="openstack/dnsmasq-dns-5d4b4465c7-xmnmh" Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.054807 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0-dns-swift-storage-0\") pod \"dnsmasq-dns-5d4b4465c7-xmnmh\" (UID: \"6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0\") " pod="openstack/dnsmasq-dns-5d4b4465c7-xmnmh" Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.055858 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0-ovsdbserver-nb\") pod \"dnsmasq-dns-5d4b4465c7-xmnmh\" (UID: \"6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0\") " pod="openstack/dnsmasq-dns-5d4b4465c7-xmnmh" Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.056655 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0-config\") pod \"dnsmasq-dns-5d4b4465c7-xmnmh\" (UID: \"6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0\") " pod="openstack/dnsmasq-dns-5d4b4465c7-xmnmh" Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.065663 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0-ovsdbserver-sb\") pod \"dnsmasq-dns-5d4b4465c7-xmnmh\" (UID: \"6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0\") " pod="openstack/dnsmasq-dns-5d4b4465c7-xmnmh" Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.086510 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm9wb\" (UniqueName: \"kubernetes.io/projected/6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0-kube-api-access-vm9wb\") pod \"dnsmasq-dns-5d4b4465c7-xmnmh\" (UID: \"6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0\") " pod="openstack/dnsmasq-dns-5d4b4465c7-xmnmh" Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.134966 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52932375-ade4-4056-a4f8-6758db0df52f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52932375-ade4-4056-a4f8-6758db0df52f" (UID: "52932375-ade4-4056-a4f8-6758db0df52f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.155268 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52932375-ade4-4056-a4f8-6758db0df52f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.205537 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52932375-ade4-4056-a4f8-6758db0df52f-config-data" (OuterVolumeSpecName: "config-data") pod "52932375-ade4-4056-a4f8-6758db0df52f" (UID: "52932375-ade4-4056-a4f8-6758db0df52f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.224980 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2deaf084-98a3-477d-b5ac-8933b175ef00" path="/var/lib/kubelet/pods/2deaf084-98a3-477d-b5ac-8933b175ef00/volumes" Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.225460 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4998573-c12b-4a90-a58b-3af8be611b96" path="/var/lib/kubelet/pods/f4998573-c12b-4a90-a58b-3af8be611b96/volumes" Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.256558 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52932375-ade4-4056-a4f8-6758db0df52f-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.265181 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d4b4465c7-xmnmh" Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.501320 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.573601 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.592246 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b66884797-vvjz5" event={"ID":"19fb157d-30e1-4432-9d62-0fa70d464148","Type":"ContainerStarted","Data":"0a1f9c31e8eae1dcfb066516d44c1b096c008dc82f57f5c126e5525a1576f68a"} Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.592508 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-b66884797-vvjz5" podUID="19fb157d-30e1-4432-9d62-0fa70d464148" containerName="horizon-log" containerID="cri-o://1105f4f36eebf376e3165896576deef6174007b5847a89c5cf52e43cc44a1749" gracePeriod=30 Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.592751 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-b66884797-vvjz5" podUID="19fb157d-30e1-4432-9d62-0fa70d464148" containerName="horizon" containerID="cri-o://0a1f9c31e8eae1dcfb066516d44c1b096c008dc82f57f5c126e5525a1576f68a" gracePeriod=30 Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.609311 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f5bc696cd-sgqb2" event={"ID":"a66fd9c2-b3cc-43db-b520-6972ce53871f","Type":"ContainerStarted","Data":"bbdcad133a05c8d708ade6406c54a0c366c7960cdaa1c7a1179c74978f0c78b5"} Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.627343 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cqxgt" 
event={"ID":"3e76eab7-abf1-4d15-9f62-52aefceaf1cd","Type":"ContainerStarted","Data":"7943ecec6e6d6e6ec2464135e26281f1bbd157abcf49a195c63f08bd0e9d8586"} Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.630583 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jk6nr" event={"ID":"a2205fae-acbe-4123-936d-ad78cd542565","Type":"ContainerStarted","Data":"d2dcc9015f72d339f90b0e01e06a39cf15ee638599453598e6e0077f826a49f9"} Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.633006 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c","Type":"ContainerStarted","Data":"c976c2032017464bb71a885b6a5fcdfa33f72571adc927ff50eb056178fd2665"} Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.635549 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-574c54d6bf-d7655" event={"ID":"dbae14e7-c975-42ac-bad6-b5ad764a239b","Type":"ContainerStarted","Data":"3931bfae713f63c29b202cd397f9e76dacc3296a74bf8e4cc3fef6c369b966f9"} Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.635661 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-574c54d6bf-d7655" podUID="dbae14e7-c975-42ac-bad6-b5ad764a239b" containerName="horizon-log" containerID="cri-o://cce4edabb2875d37201fab578ddc34b5432828ea0f2793e131b78bd6c8d8a05f" gracePeriod=30 Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.635771 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-574c54d6bf-d7655" podUID="dbae14e7-c975-42ac-bad6-b5ad764a239b" containerName="horizon" containerID="cri-o://3931bfae713f63c29b202cd397f9e76dacc3296a74bf8e4cc3fef6c369b966f9" gracePeriod=30 Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.643842 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7457564986-k28cv" 
event={"ID":"d16f8bb5-9ca5-4042-ae67-756c79d00217","Type":"ContainerStarted","Data":"b066db9dcb69b83cf18678779a349eae16c407271c43f6f1d333a2cd48b9227c"} Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.646528 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a","Type":"ContainerStarted","Data":"0bb9bc0ec575175f114ad0572c1c1f37cede57bc3aa752d335d6140570036942"} Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.646797 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5789fb8fc7-6t67j" podUID="0ea18a20-4f8e-460c-9625-2aa8333cf8f4" containerName="dnsmasq-dns" containerID="cri-o://c3420b29f4ae8cc02f43ccb61d99cc037e7b1d7ff80f91930f17fc72077bfbb1" gracePeriod=10 Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.647146 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.663765 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-b66884797-vvjz5" podStartSLOduration=5.174479555 podStartE2EDuration="20.66374666s" podCreationTimestamp="2025-10-09 15:35:23 +0000 UTC" firstStartedPulling="2025-10-09 15:35:25.210027421 +0000 UTC m=+1030.719738706" lastFinishedPulling="2025-10-09 15:35:40.699294526 +0000 UTC m=+1046.209005811" observedRunningTime="2025-10-09 15:35:43.662925005 +0000 UTC m=+1049.172636290" watchObservedRunningTime="2025-10-09 15:35:43.66374666 +0000 UTC m=+1049.173457945" Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.711789 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-b66884797-vvjz5" Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.713393 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-574c54d6bf-d7655" podStartSLOduration=5.394069719 
podStartE2EDuration="18.713381527s" podCreationTimestamp="2025-10-09 15:35:25 +0000 UTC" firstStartedPulling="2025-10-09 15:35:27.477578107 +0000 UTC m=+1032.987289392" lastFinishedPulling="2025-10-09 15:35:40.796889915 +0000 UTC m=+1046.306601200" observedRunningTime="2025-10-09 15:35:43.690723397 +0000 UTC m=+1049.200434682" watchObservedRunningTime="2025-10-09 15:35:43.713381527 +0000 UTC m=+1049.223092822" Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.726255 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-jk6nr" podStartSLOduration=14.726235334 podStartE2EDuration="14.726235334s" podCreationTimestamp="2025-10-09 15:35:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:35:43.711739784 +0000 UTC m=+1049.221451069" watchObservedRunningTime="2025-10-09 15:35:43.726235334 +0000 UTC m=+1049.235946629" Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.796573 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.796620 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.838203 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.843548 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.851115 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="95a97721-d5b6-401a-94ef-751878f64947" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.153:9322/\": dial tcp 10.217.0.153:9322: i/o timeout (Client.Timeout exceeded while awaiting headers)" Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.851756 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.852142 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-zsmzp" Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.852280 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.867588 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.949235 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.952948 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.973955 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 09 15:35:43 crc kubenswrapper[4719]: I1009 15:35:43.991284 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.010783 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08ea2228-1919-46f9-be38-4111d34aa3bc-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"08ea2228-1919-46f9-be38-4111d34aa3bc\") " pod="openstack/glance-default-external-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.010847 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08ea2228-1919-46f9-be38-4111d34aa3bc-config-data\") pod \"glance-default-external-api-0\" (UID: \"08ea2228-1919-46f9-be38-4111d34aa3bc\") " pod="openstack/glance-default-external-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.010904 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08ea2228-1919-46f9-be38-4111d34aa3bc-logs\") pod \"glance-default-external-api-0\" (UID: \"08ea2228-1919-46f9-be38-4111d34aa3bc\") " pod="openstack/glance-default-external-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.011039 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/08ea2228-1919-46f9-be38-4111d34aa3bc-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"08ea2228-1919-46f9-be38-4111d34aa3bc\") " 
pod="openstack/glance-default-external-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.011071 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg6l8\" (UniqueName: \"kubernetes.io/projected/08ea2228-1919-46f9-be38-4111d34aa3bc-kube-api-access-hg6l8\") pod \"glance-default-external-api-0\" (UID: \"08ea2228-1919-46f9-be38-4111d34aa3bc\") " pod="openstack/glance-default-external-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.011112 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"08ea2228-1919-46f9-be38-4111d34aa3bc\") " pod="openstack/glance-default-external-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.012578 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08ea2228-1919-46f9-be38-4111d34aa3bc-scripts\") pod \"glance-default-external-api-0\" (UID: \"08ea2228-1919-46f9-be38-4111d34aa3bc\") " pod="openstack/glance-default-external-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.050946 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.117715 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87773a88-c22e-421e-8d36-d1d48d484dd9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"87773a88-c22e-421e-8d36-d1d48d484dd9\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.117766 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/87773a88-c22e-421e-8d36-d1d48d484dd9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"87773a88-c22e-421e-8d36-d1d48d484dd9\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.117850 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87773a88-c22e-421e-8d36-d1d48d484dd9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"87773a88-c22e-421e-8d36-d1d48d484dd9\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.117887 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"87773a88-c22e-421e-8d36-d1d48d484dd9\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.117918 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/08ea2228-1919-46f9-be38-4111d34aa3bc-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"08ea2228-1919-46f9-be38-4111d34aa3bc\") " pod="openstack/glance-default-external-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.117940 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fwqx\" (UniqueName: \"kubernetes.io/projected/87773a88-c22e-421e-8d36-d1d48d484dd9-kube-api-access-6fwqx\") pod \"glance-default-internal-api-0\" (UID: \"87773a88-c22e-421e-8d36-d1d48d484dd9\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.117964 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg6l8\" 
(UniqueName: \"kubernetes.io/projected/08ea2228-1919-46f9-be38-4111d34aa3bc-kube-api-access-hg6l8\") pod \"glance-default-external-api-0\" (UID: \"08ea2228-1919-46f9-be38-4111d34aa3bc\") " pod="openstack/glance-default-external-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.117990 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"08ea2228-1919-46f9-be38-4111d34aa3bc\") " pod="openstack/glance-default-external-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.118036 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/87773a88-c22e-421e-8d36-d1d48d484dd9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"87773a88-c22e-421e-8d36-d1d48d484dd9\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.118081 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08ea2228-1919-46f9-be38-4111d34aa3bc-scripts\") pod \"glance-default-external-api-0\" (UID: \"08ea2228-1919-46f9-be38-4111d34aa3bc\") " pod="openstack/glance-default-external-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.118114 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08ea2228-1919-46f9-be38-4111d34aa3bc-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"08ea2228-1919-46f9-be38-4111d34aa3bc\") " pod="openstack/glance-default-external-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.118141 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/08ea2228-1919-46f9-be38-4111d34aa3bc-config-data\") pod \"glance-default-external-api-0\" (UID: \"08ea2228-1919-46f9-be38-4111d34aa3bc\") " pod="openstack/glance-default-external-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.118165 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87773a88-c22e-421e-8d36-d1d48d484dd9-logs\") pod \"glance-default-internal-api-0\" (UID: \"87773a88-c22e-421e-8d36-d1d48d484dd9\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.118203 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08ea2228-1919-46f9-be38-4111d34aa3bc-logs\") pod \"glance-default-external-api-0\" (UID: \"08ea2228-1919-46f9-be38-4111d34aa3bc\") " pod="openstack/glance-default-external-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.118758 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08ea2228-1919-46f9-be38-4111d34aa3bc-logs\") pod \"glance-default-external-api-0\" (UID: \"08ea2228-1919-46f9-be38-4111d34aa3bc\") " pod="openstack/glance-default-external-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.119326 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/08ea2228-1919-46f9-be38-4111d34aa3bc-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"08ea2228-1919-46f9-be38-4111d34aa3bc\") " pod="openstack/glance-default-external-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.127617 4719 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: 
\"08ea2228-1919-46f9-be38-4111d34aa3bc\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.135902 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08ea2228-1919-46f9-be38-4111d34aa3bc-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"08ea2228-1919-46f9-be38-4111d34aa3bc\") " pod="openstack/glance-default-external-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.137056 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08ea2228-1919-46f9-be38-4111d34aa3bc-config-data\") pod \"glance-default-external-api-0\" (UID: \"08ea2228-1919-46f9-be38-4111d34aa3bc\") " pod="openstack/glance-default-external-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.191129 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg6l8\" (UniqueName: \"kubernetes.io/projected/08ea2228-1919-46f9-be38-4111d34aa3bc-kube-api-access-hg6l8\") pod \"glance-default-external-api-0\" (UID: \"08ea2228-1919-46f9-be38-4111d34aa3bc\") " pod="openstack/glance-default-external-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.244093 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"87773a88-c22e-421e-8d36-d1d48d484dd9\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.244148 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fwqx\" (UniqueName: \"kubernetes.io/projected/87773a88-c22e-421e-8d36-d1d48d484dd9-kube-api-access-6fwqx\") pod \"glance-default-internal-api-0\" (UID: 
\"87773a88-c22e-421e-8d36-d1d48d484dd9\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.244199 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/87773a88-c22e-421e-8d36-d1d48d484dd9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"87773a88-c22e-421e-8d36-d1d48d484dd9\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.244265 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87773a88-c22e-421e-8d36-d1d48d484dd9-logs\") pod \"glance-default-internal-api-0\" (UID: \"87773a88-c22e-421e-8d36-d1d48d484dd9\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.244297 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87773a88-c22e-421e-8d36-d1d48d484dd9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"87773a88-c22e-421e-8d36-d1d48d484dd9\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.244316 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87773a88-c22e-421e-8d36-d1d48d484dd9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"87773a88-c22e-421e-8d36-d1d48d484dd9\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.244376 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87773a88-c22e-421e-8d36-d1d48d484dd9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"87773a88-c22e-421e-8d36-d1d48d484dd9\") " pod="openstack/glance-default-internal-api-0" Oct 
09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.244558 4719 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"87773a88-c22e-421e-8d36-d1d48d484dd9\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.244800 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87773a88-c22e-421e-8d36-d1d48d484dd9-logs\") pod \"glance-default-internal-api-0\" (UID: \"87773a88-c22e-421e-8d36-d1d48d484dd9\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.245416 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/87773a88-c22e-421e-8d36-d1d48d484dd9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"87773a88-c22e-421e-8d36-d1d48d484dd9\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.258419 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87773a88-c22e-421e-8d36-d1d48d484dd9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"87773a88-c22e-421e-8d36-d1d48d484dd9\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.258472 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.266239 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fwqx\" (UniqueName: \"kubernetes.io/projected/87773a88-c22e-421e-8d36-d1d48d484dd9-kube-api-access-6fwqx\") pod \"glance-default-internal-api-0\" (UID: 
\"87773a88-c22e-421e-8d36-d1d48d484dd9\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.275787 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d4b4465c7-xmnmh"] Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.329919 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08ea2228-1919-46f9-be38-4111d34aa3bc-scripts\") pod \"glance-default-external-api-0\" (UID: \"08ea2228-1919-46f9-be38-4111d34aa3bc\") " pod="openstack/glance-default-external-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.330391 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87773a88-c22e-421e-8d36-d1d48d484dd9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"87773a88-c22e-421e-8d36-d1d48d484dd9\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.362894 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87773a88-c22e-421e-8d36-d1d48d484dd9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"87773a88-c22e-421e-8d36-d1d48d484dd9\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.614063 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"08ea2228-1919-46f9-be38-4111d34aa3bc\") " pod="openstack/glance-default-external-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.617199 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.624472 4719 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"87773a88-c22e-421e-8d36-d1d48d484dd9\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.674907 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.688489 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f5bc696cd-sgqb2" event={"ID":"a66fd9c2-b3cc-43db-b520-6972ce53871f","Type":"ContainerStarted","Data":"3c3616b45c4c48afe9f8aee424859da1333a55b2fdbd472317c6d5274ad84821"} Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.701537 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cqxgt" event={"ID":"3e76eab7-abf1-4d15-9f62-52aefceaf1cd","Type":"ContainerStarted","Data":"63237fb476ce410d4fe48e51b67339fa6dc864befe7f8f952a555fafc005e3f4"} Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.720837 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6f5bc696cd-sgqb2" podStartSLOduration=12.720817423 podStartE2EDuration="12.720817423s" podCreationTimestamp="2025-10-09 15:35:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:35:44.720460562 +0000 UTC m=+1050.230171877" watchObservedRunningTime="2025-10-09 15:35:44.720817423 +0000 UTC m=+1050.230528708" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.727207 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d4b4465c7-xmnmh" event={"ID":"6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0","Type":"ContainerStarted","Data":"e183ff5ab72e1be831938d030ccd662c2692774bec76608f2bc7a1f81801ffab"} Oct 09 15:35:44 
crc kubenswrapper[4719]: I1009 15:35:44.729744 4719 generic.go:334] "Generic (PLEG): container finished" podID="0ea18a20-4f8e-460c-9625-2aa8333cf8f4" containerID="c3420b29f4ae8cc02f43ccb61d99cc037e7b1d7ff80f91930f17fc72077bfbb1" exitCode=0 Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.729802 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5789fb8fc7-6t67j" event={"ID":"0ea18a20-4f8e-460c-9625-2aa8333cf8f4","Type":"ContainerDied","Data":"c3420b29f4ae8cc02f43ccb61d99cc037e7b1d7ff80f91930f17fc72077bfbb1"} Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.729824 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5789fb8fc7-6t67j" event={"ID":"0ea18a20-4f8e-460c-9625-2aa8333cf8f4","Type":"ContainerDied","Data":"33699d94693f6b3ff18f373d39bdeb8f9dd064635c60dff0f5a66d7540c8a03f"} Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.729839 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33699d94693f6b3ff18f373d39bdeb8f9dd064635c60dff0f5a66d7540c8a03f" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.756514 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-cqxgt" podStartSLOduration=3.756493036 podStartE2EDuration="3.756493036s" podCreationTimestamp="2025-10-09 15:35:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:35:44.738422392 +0000 UTC m=+1050.248133707" watchObservedRunningTime="2025-10-09 15:35:44.756493036 +0000 UTC m=+1050.266204321" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.774786 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.785258 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.786748 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.163:9322/\": dial tcp 10.217.0.163:9322: connect: connection refused" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.808650 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5789fb8fc7-6t67j" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.814614 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7457564986-k28cv" event={"ID":"d16f8bb5-9ca5-4042-ae67-756c79d00217","Type":"ContainerStarted","Data":"10745d1f988a6d6acb4c497cc78d9b04e38ddb779ac8f9940910e552683c668d"} Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.833380 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=4.833338255 podStartE2EDuration="4.833338255s" podCreationTimestamp="2025-10-09 15:35:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:35:44.824900697 +0000 UTC m=+1050.334611992" watchObservedRunningTime="2025-10-09 15:35:44.833338255 +0000 UTC m=+1050.343049540" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.862720 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7457564986-k28cv" podStartSLOduration=12.862693568 podStartE2EDuration="12.862693568s" podCreationTimestamp="2025-10-09 15:35:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:35:44.849681925 +0000 UTC m=+1050.359393210" watchObservedRunningTime="2025-10-09 15:35:44.862693568 +0000 UTC m=+1050.372404873" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.890479 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.970825 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ea18a20-4f8e-460c-9625-2aa8333cf8f4-dns-svc\") pod \"0ea18a20-4f8e-460c-9625-2aa8333cf8f4\" (UID: \"0ea18a20-4f8e-460c-9625-2aa8333cf8f4\") " Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.970885 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ea18a20-4f8e-460c-9625-2aa8333cf8f4-config\") pod \"0ea18a20-4f8e-460c-9625-2aa8333cf8f4\" (UID: \"0ea18a20-4f8e-460c-9625-2aa8333cf8f4\") " Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.970946 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4v48\" (UniqueName: \"kubernetes.io/projected/0ea18a20-4f8e-460c-9625-2aa8333cf8f4-kube-api-access-s4v48\") pod \"0ea18a20-4f8e-460c-9625-2aa8333cf8f4\" (UID: \"0ea18a20-4f8e-460c-9625-2aa8333cf8f4\") " Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.971018 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ea18a20-4f8e-460c-9625-2aa8333cf8f4-ovsdbserver-nb\") pod \"0ea18a20-4f8e-460c-9625-2aa8333cf8f4\" (UID: \"0ea18a20-4f8e-460c-9625-2aa8333cf8f4\") " Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.971079 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/0ea18a20-4f8e-460c-9625-2aa8333cf8f4-dns-swift-storage-0\") pod \"0ea18a20-4f8e-460c-9625-2aa8333cf8f4\" (UID: \"0ea18a20-4f8e-460c-9625-2aa8333cf8f4\") " Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.971147 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ea18a20-4f8e-460c-9625-2aa8333cf8f4-ovsdbserver-sb\") pod \"0ea18a20-4f8e-460c-9625-2aa8333cf8f4\" (UID: \"0ea18a20-4f8e-460c-9625-2aa8333cf8f4\") " Oct 09 15:35:44 crc kubenswrapper[4719]: I1009 15:35:44.982280 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ea18a20-4f8e-460c-9625-2aa8333cf8f4-kube-api-access-s4v48" (OuterVolumeSpecName: "kube-api-access-s4v48") pod "0ea18a20-4f8e-460c-9625-2aa8333cf8f4" (UID: "0ea18a20-4f8e-460c-9625-2aa8333cf8f4"). InnerVolumeSpecName "kube-api-access-s4v48". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:35:45 crc kubenswrapper[4719]: I1009 15:35:45.038890 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Oct 09 15:35:45 crc kubenswrapper[4719]: I1009 15:35:45.063990 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ea18a20-4f8e-460c-9625-2aa8333cf8f4-config" (OuterVolumeSpecName: "config") pod "0ea18a20-4f8e-460c-9625-2aa8333cf8f4" (UID: "0ea18a20-4f8e-460c-9625-2aa8333cf8f4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:35:45 crc kubenswrapper[4719]: I1009 15:35:45.074849 4719 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ea18a20-4f8e-460c-9625-2aa8333cf8f4-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:45 crc kubenswrapper[4719]: I1009 15:35:45.074883 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4v48\" (UniqueName: \"kubernetes.io/projected/0ea18a20-4f8e-460c-9625-2aa8333cf8f4-kube-api-access-s4v48\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:45 crc kubenswrapper[4719]: I1009 15:35:45.091760 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ea18a20-4f8e-460c-9625-2aa8333cf8f4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0ea18a20-4f8e-460c-9625-2aa8333cf8f4" (UID: "0ea18a20-4f8e-460c-9625-2aa8333cf8f4"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:35:45 crc kubenswrapper[4719]: I1009 15:35:45.093900 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ea18a20-4f8e-460c-9625-2aa8333cf8f4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0ea18a20-4f8e-460c-9625-2aa8333cf8f4" (UID: "0ea18a20-4f8e-460c-9625-2aa8333cf8f4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:35:45 crc kubenswrapper[4719]: I1009 15:35:45.108876 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ea18a20-4f8e-460c-9625-2aa8333cf8f4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0ea18a20-4f8e-460c-9625-2aa8333cf8f4" (UID: "0ea18a20-4f8e-460c-9625-2aa8333cf8f4"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:35:45 crc kubenswrapper[4719]: I1009 15:35:45.122623 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ea18a20-4f8e-460c-9625-2aa8333cf8f4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0ea18a20-4f8e-460c-9625-2aa8333cf8f4" (UID: "0ea18a20-4f8e-460c-9625-2aa8333cf8f4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:35:45 crc kubenswrapper[4719]: I1009 15:35:45.176113 4719 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ea18a20-4f8e-460c-9625-2aa8333cf8f4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:45 crc kubenswrapper[4719]: I1009 15:35:45.176148 4719 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ea18a20-4f8e-460c-9625-2aa8333cf8f4-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:45 crc kubenswrapper[4719]: I1009 15:35:45.176157 4719 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ea18a20-4f8e-460c-9625-2aa8333cf8f4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:45 crc kubenswrapper[4719]: I1009 15:35:45.176166 4719 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0ea18a20-4f8e-460c-9625-2aa8333cf8f4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:45 crc kubenswrapper[4719]: I1009 15:35:45.834492 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c","Type":"ContainerStarted","Data":"c231706ec5906c45822ce5a46fdfb1762c323ce3890c77c104bd0cd1c23f4758"} Oct 09 15:35:45 crc kubenswrapper[4719]: I1009 15:35:45.834873 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5789fb8fc7-6t67j" Oct 09 15:35:45 crc kubenswrapper[4719]: I1009 15:35:45.835958 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="8a94c6c0-6e99-4a00-bdde-fe1e7927af5b" containerName="watcher-decision-engine" containerID="cri-o://496846d3595546481753a9c36a3030f6a40d619da32e06e9365b318c4066277d" gracePeriod=30 Oct 09 15:35:45 crc kubenswrapper[4719]: I1009 15:35:45.871409 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5789fb8fc7-6t67j"] Oct 09 15:35:45 crc kubenswrapper[4719]: I1009 15:35:45.888081 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 09 15:35:45 crc kubenswrapper[4719]: I1009 15:35:45.895250 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5789fb8fc7-6t67j"] Oct 09 15:35:46 crc kubenswrapper[4719]: I1009 15:35:46.028807 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-574c54d6bf-d7655" Oct 09 15:35:46 crc kubenswrapper[4719]: I1009 15:35:46.188310 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 15:35:46 crc kubenswrapper[4719]: I1009 15:35:46.306149 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 15:35:46 crc kubenswrapper[4719]: I1009 15:35:46.855815 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-applier-0" podUID="938e3d88-39e3-4f8a-8920-0fdfcf98d5e5" containerName="watcher-applier" containerID="cri-o://556eab4d055c4943d1b366e6789b5af6178230f8f679f669e5fb1b5b4592c85b" gracePeriod=30 Oct 09 15:35:47 crc kubenswrapper[4719]: I1009 15:35:47.174508 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ea18a20-4f8e-460c-9625-2aa8333cf8f4" 
path="/var/lib/kubelet/pods/0ea18a20-4f8e-460c-9625-2aa8333cf8f4/volumes" Oct 09 15:35:47 crc kubenswrapper[4719]: I1009 15:35:47.869468 4719 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 15:35:48 crc kubenswrapper[4719]: E1009 15:35:48.797513 4719 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 556eab4d055c4943d1b366e6789b5af6178230f8f679f669e5fb1b5b4592c85b is running failed: container process not found" containerID="556eab4d055c4943d1b366e6789b5af6178230f8f679f669e5fb1b5b4592c85b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Oct 09 15:35:48 crc kubenswrapper[4719]: E1009 15:35:48.797831 4719 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 556eab4d055c4943d1b366e6789b5af6178230f8f679f669e5fb1b5b4592c85b is running failed: container process not found" containerID="556eab4d055c4943d1b366e6789b5af6178230f8f679f669e5fb1b5b4592c85b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Oct 09 15:35:48 crc kubenswrapper[4719]: E1009 15:35:48.798226 4719 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 556eab4d055c4943d1b366e6789b5af6178230f8f679f669e5fb1b5b4592c85b is running failed: container process not found" containerID="556eab4d055c4943d1b366e6789b5af6178230f8f679f669e5fb1b5b4592c85b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Oct 09 15:35:48 crc kubenswrapper[4719]: E1009 15:35:48.798312 4719 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 556eab4d055c4943d1b366e6789b5af6178230f8f679f669e5fb1b5b4592c85b is running failed: container process not found" probeType="Readiness" pod="openstack/watcher-applier-0" 
podUID="938e3d88-39e3-4f8a-8920-0fdfcf98d5e5" containerName="watcher-applier" Oct 09 15:35:48 crc kubenswrapper[4719]: I1009 15:35:48.940091 4719 generic.go:334] "Generic (PLEG): container finished" podID="938e3d88-39e3-4f8a-8920-0fdfcf98d5e5" containerID="556eab4d055c4943d1b366e6789b5af6178230f8f679f669e5fb1b5b4592c85b" exitCode=0 Oct 09 15:35:48 crc kubenswrapper[4719]: I1009 15:35:48.940195 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"938e3d88-39e3-4f8a-8920-0fdfcf98d5e5","Type":"ContainerDied","Data":"556eab4d055c4943d1b366e6789b5af6178230f8f679f669e5fb1b5b4592c85b"} Oct 09 15:35:48 crc kubenswrapper[4719]: I1009 15:35:48.946087 4719 generic.go:334] "Generic (PLEG): container finished" podID="8a94c6c0-6e99-4a00-bdde-fe1e7927af5b" containerID="496846d3595546481753a9c36a3030f6a40d619da32e06e9365b318c4066277d" exitCode=1 Oct 09 15:35:48 crc kubenswrapper[4719]: I1009 15:35:48.946136 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"8a94c6c0-6e99-4a00-bdde-fe1e7927af5b","Type":"ContainerDied","Data":"496846d3595546481753a9c36a3030f6a40d619da32e06e9365b318c4066277d"} Oct 09 15:35:48 crc kubenswrapper[4719]: I1009 15:35:48.948377 4719 generic.go:334] "Generic (PLEG): container finished" podID="19cf902a-77e9-4e57-89d0-36765e27f361" containerID="a78ab41081ba1e425676665ce30a819dc993b463bc2c9e718253b4f57da6b9c4" exitCode=0 Oct 09 15:35:48 crc kubenswrapper[4719]: I1009 15:35:48.948629 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-d2888" event={"ID":"19cf902a-77e9-4e57-89d0-36765e27f361","Type":"ContainerDied","Data":"a78ab41081ba1e425676665ce30a819dc993b463bc2c9e718253b4f57da6b9c4"} Oct 09 15:35:48 crc kubenswrapper[4719]: I1009 15:35:48.957214 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Oct 09 15:35:50 crc kubenswrapper[4719]: I1009 15:35:50.888583 4719 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Oct 09 15:35:50 crc kubenswrapper[4719]: I1009 15:35:50.897117 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Oct 09 15:35:50 crc kubenswrapper[4719]: I1009 15:35:50.983058 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Oct 09 15:35:52 crc kubenswrapper[4719]: I1009 15:35:52.382451 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7457564986-k28cv" Oct 09 15:35:52 crc kubenswrapper[4719]: I1009 15:35:52.382773 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7457564986-k28cv" Oct 09 15:35:52 crc kubenswrapper[4719]: I1009 15:35:52.500450 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6f5bc696cd-sgqb2" Oct 09 15:35:52 crc kubenswrapper[4719]: I1009 15:35:52.500652 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6f5bc696cd-sgqb2" Oct 09 15:35:53 crc kubenswrapper[4719]: I1009 15:35:53.010091 4719 generic.go:334] "Generic (PLEG): container finished" podID="3e76eab7-abf1-4d15-9f62-52aefceaf1cd" containerID="63237fb476ce410d4fe48e51b67339fa6dc864befe7f8f952a555fafc005e3f4" exitCode=0 Oct 09 15:35:53 crc kubenswrapper[4719]: I1009 15:35:53.010162 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cqxgt" event={"ID":"3e76eab7-abf1-4d15-9f62-52aefceaf1cd","Type":"ContainerDied","Data":"63237fb476ce410d4fe48e51b67339fa6dc864befe7f8f952a555fafc005e3f4"} Oct 09 15:35:53 crc kubenswrapper[4719]: I1009 15:35:53.012922 4719 generic.go:334] "Generic (PLEG): container finished" podID="6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0" containerID="83a913a45cd58f2dd977bd09b3e7ae5d581cb6a2864604e7f86296b9f951b410" exitCode=0 Oct 09 15:35:53 crc 
kubenswrapper[4719]: I1009 15:35:53.012986 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d4b4465c7-xmnmh" event={"ID":"6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0","Type":"ContainerDied","Data":"83a913a45cd58f2dd977bd09b3e7ae5d581cb6a2864604e7f86296b9f951b410"} Oct 09 15:35:53 crc kubenswrapper[4719]: E1009 15:35:53.504526 4719 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 496846d3595546481753a9c36a3030f6a40d619da32e06e9365b318c4066277d is running failed: container process not found" containerID="496846d3595546481753a9c36a3030f6a40d619da32e06e9365b318c4066277d" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Oct 09 15:35:53 crc kubenswrapper[4719]: E1009 15:35:53.508464 4719 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 496846d3595546481753a9c36a3030f6a40d619da32e06e9365b318c4066277d is running failed: container process not found" containerID="496846d3595546481753a9c36a3030f6a40d619da32e06e9365b318c4066277d" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Oct 09 15:35:53 crc kubenswrapper[4719]: E1009 15:35:53.508930 4719 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 496846d3595546481753a9c36a3030f6a40d619da32e06e9365b318c4066277d is running failed: container process not found" containerID="496846d3595546481753a9c36a3030f6a40d619da32e06e9365b318c4066277d" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Oct 09 15:35:53 crc kubenswrapper[4719]: E1009 15:35:53.508988 4719 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 496846d3595546481753a9c36a3030f6a40d619da32e06e9365b318c4066277d is running failed: 
container process not found" probeType="Readiness" pod="openstack/watcher-decision-engine-0" podUID="8a94c6c0-6e99-4a00-bdde-fe1e7927af5b" containerName="watcher-decision-engine" Oct 09 15:35:53 crc kubenswrapper[4719]: E1009 15:35:53.799305 4719 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 556eab4d055c4943d1b366e6789b5af6178230f8f679f669e5fb1b5b4592c85b is running failed: container process not found" containerID="556eab4d055c4943d1b366e6789b5af6178230f8f679f669e5fb1b5b4592c85b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Oct 09 15:35:53 crc kubenswrapper[4719]: E1009 15:35:53.802254 4719 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 556eab4d055c4943d1b366e6789b5af6178230f8f679f669e5fb1b5b4592c85b is running failed: container process not found" containerID="556eab4d055c4943d1b366e6789b5af6178230f8f679f669e5fb1b5b4592c85b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Oct 09 15:35:53 crc kubenswrapper[4719]: E1009 15:35:53.805503 4719 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 556eab4d055c4943d1b366e6789b5af6178230f8f679f669e5fb1b5b4592c85b is running failed: container process not found" containerID="556eab4d055c4943d1b366e6789b5af6178230f8f679f669e5fb1b5b4592c85b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Oct 09 15:35:53 crc kubenswrapper[4719]: E1009 15:35:53.805584 4719 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 556eab4d055c4943d1b366e6789b5af6178230f8f679f669e5fb1b5b4592c85b is running failed: container process not found" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="938e3d88-39e3-4f8a-8920-0fdfcf98d5e5" 
containerName="watcher-applier" Oct 09 15:35:54 crc kubenswrapper[4719]: I1009 15:35:54.202717 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Oct 09 15:35:54 crc kubenswrapper[4719]: I1009 15:35:54.203413 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c" containerName="watcher-api-log" containerID="cri-o://c976c2032017464bb71a885b6a5fcdfa33f72571adc927ff50eb056178fd2665" gracePeriod=30 Oct 09 15:35:54 crc kubenswrapper[4719]: I1009 15:35:54.203559 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c" containerName="watcher-api" containerID="cri-o://c231706ec5906c45822ce5a46fdfb1762c323ce3890c77c104bd0cd1c23f4758" gracePeriod=30 Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.052721 4719 generic.go:334] "Generic (PLEG): container finished" podID="9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c" containerID="c976c2032017464bb71a885b6a5fcdfa33f72571adc927ff50eb056178fd2665" exitCode=143 Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.052767 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c","Type":"ContainerDied","Data":"c976c2032017464bb71a885b6a5fcdfa33f72571adc927ff50eb056178fd2665"} Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.260002 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-d2888" Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.277145 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.297149 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-cqxgt" Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.427985 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19cf902a-77e9-4e57-89d0-36765e27f361-scripts\") pod \"19cf902a-77e9-4e57-89d0-36765e27f361\" (UID: \"19cf902a-77e9-4e57-89d0-36765e27f361\") " Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.428045 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e76eab7-abf1-4d15-9f62-52aefceaf1cd-scripts\") pod \"3e76eab7-abf1-4d15-9f62-52aefceaf1cd\" (UID: \"3e76eab7-abf1-4d15-9f62-52aefceaf1cd\") " Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.428086 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3e76eab7-abf1-4d15-9f62-52aefceaf1cd-fernet-keys\") pod \"3e76eab7-abf1-4d15-9f62-52aefceaf1cd\" (UID: \"3e76eab7-abf1-4d15-9f62-52aefceaf1cd\") " Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.428145 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bgzt\" (UniqueName: \"kubernetes.io/projected/19cf902a-77e9-4e57-89d0-36765e27f361-kube-api-access-5bgzt\") pod \"19cf902a-77e9-4e57-89d0-36765e27f361\" (UID: \"19cf902a-77e9-4e57-89d0-36765e27f361\") " Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.428190 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25lrr\" (UniqueName: \"kubernetes.io/projected/8a94c6c0-6e99-4a00-bdde-fe1e7927af5b-kube-api-access-25lrr\") pod \"8a94c6c0-6e99-4a00-bdde-fe1e7927af5b\" (UID: \"8a94c6c0-6e99-4a00-bdde-fe1e7927af5b\") " Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.428217 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/19cf902a-77e9-4e57-89d0-36765e27f361-combined-ca-bundle\") pod \"19cf902a-77e9-4e57-89d0-36765e27f361\" (UID: \"19cf902a-77e9-4e57-89d0-36765e27f361\") " Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.428242 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3e76eab7-abf1-4d15-9f62-52aefceaf1cd-credential-keys\") pod \"3e76eab7-abf1-4d15-9f62-52aefceaf1cd\" (UID: \"3e76eab7-abf1-4d15-9f62-52aefceaf1cd\") " Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.428258 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a94c6c0-6e99-4a00-bdde-fe1e7927af5b-config-data\") pod \"8a94c6c0-6e99-4a00-bdde-fe1e7927af5b\" (UID: \"8a94c6c0-6e99-4a00-bdde-fe1e7927af5b\") " Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.428276 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e76eab7-abf1-4d15-9f62-52aefceaf1cd-combined-ca-bundle\") pod \"3e76eab7-abf1-4d15-9f62-52aefceaf1cd\" (UID: \"3e76eab7-abf1-4d15-9f62-52aefceaf1cd\") " Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.428315 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8a94c6c0-6e99-4a00-bdde-fe1e7927af5b-custom-prometheus-ca\") pod \"8a94c6c0-6e99-4a00-bdde-fe1e7927af5b\" (UID: \"8a94c6c0-6e99-4a00-bdde-fe1e7927af5b\") " Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.428335 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a94c6c0-6e99-4a00-bdde-fe1e7927af5b-combined-ca-bundle\") pod \"8a94c6c0-6e99-4a00-bdde-fe1e7927af5b\" (UID: \"8a94c6c0-6e99-4a00-bdde-fe1e7927af5b\") " Oct 09 15:35:55 crc 
kubenswrapper[4719]: I1009 15:35:55.428478 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19cf902a-77e9-4e57-89d0-36765e27f361-config-data\") pod \"19cf902a-77e9-4e57-89d0-36765e27f361\" (UID: \"19cf902a-77e9-4e57-89d0-36765e27f361\") " Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.428532 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e76eab7-abf1-4d15-9f62-52aefceaf1cd-config-data\") pod \"3e76eab7-abf1-4d15-9f62-52aefceaf1cd\" (UID: \"3e76eab7-abf1-4d15-9f62-52aefceaf1cd\") " Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.428575 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a94c6c0-6e99-4a00-bdde-fe1e7927af5b-logs\") pod \"8a94c6c0-6e99-4a00-bdde-fe1e7927af5b\" (UID: \"8a94c6c0-6e99-4a00-bdde-fe1e7927af5b\") " Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.428603 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19cf902a-77e9-4e57-89d0-36765e27f361-logs\") pod \"19cf902a-77e9-4e57-89d0-36765e27f361\" (UID: \"19cf902a-77e9-4e57-89d0-36765e27f361\") " Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.428704 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vwxp\" (UniqueName: \"kubernetes.io/projected/3e76eab7-abf1-4d15-9f62-52aefceaf1cd-kube-api-access-6vwxp\") pod \"3e76eab7-abf1-4d15-9f62-52aefceaf1cd\" (UID: \"3e76eab7-abf1-4d15-9f62-52aefceaf1cd\") " Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.434299 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a94c6c0-6e99-4a00-bdde-fe1e7927af5b-logs" (OuterVolumeSpecName: "logs") pod "8a94c6c0-6e99-4a00-bdde-fe1e7927af5b" (UID: 
"8a94c6c0-6e99-4a00-bdde-fe1e7927af5b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.434616 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19cf902a-77e9-4e57-89d0-36765e27f361-logs" (OuterVolumeSpecName: "logs") pod "19cf902a-77e9-4e57-89d0-36765e27f361" (UID: "19cf902a-77e9-4e57-89d0-36765e27f361"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.440946 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e76eab7-abf1-4d15-9f62-52aefceaf1cd-kube-api-access-6vwxp" (OuterVolumeSpecName: "kube-api-access-6vwxp") pod "3e76eab7-abf1-4d15-9f62-52aefceaf1cd" (UID: "3e76eab7-abf1-4d15-9f62-52aefceaf1cd"). InnerVolumeSpecName "kube-api-access-6vwxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.444549 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19cf902a-77e9-4e57-89d0-36765e27f361-scripts" (OuterVolumeSpecName: "scripts") pod "19cf902a-77e9-4e57-89d0-36765e27f361" (UID: "19cf902a-77e9-4e57-89d0-36765e27f361"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.447150 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e76eab7-abf1-4d15-9f62-52aefceaf1cd-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3e76eab7-abf1-4d15-9f62-52aefceaf1cd" (UID: "3e76eab7-abf1-4d15-9f62-52aefceaf1cd"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.447226 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e76eab7-abf1-4d15-9f62-52aefceaf1cd-scripts" (OuterVolumeSpecName: "scripts") pod "3e76eab7-abf1-4d15-9f62-52aefceaf1cd" (UID: "3e76eab7-abf1-4d15-9f62-52aefceaf1cd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.452143 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a94c6c0-6e99-4a00-bdde-fe1e7927af5b-kube-api-access-25lrr" (OuterVolumeSpecName: "kube-api-access-25lrr") pod "8a94c6c0-6e99-4a00-bdde-fe1e7927af5b" (UID: "8a94c6c0-6e99-4a00-bdde-fe1e7927af5b"). InnerVolumeSpecName "kube-api-access-25lrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.468631 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19cf902a-77e9-4e57-89d0-36765e27f361-kube-api-access-5bgzt" (OuterVolumeSpecName: "kube-api-access-5bgzt") pod "19cf902a-77e9-4e57-89d0-36765e27f361" (UID: "19cf902a-77e9-4e57-89d0-36765e27f361"). InnerVolumeSpecName "kube-api-access-5bgzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.485531 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e76eab7-abf1-4d15-9f62-52aefceaf1cd-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "3e76eab7-abf1-4d15-9f62-52aefceaf1cd" (UID: "3e76eab7-abf1-4d15-9f62-52aefceaf1cd"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.501785 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e76eab7-abf1-4d15-9f62-52aefceaf1cd-config-data" (OuterVolumeSpecName: "config-data") pod "3e76eab7-abf1-4d15-9f62-52aefceaf1cd" (UID: "3e76eab7-abf1-4d15-9f62-52aefceaf1cd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.513496 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a94c6c0-6e99-4a00-bdde-fe1e7927af5b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a94c6c0-6e99-4a00-bdde-fe1e7927af5b" (UID: "8a94c6c0-6e99-4a00-bdde-fe1e7927af5b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.513512 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e76eab7-abf1-4d15-9f62-52aefceaf1cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e76eab7-abf1-4d15-9f62-52aefceaf1cd" (UID: "3e76eab7-abf1-4d15-9f62-52aefceaf1cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.513576 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19cf902a-77e9-4e57-89d0-36765e27f361-config-data" (OuterVolumeSpecName: "config-data") pod "19cf902a-77e9-4e57-89d0-36765e27f361" (UID: "19cf902a-77e9-4e57-89d0-36765e27f361"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.531498 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bgzt\" (UniqueName: \"kubernetes.io/projected/19cf902a-77e9-4e57-89d0-36765e27f361-kube-api-access-5bgzt\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.531807 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25lrr\" (UniqueName: \"kubernetes.io/projected/8a94c6c0-6e99-4a00-bdde-fe1e7927af5b-kube-api-access-25lrr\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.531940 4719 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3e76eab7-abf1-4d15-9f62-52aefceaf1cd-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.532055 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e76eab7-abf1-4d15-9f62-52aefceaf1cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.532194 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a94c6c0-6e99-4a00-bdde-fe1e7927af5b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.532521 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19cf902a-77e9-4e57-89d0-36765e27f361-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.532794 4719 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a94c6c0-6e99-4a00-bdde-fe1e7927af5b-logs\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.532869 
4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e76eab7-abf1-4d15-9f62-52aefceaf1cd-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.532937 4719 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19cf902a-77e9-4e57-89d0-36765e27f361-logs\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.533073 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vwxp\" (UniqueName: \"kubernetes.io/projected/3e76eab7-abf1-4d15-9f62-52aefceaf1cd-kube-api-access-6vwxp\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.533191 4719 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19cf902a-77e9-4e57-89d0-36765e27f361-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.533271 4719 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e76eab7-abf1-4d15-9f62-52aefceaf1cd-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.533433 4719 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3e76eab7-abf1-4d15-9f62-52aefceaf1cd-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.531891 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19cf902a-77e9-4e57-89d0-36765e27f361-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19cf902a-77e9-4e57-89d0-36765e27f361" (UID: "19cf902a-77e9-4e57-89d0-36765e27f361"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.556514 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a94c6c0-6e99-4a00-bdde-fe1e7927af5b-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "8a94c6c0-6e99-4a00-bdde-fe1e7927af5b" (UID: "8a94c6c0-6e99-4a00-bdde-fe1e7927af5b"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.584506 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a94c6c0-6e99-4a00-bdde-fe1e7927af5b-config-data" (OuterVolumeSpecName: "config-data") pod "8a94c6c0-6e99-4a00-bdde-fe1e7927af5b" (UID: "8a94c6c0-6e99-4a00-bdde-fe1e7927af5b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.638035 4719 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8a94c6c0-6e99-4a00-bdde-fe1e7927af5b-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.638067 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19cf902a-77e9-4e57-89d0-36765e27f361-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.638078 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a94c6c0-6e99-4a00-bdde-fe1e7927af5b-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.887919 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c" containerName="watcher-api" probeResult="failure" 
output="Get \"http://10.217.0.163:9322/\": dial tcp 10.217.0.163:9322: connect: connection refused" Oct 09 15:35:55 crc kubenswrapper[4719]: I1009 15:35:55.887922 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.163:9322/\": dial tcp 10.217.0.163:9322: connect: connection refused" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.066097 4719 generic.go:334] "Generic (PLEG): container finished" podID="9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c" containerID="c231706ec5906c45822ce5a46fdfb1762c323ce3890c77c104bd0cd1c23f4758" exitCode=0 Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.066178 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c","Type":"ContainerDied","Data":"c231706ec5906c45822ce5a46fdfb1762c323ce3890c77c104bd0cd1c23f4758"} Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.069544 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.069684 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"8a94c6c0-6e99-4a00-bdde-fe1e7927af5b","Type":"ContainerDied","Data":"a725c1af11bf08b8d2cab41271fa2cca695c3bd9293f9cf4dc989a771f25c4b2"} Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.069741 4719 scope.go:117] "RemoveContainer" containerID="496846d3595546481753a9c36a3030f6a40d619da32e06e9365b318c4066277d" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.075067 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-d2888" event={"ID":"19cf902a-77e9-4e57-89d0-36765e27f361","Type":"ContainerDied","Data":"f16c25aed36432b57fdabcd9aa11f093d6f55ebfe8d756cc9b0c8ddd5d3e6659"} Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.075207 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f16c25aed36432b57fdabcd9aa11f093d6f55ebfe8d756cc9b0c8ddd5d3e6659" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.075310 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-d2888" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.083520 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cqxgt" event={"ID":"3e76eab7-abf1-4d15-9f62-52aefceaf1cd","Type":"ContainerDied","Data":"7943ecec6e6d6e6ec2464135e26281f1bbd157abcf49a195c63f08bd0e9d8586"} Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.083554 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7943ecec6e6d6e6ec2464135e26281f1bbd157abcf49a195c63f08bd0e9d8586" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.083617 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-cqxgt" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.130023 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.153884 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.168472 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 09 15:35:56 crc kubenswrapper[4719]: E1009 15:35:56.169010 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ea18a20-4f8e-460c-9625-2aa8333cf8f4" containerName="dnsmasq-dns" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.169027 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ea18a20-4f8e-460c-9625-2aa8333cf8f4" containerName="dnsmasq-dns" Oct 09 15:35:56 crc kubenswrapper[4719]: E1009 15:35:56.169056 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ea18a20-4f8e-460c-9625-2aa8333cf8f4" containerName="init" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.169062 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ea18a20-4f8e-460c-9625-2aa8333cf8f4" containerName="init" Oct 09 15:35:56 crc kubenswrapper[4719]: E1009 15:35:56.169073 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19cf902a-77e9-4e57-89d0-36765e27f361" containerName="placement-db-sync" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.169079 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="19cf902a-77e9-4e57-89d0-36765e27f361" containerName="placement-db-sync" Oct 09 15:35:56 crc kubenswrapper[4719]: E1009 15:35:56.169097 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a94c6c0-6e99-4a00-bdde-fe1e7927af5b" containerName="watcher-decision-engine" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.169103 4719 
state_mem.go:107] "Deleted CPUSet assignment" podUID="8a94c6c0-6e99-4a00-bdde-fe1e7927af5b" containerName="watcher-decision-engine" Oct 09 15:35:56 crc kubenswrapper[4719]: E1009 15:35:56.169111 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e76eab7-abf1-4d15-9f62-52aefceaf1cd" containerName="keystone-bootstrap" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.169117 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e76eab7-abf1-4d15-9f62-52aefceaf1cd" containerName="keystone-bootstrap" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.169456 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a94c6c0-6e99-4a00-bdde-fe1e7927af5b" containerName="watcher-decision-engine" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.169473 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="19cf902a-77e9-4e57-89d0-36765e27f361" containerName="placement-db-sync" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.169484 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ea18a20-4f8e-460c-9625-2aa8333cf8f4" containerName="dnsmasq-dns" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.169502 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e76eab7-abf1-4d15-9f62-52aefceaf1cd" containerName="keystone-bootstrap" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.170145 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.172010 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.182526 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.253064 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75999b62-ce1b-4a9b-8507-c8af12441083-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"75999b62-ce1b-4a9b-8507-c8af12441083\") " pod="openstack/watcher-decision-engine-0" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.253202 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75999b62-ce1b-4a9b-8507-c8af12441083-config-data\") pod \"watcher-decision-engine-0\" (UID: \"75999b62-ce1b-4a9b-8507-c8af12441083\") " pod="openstack/watcher-decision-engine-0" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.254342 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75999b62-ce1b-4a9b-8507-c8af12441083-logs\") pod \"watcher-decision-engine-0\" (UID: \"75999b62-ce1b-4a9b-8507-c8af12441083\") " pod="openstack/watcher-decision-engine-0" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.254462 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/75999b62-ce1b-4a9b-8507-c8af12441083-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"75999b62-ce1b-4a9b-8507-c8af12441083\") " 
pod="openstack/watcher-decision-engine-0" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.254499 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9j4r\" (UniqueName: \"kubernetes.io/projected/75999b62-ce1b-4a9b-8507-c8af12441083-kube-api-access-n9j4r\") pod \"watcher-decision-engine-0\" (UID: \"75999b62-ce1b-4a9b-8507-c8af12441083\") " pod="openstack/watcher-decision-engine-0" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.361619 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75999b62-ce1b-4a9b-8507-c8af12441083-logs\") pod \"watcher-decision-engine-0\" (UID: \"75999b62-ce1b-4a9b-8507-c8af12441083\") " pod="openstack/watcher-decision-engine-0" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.361825 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/75999b62-ce1b-4a9b-8507-c8af12441083-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"75999b62-ce1b-4a9b-8507-c8af12441083\") " pod="openstack/watcher-decision-engine-0" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.361863 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9j4r\" (UniqueName: \"kubernetes.io/projected/75999b62-ce1b-4a9b-8507-c8af12441083-kube-api-access-n9j4r\") pod \"watcher-decision-engine-0\" (UID: \"75999b62-ce1b-4a9b-8507-c8af12441083\") " pod="openstack/watcher-decision-engine-0" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.361927 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75999b62-ce1b-4a9b-8507-c8af12441083-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"75999b62-ce1b-4a9b-8507-c8af12441083\") " pod="openstack/watcher-decision-engine-0" 
Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.362100 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75999b62-ce1b-4a9b-8507-c8af12441083-config-data\") pod \"watcher-decision-engine-0\" (UID: \"75999b62-ce1b-4a9b-8507-c8af12441083\") " pod="openstack/watcher-decision-engine-0" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.373255 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75999b62-ce1b-4a9b-8507-c8af12441083-logs\") pod \"watcher-decision-engine-0\" (UID: \"75999b62-ce1b-4a9b-8507-c8af12441083\") " pod="openstack/watcher-decision-engine-0" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.402715 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/75999b62-ce1b-4a9b-8507-c8af12441083-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"75999b62-ce1b-4a9b-8507-c8af12441083\") " pod="openstack/watcher-decision-engine-0" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.404873 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75999b62-ce1b-4a9b-8507-c8af12441083-config-data\") pod \"watcher-decision-engine-0\" (UID: \"75999b62-ce1b-4a9b-8507-c8af12441083\") " pod="openstack/watcher-decision-engine-0" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.419401 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75999b62-ce1b-4a9b-8507-c8af12441083-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"75999b62-ce1b-4a9b-8507-c8af12441083\") " pod="openstack/watcher-decision-engine-0" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.437017 4719 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/placement-5db5d6b746-l6xlx"] Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.449862 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5db5d6b746-l6xlx" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.470047 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-mgtx8" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.470316 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.470440 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.470547 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.471202 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.471695 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5db5d6b746-l6xlx"] Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.492020 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9j4r\" (UniqueName: \"kubernetes.io/projected/75999b62-ce1b-4a9b-8507-c8af12441083-kube-api-access-n9j4r\") pod \"watcher-decision-engine-0\" (UID: \"75999b62-ce1b-4a9b-8507-c8af12441083\") " pod="openstack/watcher-decision-engine-0" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.576496 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-69dbc5fbc7-t286g"] Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.577656 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-69dbc5fbc7-t286g" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.589292 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/494a5aaa-f833-4429-bb35-d745fcdf4ad1-combined-ca-bundle\") pod \"placement-5db5d6b746-l6xlx\" (UID: \"494a5aaa-f833-4429-bb35-d745fcdf4ad1\") " pod="openstack/placement-5db5d6b746-l6xlx" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.589391 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/494a5aaa-f833-4429-bb35-d745fcdf4ad1-internal-tls-certs\") pod \"placement-5db5d6b746-l6xlx\" (UID: \"494a5aaa-f833-4429-bb35-d745fcdf4ad1\") " pod="openstack/placement-5db5d6b746-l6xlx" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.589423 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/494a5aaa-f833-4429-bb35-d745fcdf4ad1-public-tls-certs\") pod \"placement-5db5d6b746-l6xlx\" (UID: \"494a5aaa-f833-4429-bb35-d745fcdf4ad1\") " pod="openstack/placement-5db5d6b746-l6xlx" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.589474 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/494a5aaa-f833-4429-bb35-d745fcdf4ad1-logs\") pod \"placement-5db5d6b746-l6xlx\" (UID: \"494a5aaa-f833-4429-bb35-d745fcdf4ad1\") " pod="openstack/placement-5db5d6b746-l6xlx" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.589520 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xl6k\" (UniqueName: \"kubernetes.io/projected/494a5aaa-f833-4429-bb35-d745fcdf4ad1-kube-api-access-4xl6k\") pod \"placement-5db5d6b746-l6xlx\" 
(UID: \"494a5aaa-f833-4429-bb35-d745fcdf4ad1\") " pod="openstack/placement-5db5d6b746-l6xlx" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.589568 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/494a5aaa-f833-4429-bb35-d745fcdf4ad1-config-data\") pod \"placement-5db5d6b746-l6xlx\" (UID: \"494a5aaa-f833-4429-bb35-d745fcdf4ad1\") " pod="openstack/placement-5db5d6b746-l6xlx" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.589605 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/494a5aaa-f833-4429-bb35-d745fcdf4ad1-scripts\") pod \"placement-5db5d6b746-l6xlx\" (UID: \"494a5aaa-f833-4429-bb35-d745fcdf4ad1\") " pod="openstack/placement-5db5d6b746-l6xlx" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.590136 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.590502 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.590654 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.590906 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.591008 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-2vsx5" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.591112 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.648215 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/keystone-69dbc5fbc7-t286g"] Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.692360 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/494a5aaa-f833-4429-bb35-d745fcdf4ad1-combined-ca-bundle\") pod \"placement-5db5d6b746-l6xlx\" (UID: \"494a5aaa-f833-4429-bb35-d745fcdf4ad1\") " pod="openstack/placement-5db5d6b746-l6xlx" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.692432 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad01be3d-57da-4019-8059-f0a78501266b-public-tls-certs\") pod \"keystone-69dbc5fbc7-t286g\" (UID: \"ad01be3d-57da-4019-8059-f0a78501266b\") " pod="openstack/keystone-69dbc5fbc7-t286g" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.692459 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad01be3d-57da-4019-8059-f0a78501266b-scripts\") pod \"keystone-69dbc5fbc7-t286g\" (UID: \"ad01be3d-57da-4019-8059-f0a78501266b\") " pod="openstack/keystone-69dbc5fbc7-t286g" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.692480 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/494a5aaa-f833-4429-bb35-d745fcdf4ad1-internal-tls-certs\") pod \"placement-5db5d6b746-l6xlx\" (UID: \"494a5aaa-f833-4429-bb35-d745fcdf4ad1\") " pod="openstack/placement-5db5d6b746-l6xlx" Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.692502 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad01be3d-57da-4019-8059-f0a78501266b-combined-ca-bundle\") pod \"keystone-69dbc5fbc7-t286g\" (UID: \"ad01be3d-57da-4019-8059-f0a78501266b\") " 
pod="openstack/keystone-69dbc5fbc7-t286g"
Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.692519 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/494a5aaa-f833-4429-bb35-d745fcdf4ad1-public-tls-certs\") pod \"placement-5db5d6b746-l6xlx\" (UID: \"494a5aaa-f833-4429-bb35-d745fcdf4ad1\") " pod="openstack/placement-5db5d6b746-l6xlx"
Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.692543 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad01be3d-57da-4019-8059-f0a78501266b-internal-tls-certs\") pod \"keystone-69dbc5fbc7-t286g\" (UID: \"ad01be3d-57da-4019-8059-f0a78501266b\") " pod="openstack/keystone-69dbc5fbc7-t286g"
Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.692567 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/494a5aaa-f833-4429-bb35-d745fcdf4ad1-logs\") pod \"placement-5db5d6b746-l6xlx\" (UID: \"494a5aaa-f833-4429-bb35-d745fcdf4ad1\") " pod="openstack/placement-5db5d6b746-l6xlx"
Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.692593 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad01be3d-57da-4019-8059-f0a78501266b-config-data\") pod \"keystone-69dbc5fbc7-t286g\" (UID: \"ad01be3d-57da-4019-8059-f0a78501266b\") " pod="openstack/keystone-69dbc5fbc7-t286g"
Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.692616 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xl6k\" (UniqueName: \"kubernetes.io/projected/494a5aaa-f833-4429-bb35-d745fcdf4ad1-kube-api-access-4xl6k\") pod \"placement-5db5d6b746-l6xlx\" (UID: \"494a5aaa-f833-4429-bb35-d745fcdf4ad1\") " pod="openstack/placement-5db5d6b746-l6xlx"
Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.692638 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ad01be3d-57da-4019-8059-f0a78501266b-credential-keys\") pod \"keystone-69dbc5fbc7-t286g\" (UID: \"ad01be3d-57da-4019-8059-f0a78501266b\") " pod="openstack/keystone-69dbc5fbc7-t286g"
Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.692661 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ad01be3d-57da-4019-8059-f0a78501266b-fernet-keys\") pod \"keystone-69dbc5fbc7-t286g\" (UID: \"ad01be3d-57da-4019-8059-f0a78501266b\") " pod="openstack/keystone-69dbc5fbc7-t286g"
Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.692683 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/494a5aaa-f833-4429-bb35-d745fcdf4ad1-config-data\") pod \"placement-5db5d6b746-l6xlx\" (UID: \"494a5aaa-f833-4429-bb35-d745fcdf4ad1\") " pod="openstack/placement-5db5d6b746-l6xlx"
Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.692709 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/494a5aaa-f833-4429-bb35-d745fcdf4ad1-scripts\") pod \"placement-5db5d6b746-l6xlx\" (UID: \"494a5aaa-f833-4429-bb35-d745fcdf4ad1\") " pod="openstack/placement-5db5d6b746-l6xlx"
Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.692730 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-762wl\" (UniqueName: \"kubernetes.io/projected/ad01be3d-57da-4019-8059-f0a78501266b-kube-api-access-762wl\") pod \"keystone-69dbc5fbc7-t286g\" (UID: \"ad01be3d-57da-4019-8059-f0a78501266b\") " pod="openstack/keystone-69dbc5fbc7-t286g"
Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.708909 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/494a5aaa-f833-4429-bb35-d745fcdf4ad1-logs\") pod \"placement-5db5d6b746-l6xlx\" (UID: \"494a5aaa-f833-4429-bb35-d745fcdf4ad1\") " pod="openstack/placement-5db5d6b746-l6xlx"
Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.709679 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/494a5aaa-f833-4429-bb35-d745fcdf4ad1-public-tls-certs\") pod \"placement-5db5d6b746-l6xlx\" (UID: \"494a5aaa-f833-4429-bb35-d745fcdf4ad1\") " pod="openstack/placement-5db5d6b746-l6xlx"
Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.710039 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/494a5aaa-f833-4429-bb35-d745fcdf4ad1-internal-tls-certs\") pod \"placement-5db5d6b746-l6xlx\" (UID: \"494a5aaa-f833-4429-bb35-d745fcdf4ad1\") " pod="openstack/placement-5db5d6b746-l6xlx"
Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.714873 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/494a5aaa-f833-4429-bb35-d745fcdf4ad1-scripts\") pod \"placement-5db5d6b746-l6xlx\" (UID: \"494a5aaa-f833-4429-bb35-d745fcdf4ad1\") " pod="openstack/placement-5db5d6b746-l6xlx"
Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.718052 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/494a5aaa-f833-4429-bb35-d745fcdf4ad1-config-data\") pod \"placement-5db5d6b746-l6xlx\" (UID: \"494a5aaa-f833-4429-bb35-d745fcdf4ad1\") " pod="openstack/placement-5db5d6b746-l6xlx"
Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.731769 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xl6k\" (UniqueName: \"kubernetes.io/projected/494a5aaa-f833-4429-bb35-d745fcdf4ad1-kube-api-access-4xl6k\") pod \"placement-5db5d6b746-l6xlx\" (UID: \"494a5aaa-f833-4429-bb35-d745fcdf4ad1\") " pod="openstack/placement-5db5d6b746-l6xlx"
Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.736605 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/494a5aaa-f833-4429-bb35-d745fcdf4ad1-combined-ca-bundle\") pod \"placement-5db5d6b746-l6xlx\" (UID: \"494a5aaa-f833-4429-bb35-d745fcdf4ad1\") " pod="openstack/placement-5db5d6b746-l6xlx"
Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.791206 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.794023 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad01be3d-57da-4019-8059-f0a78501266b-public-tls-certs\") pod \"keystone-69dbc5fbc7-t286g\" (UID: \"ad01be3d-57da-4019-8059-f0a78501266b\") " pod="openstack/keystone-69dbc5fbc7-t286g"
Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.794080 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad01be3d-57da-4019-8059-f0a78501266b-scripts\") pod \"keystone-69dbc5fbc7-t286g\" (UID: \"ad01be3d-57da-4019-8059-f0a78501266b\") " pod="openstack/keystone-69dbc5fbc7-t286g"
Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.794122 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad01be3d-57da-4019-8059-f0a78501266b-combined-ca-bundle\") pod \"keystone-69dbc5fbc7-t286g\" (UID: \"ad01be3d-57da-4019-8059-f0a78501266b\") " pod="openstack/keystone-69dbc5fbc7-t286g"
Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.794156 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad01be3d-57da-4019-8059-f0a78501266b-internal-tls-certs\") pod \"keystone-69dbc5fbc7-t286g\" (UID: \"ad01be3d-57da-4019-8059-f0a78501266b\") " pod="openstack/keystone-69dbc5fbc7-t286g"
Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.794201 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad01be3d-57da-4019-8059-f0a78501266b-config-data\") pod \"keystone-69dbc5fbc7-t286g\" (UID: \"ad01be3d-57da-4019-8059-f0a78501266b\") " pod="openstack/keystone-69dbc5fbc7-t286g"
Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.794232 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ad01be3d-57da-4019-8059-f0a78501266b-credential-keys\") pod \"keystone-69dbc5fbc7-t286g\" (UID: \"ad01be3d-57da-4019-8059-f0a78501266b\") " pod="openstack/keystone-69dbc5fbc7-t286g"
Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.794254 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ad01be3d-57da-4019-8059-f0a78501266b-fernet-keys\") pod \"keystone-69dbc5fbc7-t286g\" (UID: \"ad01be3d-57da-4019-8059-f0a78501266b\") " pod="openstack/keystone-69dbc5fbc7-t286g"
Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.794291 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-762wl\" (UniqueName: \"kubernetes.io/projected/ad01be3d-57da-4019-8059-f0a78501266b-kube-api-access-762wl\") pod \"keystone-69dbc5fbc7-t286g\" (UID: \"ad01be3d-57da-4019-8059-f0a78501266b\") " pod="openstack/keystone-69dbc5fbc7-t286g"
Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.803945 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad01be3d-57da-4019-8059-f0a78501266b-combined-ca-bundle\") pod \"keystone-69dbc5fbc7-t286g\" (UID: \"ad01be3d-57da-4019-8059-f0a78501266b\") " pod="openstack/keystone-69dbc5fbc7-t286g"
Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.804531 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad01be3d-57da-4019-8059-f0a78501266b-internal-tls-certs\") pod \"keystone-69dbc5fbc7-t286g\" (UID: \"ad01be3d-57da-4019-8059-f0a78501266b\") " pod="openstack/keystone-69dbc5fbc7-t286g"
Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.804599 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ad01be3d-57da-4019-8059-f0a78501266b-fernet-keys\") pod \"keystone-69dbc5fbc7-t286g\" (UID: \"ad01be3d-57da-4019-8059-f0a78501266b\") " pod="openstack/keystone-69dbc5fbc7-t286g"
Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.804750 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad01be3d-57da-4019-8059-f0a78501266b-public-tls-certs\") pod \"keystone-69dbc5fbc7-t286g\" (UID: \"ad01be3d-57da-4019-8059-f0a78501266b\") " pod="openstack/keystone-69dbc5fbc7-t286g"
Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.804790 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ad01be3d-57da-4019-8059-f0a78501266b-credential-keys\") pod \"keystone-69dbc5fbc7-t286g\" (UID: \"ad01be3d-57da-4019-8059-f0a78501266b\") " pod="openstack/keystone-69dbc5fbc7-t286g"
Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.804809 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad01be3d-57da-4019-8059-f0a78501266b-config-data\") pod \"keystone-69dbc5fbc7-t286g\" (UID: \"ad01be3d-57da-4019-8059-f0a78501266b\") " pod="openstack/keystone-69dbc5fbc7-t286g"
Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.805901 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad01be3d-57da-4019-8059-f0a78501266b-scripts\") pod \"keystone-69dbc5fbc7-t286g\" (UID: \"ad01be3d-57da-4019-8059-f0a78501266b\") " pod="openstack/keystone-69dbc5fbc7-t286g"
Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.817213 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-762wl\" (UniqueName: \"kubernetes.io/projected/ad01be3d-57da-4019-8059-f0a78501266b-kube-api-access-762wl\") pod \"keystone-69dbc5fbc7-t286g\" (UID: \"ad01be3d-57da-4019-8059-f0a78501266b\") " pod="openstack/keystone-69dbc5fbc7-t286g"
Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.956272 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5db5d6b746-l6xlx"
Oct 09 15:35:56 crc kubenswrapper[4719]: I1009 15:35:56.976921 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-69dbc5fbc7-t286g"
Oct 09 15:35:57 crc kubenswrapper[4719]: I1009 15:35:57.188796 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a94c6c0-6e99-4a00-bdde-fe1e7927af5b" path="/var/lib/kubelet/pods/8a94c6c0-6e99-4a00-bdde-fe1e7927af5b/volumes"
Oct 09 15:35:58 crc kubenswrapper[4719]: E1009 15:35:58.797738 4719 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 556eab4d055c4943d1b366e6789b5af6178230f8f679f669e5fb1b5b4592c85b is running failed: container process not found" containerID="556eab4d055c4943d1b366e6789b5af6178230f8f679f669e5fb1b5b4592c85b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Oct 09 15:35:58 crc kubenswrapper[4719]: E1009 15:35:58.799791 4719 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 556eab4d055c4943d1b366e6789b5af6178230f8f679f669e5fb1b5b4592c85b is running failed: container process not found" containerID="556eab4d055c4943d1b366e6789b5af6178230f8f679f669e5fb1b5b4592c85b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Oct 09 15:35:58 crc kubenswrapper[4719]: E1009 15:35:58.800340 4719 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 556eab4d055c4943d1b366e6789b5af6178230f8f679f669e5fb1b5b4592c85b is running failed: container process not found" containerID="556eab4d055c4943d1b366e6789b5af6178230f8f679f669e5fb1b5b4592c85b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Oct 09 15:35:58 crc kubenswrapper[4719]: E1009 15:35:58.800413 4719 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 556eab4d055c4943d1b366e6789b5af6178230f8f679f669e5fb1b5b4592c85b is running failed: container process not found" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="938e3d88-39e3-4f8a-8920-0fdfcf98d5e5" containerName="watcher-applier"
Oct 09 15:36:03 crc kubenswrapper[4719]: E1009 15:36:03.797004 4719 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 556eab4d055c4943d1b366e6789b5af6178230f8f679f669e5fb1b5b4592c85b is running failed: container process not found" containerID="556eab4d055c4943d1b366e6789b5af6178230f8f679f669e5fb1b5b4592c85b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Oct 09 15:36:03 crc kubenswrapper[4719]: E1009 15:36:03.798066 4719 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 556eab4d055c4943d1b366e6789b5af6178230f8f679f669e5fb1b5b4592c85b is running failed: container process not found" containerID="556eab4d055c4943d1b366e6789b5af6178230f8f679f669e5fb1b5b4592c85b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Oct 09 15:36:03 crc kubenswrapper[4719]: E1009 15:36:03.798384 4719 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 556eab4d055c4943d1b366e6789b5af6178230f8f679f669e5fb1b5b4592c85b is running failed: container process not found" containerID="556eab4d055c4943d1b366e6789b5af6178230f8f679f669e5fb1b5b4592c85b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Oct 09 15:36:03 crc kubenswrapper[4719]: E1009 15:36:03.798419 4719 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 556eab4d055c4943d1b366e6789b5af6178230f8f679f669e5fb1b5b4592c85b is running failed: container process not found" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="938e3d88-39e3-4f8a-8920-0fdfcf98d5e5" containerName="watcher-applier"
Oct 09 15:36:04 crc kubenswrapper[4719]: I1009 15:36:04.247623 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7457564986-k28cv"
Oct 09 15:36:04 crc kubenswrapper[4719]: I1009 15:36:04.538653 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6f5bc696cd-sgqb2"
Oct 09 15:36:05 crc kubenswrapper[4719]: I1009 15:36:05.888249 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.163:9322/\": dial tcp 10.217.0.163:9322: i/o timeout (Client.Timeout exceeded while awaiting headers)"
Oct 09 15:36:05 crc kubenswrapper[4719]: I1009 15:36:05.888246 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.163:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 09 15:36:06 crc kubenswrapper[4719]: I1009 15:36:06.158771 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7457564986-k28cv"
Oct 09 15:36:06 crc kubenswrapper[4719]: I1009 15:36:06.287498 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6f5bc696cd-sgqb2"
Oct 09 15:36:06 crc kubenswrapper[4719]: I1009 15:36:06.343778 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7457564986-k28cv"]
Oct 09 15:36:06 crc kubenswrapper[4719]: I1009 15:36:06.344035 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7457564986-k28cv" podUID="d16f8bb5-9ca5-4042-ae67-756c79d00217" containerName="horizon-log" containerID="cri-o://b066db9dcb69b83cf18678779a349eae16c407271c43f6f1d333a2cd48b9227c" gracePeriod=30
Oct 09 15:36:06 crc kubenswrapper[4719]: I1009 15:36:06.344190 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7457564986-k28cv" podUID="d16f8bb5-9ca5-4042-ae67-756c79d00217" containerName="horizon" containerID="cri-o://10745d1f988a6d6acb4c497cc78d9b04e38ddb779ac8f9940910e552683c668d" gracePeriod=30
Oct 09 15:36:08 crc kubenswrapper[4719]: I1009 15:36:08.223847 4719 generic.go:334] "Generic (PLEG): container finished" podID="d16f8bb5-9ca5-4042-ae67-756c79d00217" containerID="10745d1f988a6d6acb4c497cc78d9b04e38ddb779ac8f9940910e552683c668d" exitCode=0
Oct 09 15:36:08 crc kubenswrapper[4719]: I1009 15:36:08.223915 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7457564986-k28cv" event={"ID":"d16f8bb5-9ca5-4042-ae67-756c79d00217","Type":"ContainerDied","Data":"10745d1f988a6d6acb4c497cc78d9b04e38ddb779ac8f9940910e552683c668d"}
Oct 09 15:36:08 crc kubenswrapper[4719]: I1009 15:36:08.482368 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0"
Oct 09 15:36:08 crc kubenswrapper[4719]: I1009 15:36:08.511825 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Oct 09 15:36:08 crc kubenswrapper[4719]: I1009 15:36:08.656339 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c-custom-prometheus-ca\") pod \"9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c\" (UID: \"9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c\") "
Oct 09 15:36:08 crc kubenswrapper[4719]: I1009 15:36:08.656481 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c-combined-ca-bundle\") pod \"9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c\" (UID: \"9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c\") "
Oct 09 15:36:08 crc kubenswrapper[4719]: I1009 15:36:08.656612 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95gvk\" (UniqueName: \"kubernetes.io/projected/9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c-kube-api-access-95gvk\") pod \"9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c\" (UID: \"9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c\") "
Oct 09 15:36:08 crc kubenswrapper[4719]: I1009 15:36:08.656638 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c-logs\") pod \"9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c\" (UID: \"9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c\") "
Oct 09 15:36:08 crc kubenswrapper[4719]: I1009 15:36:08.656683 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c-config-data\") pod \"9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c\" (UID: \"9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c\") "
Oct 09 15:36:08 crc kubenswrapper[4719]: I1009 15:36:08.656726 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/938e3d88-39e3-4f8a-8920-0fdfcf98d5e5-combined-ca-bundle\") pod \"938e3d88-39e3-4f8a-8920-0fdfcf98d5e5\" (UID: \"938e3d88-39e3-4f8a-8920-0fdfcf98d5e5\") "
Oct 09 15:36:08 crc kubenswrapper[4719]: I1009 15:36:08.656740 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/938e3d88-39e3-4f8a-8920-0fdfcf98d5e5-logs\") pod \"938e3d88-39e3-4f8a-8920-0fdfcf98d5e5\" (UID: \"938e3d88-39e3-4f8a-8920-0fdfcf98d5e5\") "
Oct 09 15:36:08 crc kubenswrapper[4719]: I1009 15:36:08.656770 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/938e3d88-39e3-4f8a-8920-0fdfcf98d5e5-config-data\") pod \"938e3d88-39e3-4f8a-8920-0fdfcf98d5e5\" (UID: \"938e3d88-39e3-4f8a-8920-0fdfcf98d5e5\") "
Oct 09 15:36:08 crc kubenswrapper[4719]: I1009 15:36:08.656822 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5lfw\" (UniqueName: \"kubernetes.io/projected/938e3d88-39e3-4f8a-8920-0fdfcf98d5e5-kube-api-access-b5lfw\") pod \"938e3d88-39e3-4f8a-8920-0fdfcf98d5e5\" (UID: \"938e3d88-39e3-4f8a-8920-0fdfcf98d5e5\") "
Oct 09 15:36:08 crc kubenswrapper[4719]: I1009 15:36:08.657642 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c-logs" (OuterVolumeSpecName: "logs") pod "9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c" (UID: "9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 09 15:36:08 crc kubenswrapper[4719]: I1009 15:36:08.658125 4719 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c-logs\") on node \"crc\" DevicePath \"\""
Oct 09 15:36:08 crc kubenswrapper[4719]: I1009 15:36:08.658419 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/938e3d88-39e3-4f8a-8920-0fdfcf98d5e5-logs" (OuterVolumeSpecName: "logs") pod "938e3d88-39e3-4f8a-8920-0fdfcf98d5e5" (UID: "938e3d88-39e3-4f8a-8920-0fdfcf98d5e5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 09 15:36:08 crc kubenswrapper[4719]: I1009 15:36:08.665451 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/938e3d88-39e3-4f8a-8920-0fdfcf98d5e5-kube-api-access-b5lfw" (OuterVolumeSpecName: "kube-api-access-b5lfw") pod "938e3d88-39e3-4f8a-8920-0fdfcf98d5e5" (UID: "938e3d88-39e3-4f8a-8920-0fdfcf98d5e5"). InnerVolumeSpecName "kube-api-access-b5lfw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 15:36:08 crc kubenswrapper[4719]: I1009 15:36:08.690253 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/938e3d88-39e3-4f8a-8920-0fdfcf98d5e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "938e3d88-39e3-4f8a-8920-0fdfcf98d5e5" (UID: "938e3d88-39e3-4f8a-8920-0fdfcf98d5e5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 15:36:08 crc kubenswrapper[4719]: I1009 15:36:08.692215 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c" (UID: "9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 15:36:08 crc kubenswrapper[4719]: I1009 15:36:08.694576 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c-kube-api-access-95gvk" (OuterVolumeSpecName: "kube-api-access-95gvk") pod "9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c" (UID: "9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c"). InnerVolumeSpecName "kube-api-access-95gvk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 15:36:08 crc kubenswrapper[4719]: I1009 15:36:08.708441 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c" (UID: "9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 15:36:08 crc kubenswrapper[4719]: I1009 15:36:08.721744 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c-config-data" (OuterVolumeSpecName: "config-data") pod "9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c" (UID: "9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 15:36:08 crc kubenswrapper[4719]: I1009 15:36:08.736465 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/938e3d88-39e3-4f8a-8920-0fdfcf98d5e5-config-data" (OuterVolumeSpecName: "config-data") pod "938e3d88-39e3-4f8a-8920-0fdfcf98d5e5" (UID: "938e3d88-39e3-4f8a-8920-0fdfcf98d5e5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 15:36:08 crc kubenswrapper[4719]: I1009 15:36:08.760102 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/938e3d88-39e3-4f8a-8920-0fdfcf98d5e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 09 15:36:08 crc kubenswrapper[4719]: I1009 15:36:08.760138 4719 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/938e3d88-39e3-4f8a-8920-0fdfcf98d5e5-logs\") on node \"crc\" DevicePath \"\""
Oct 09 15:36:08 crc kubenswrapper[4719]: I1009 15:36:08.760148 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/938e3d88-39e3-4f8a-8920-0fdfcf98d5e5-config-data\") on node \"crc\" DevicePath \"\""
Oct 09 15:36:08 crc kubenswrapper[4719]: I1009 15:36:08.760158 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5lfw\" (UniqueName: \"kubernetes.io/projected/938e3d88-39e3-4f8a-8920-0fdfcf98d5e5-kube-api-access-b5lfw\") on node \"crc\" DevicePath \"\""
Oct 09 15:36:08 crc kubenswrapper[4719]: I1009 15:36:08.760168 4719 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c-custom-prometheus-ca\") on node \"crc\" DevicePath \"\""
Oct 09 15:36:08 crc kubenswrapper[4719]: I1009 15:36:08.760176 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 09 15:36:08 crc kubenswrapper[4719]: I1009 15:36:08.760193 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95gvk\" (UniqueName: \"kubernetes.io/projected/9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c-kube-api-access-95gvk\") on node \"crc\" DevicePath \"\""
Oct 09 15:36:08 crc kubenswrapper[4719]: I1009 15:36:08.760202 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c-config-data\") on node \"crc\" DevicePath \"\""
Oct 09 15:36:08 crc kubenswrapper[4719]: I1009 15:36:08.896805 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.236707 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"938e3d88-39e3-4f8a-8920-0fdfcf98d5e5","Type":"ContainerDied","Data":"daf7889a300bb009ed82468173dceb3aac9118deef772e6d80c16abbf31384c4"}
Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.236841 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0"
Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.241099 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c","Type":"ContainerDied","Data":"72ccaabe8a643cbab7f52394e3f00e5797a72bcf8a1011a48c7798c4e7ae0ee6"}
Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.241148 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.263828 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"]
Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.282677 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-applier-0"]
Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.314557 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"]
Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.334531 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"]
Oct 09 15:36:09 crc kubenswrapper[4719]: E1009 15:36:09.335039 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c" containerName="watcher-api"
Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.335059 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c" containerName="watcher-api"
Oct 09 15:36:09 crc kubenswrapper[4719]: E1009 15:36:09.335076 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="938e3d88-39e3-4f8a-8920-0fdfcf98d5e5" containerName="watcher-applier"
Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.335084 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="938e3d88-39e3-4f8a-8920-0fdfcf98d5e5" containerName="watcher-applier"
Oct 09 15:36:09 crc kubenswrapper[4719]: E1009 15:36:09.335112 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c" containerName="watcher-api-log"
Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.335120 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c" containerName="watcher-api-log"
Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.335374 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c" containerName="watcher-api"
Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.335400 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="938e3d88-39e3-4f8a-8920-0fdfcf98d5e5" containerName="watcher-applier"
Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.335421 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c" containerName="watcher-api-log"
Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.336082 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0"
Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.340724 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data"
Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.358713 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"]
Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.367110 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"]
Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.374633 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"]
Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.376404 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.378057 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data"
Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.378781 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc"
Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.383685 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc"
Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.390386 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"]
Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.484883 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ead85b36-611c-47d5-8eb2-cddfecffaa77-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"ead85b36-611c-47d5-8eb2-cddfecffaa77\") " pod="openstack/watcher-api-0"
Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.484939 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ead85b36-611c-47d5-8eb2-cddfecffaa77-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"ead85b36-611c-47d5-8eb2-cddfecffaa77\") " pod="openstack/watcher-api-0"
Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.484982 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9629af3-81da-4d90-a2a8-735ac9bdaeb2-config-data\") pod \"watcher-applier-0\" (UID: \"b9629af3-81da-4d90-a2a8-735ac9bdaeb2\") " pod="openstack/watcher-applier-0"
Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.485011 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ead85b36-611c-47d5-8eb2-cddfecffaa77-config-data\") pod \"watcher-api-0\" (UID: \"ead85b36-611c-47d5-8eb2-cddfecffaa77\") " pod="openstack/watcher-api-0"
Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.485029 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ead85b36-611c-47d5-8eb2-cddfecffaa77-public-tls-certs\") pod \"watcher-api-0\" (UID: \"ead85b36-611c-47d5-8eb2-cddfecffaa77\") " pod="openstack/watcher-api-0"
Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.485045 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrhpm\" (UniqueName: \"kubernetes.io/projected/ead85b36-611c-47d5-8eb2-cddfecffaa77-kube-api-access-mrhpm\") pod \"watcher-api-0\" (UID: \"ead85b36-611c-47d5-8eb2-cddfecffaa77\") " pod="openstack/watcher-api-0"
Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.485083 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9629af3-81da-4d90-a2a8-735ac9bdaeb2-logs\") pod \"watcher-applier-0\" (UID: \"b9629af3-81da-4d90-a2a8-735ac9bdaeb2\") " pod="openstack/watcher-applier-0"
Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.485177 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9629af3-81da-4d90-a2a8-735ac9bdaeb2-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"b9629af3-81da-4d90-a2a8-735ac9bdaeb2\") " pod="openstack/watcher-applier-0"
Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.485213 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg484\" (UniqueName: \"kubernetes.io/projected/b9629af3-81da-4d90-a2a8-735ac9bdaeb2-kube-api-access-sg484\") pod \"watcher-applier-0\" (UID: \"b9629af3-81da-4d90-a2a8-735ac9bdaeb2\") " pod="openstack/watcher-applier-0"
Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.485244 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ead85b36-611c-47d5-8eb2-cddfecffaa77-logs\") pod \"watcher-api-0\" (UID: \"ead85b36-611c-47d5-8eb2-cddfecffaa77\") " pod="openstack/watcher-api-0"
Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.485257 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ead85b36-611c-47d5-8eb2-cddfecffaa77-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"ead85b36-611c-47d5-8eb2-cddfecffaa77\") " pod="openstack/watcher-api-0"
Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.587209 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ead85b36-611c-47d5-8eb2-cddfecffaa77-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"ead85b36-611c-47d5-8eb2-cddfecffaa77\") " pod="openstack/watcher-api-0"
Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.587304 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ead85b36-611c-47d5-8eb2-cddfecffaa77-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"ead85b36-611c-47d5-8eb2-cddfecffaa77\") " pod="openstack/watcher-api-0"
Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.587394 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9629af3-81da-4d90-a2a8-735ac9bdaeb2-config-data\") pod \"watcher-applier-0\" (UID: \"b9629af3-81da-4d90-a2a8-735ac9bdaeb2\") "
pod="openstack/watcher-applier-0" Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.587426 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ead85b36-611c-47d5-8eb2-cddfecffaa77-config-data\") pod \"watcher-api-0\" (UID: \"ead85b36-611c-47d5-8eb2-cddfecffaa77\") " pod="openstack/watcher-api-0" Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.587440 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ead85b36-611c-47d5-8eb2-cddfecffaa77-public-tls-certs\") pod \"watcher-api-0\" (UID: \"ead85b36-611c-47d5-8eb2-cddfecffaa77\") " pod="openstack/watcher-api-0" Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.587455 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrhpm\" (UniqueName: \"kubernetes.io/projected/ead85b36-611c-47d5-8eb2-cddfecffaa77-kube-api-access-mrhpm\") pod \"watcher-api-0\" (UID: \"ead85b36-611c-47d5-8eb2-cddfecffaa77\") " pod="openstack/watcher-api-0" Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.587514 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9629af3-81da-4d90-a2a8-735ac9bdaeb2-logs\") pod \"watcher-applier-0\" (UID: \"b9629af3-81da-4d90-a2a8-735ac9bdaeb2\") " pod="openstack/watcher-applier-0" Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.587553 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9629af3-81da-4d90-a2a8-735ac9bdaeb2-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"b9629af3-81da-4d90-a2a8-735ac9bdaeb2\") " pod="openstack/watcher-applier-0" Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.587598 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-sg484\" (UniqueName: \"kubernetes.io/projected/b9629af3-81da-4d90-a2a8-735ac9bdaeb2-kube-api-access-sg484\") pod \"watcher-applier-0\" (UID: \"b9629af3-81da-4d90-a2a8-735ac9bdaeb2\") " pod="openstack/watcher-applier-0" Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.587637 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ead85b36-611c-47d5-8eb2-cddfecffaa77-logs\") pod \"watcher-api-0\" (UID: \"ead85b36-611c-47d5-8eb2-cddfecffaa77\") " pod="openstack/watcher-api-0" Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.587654 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ead85b36-611c-47d5-8eb2-cddfecffaa77-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"ead85b36-611c-47d5-8eb2-cddfecffaa77\") " pod="openstack/watcher-api-0" Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.590492 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ead85b36-611c-47d5-8eb2-cddfecffaa77-logs\") pod \"watcher-api-0\" (UID: \"ead85b36-611c-47d5-8eb2-cddfecffaa77\") " pod="openstack/watcher-api-0" Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.591072 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9629af3-81da-4d90-a2a8-735ac9bdaeb2-logs\") pod \"watcher-applier-0\" (UID: \"b9629af3-81da-4d90-a2a8-735ac9bdaeb2\") " pod="openstack/watcher-applier-0" Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.596161 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ead85b36-611c-47d5-8eb2-cddfecffaa77-public-tls-certs\") pod \"watcher-api-0\" (UID: \"ead85b36-611c-47d5-8eb2-cddfecffaa77\") " pod="openstack/watcher-api-0" Oct 09 15:36:09 crc 
kubenswrapper[4719]: I1009 15:36:09.599607 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9629af3-81da-4d90-a2a8-735ac9bdaeb2-config-data\") pod \"watcher-applier-0\" (UID: \"b9629af3-81da-4d90-a2a8-735ac9bdaeb2\") " pod="openstack/watcher-applier-0" Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.600076 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9629af3-81da-4d90-a2a8-735ac9bdaeb2-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"b9629af3-81da-4d90-a2a8-735ac9bdaeb2\") " pod="openstack/watcher-applier-0" Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.602252 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ead85b36-611c-47d5-8eb2-cddfecffaa77-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"ead85b36-611c-47d5-8eb2-cddfecffaa77\") " pod="openstack/watcher-api-0" Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.602556 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ead85b36-611c-47d5-8eb2-cddfecffaa77-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"ead85b36-611c-47d5-8eb2-cddfecffaa77\") " pod="openstack/watcher-api-0" Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.602925 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ead85b36-611c-47d5-8eb2-cddfecffaa77-config-data\") pod \"watcher-api-0\" (UID: \"ead85b36-611c-47d5-8eb2-cddfecffaa77\") " pod="openstack/watcher-api-0" Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.603869 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: 
\"kubernetes.io/secret/ead85b36-611c-47d5-8eb2-cddfecffaa77-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"ead85b36-611c-47d5-8eb2-cddfecffaa77\") " pod="openstack/watcher-api-0" Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.607750 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrhpm\" (UniqueName: \"kubernetes.io/projected/ead85b36-611c-47d5-8eb2-cddfecffaa77-kube-api-access-mrhpm\") pod \"watcher-api-0\" (UID: \"ead85b36-611c-47d5-8eb2-cddfecffaa77\") " pod="openstack/watcher-api-0" Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.607857 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg484\" (UniqueName: \"kubernetes.io/projected/b9629af3-81da-4d90-a2a8-735ac9bdaeb2-kube-api-access-sg484\") pod \"watcher-applier-0\" (UID: \"b9629af3-81da-4d90-a2a8-735ac9bdaeb2\") " pod="openstack/watcher-applier-0" Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.667996 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.691511 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Oct 09 15:36:09 crc kubenswrapper[4719]: I1009 15:36:09.946968 4719 scope.go:117] "RemoveContainer" containerID="556eab4d055c4943d1b366e6789b5af6178230f8f679f669e5fb1b5b4592c85b" Oct 09 15:36:09 crc kubenswrapper[4719]: E1009 15:36:09.974738 4719 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.66:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Oct 09 15:36:09 crc kubenswrapper[4719]: E1009 15:36:09.975035 4719 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.66:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Oct 09 15:36:09 crc kubenswrapper[4719]: E1009 15:36:09.975149 4719 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:38.102.83.66:5001/podified-master-centos10/openstack-cinder-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qnxsf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-ztgbm_openstack(e899b0de-03a2-44a5-a165-25c988e8489d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 09 15:36:09 crc kubenswrapper[4719]: E1009 15:36:09.976298 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-ztgbm" podUID="e899b0de-03a2-44a5-a165-25c988e8489d" Oct 09 15:36:10 crc kubenswrapper[4719]: I1009 15:36:10.000930 4719 scope.go:117] "RemoveContainer" containerID="c231706ec5906c45822ce5a46fdfb1762c323ce3890c77c104bd0cd1c23f4758" Oct 09 15:36:10 crc kubenswrapper[4719]: I1009 15:36:10.208789 4719 scope.go:117] "RemoveContainer" containerID="c976c2032017464bb71a885b6a5fcdfa33f72571adc927ff50eb056178fd2665" Oct 09 15:36:10 crc kubenswrapper[4719]: I1009 15:36:10.284873 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"87773a88-c22e-421e-8d36-d1d48d484dd9","Type":"ContainerStarted","Data":"e99f4c2e8079b8ecdc9938d4b6289cdff9763330e97d681ad8888affcff4979d"} Oct 09 15:36:10 crc kubenswrapper[4719]: E1009 15:36:10.304654 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.66:5001/podified-master-centos10/openstack-cinder-api:watcher_latest\\\"\"" pod="openstack/cinder-db-sync-ztgbm" podUID="e899b0de-03a2-44a5-a165-25c988e8489d" Oct 09 15:36:10 crc kubenswrapper[4719]: I1009 15:36:10.518667 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 09 15:36:10 crc 
kubenswrapper[4719]: I1009 15:36:10.588065 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 15:36:10 crc kubenswrapper[4719]: I1009 15:36:10.658236 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5db5d6b746-l6xlx"] Oct 09 15:36:10 crc kubenswrapper[4719]: W1009 15:36:10.682630 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod494a5aaa_f833_4429_bb35_d745fcdf4ad1.slice/crio-55d231b97b1e03968dd036d6020088fbf4ac76718235144e0bc4a69cf02a8480 WatchSource:0}: Error finding container 55d231b97b1e03968dd036d6020088fbf4ac76718235144e0bc4a69cf02a8480: Status 404 returned error can't find the container with id 55d231b97b1e03968dd036d6020088fbf4ac76718235144e0bc4a69cf02a8480 Oct 09 15:36:10 crc kubenswrapper[4719]: I1009 15:36:10.682875 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-69dbc5fbc7-t286g"] Oct 09 15:36:10 crc kubenswrapper[4719]: W1009 15:36:10.689794 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad01be3d_57da_4019_8059_f0a78501266b.slice/crio-e1dea77239670f60e0d2305cf4f32191985cb91ddfd35e9a5ddbbe723c4b0983 WatchSource:0}: Error finding container e1dea77239670f60e0d2305cf4f32191985cb91ddfd35e9a5ddbbe723c4b0983: Status 404 returned error can't find the container with id e1dea77239670f60e0d2305cf4f32191985cb91ddfd35e9a5ddbbe723c4b0983 Oct 09 15:36:10 crc kubenswrapper[4719]: I1009 15:36:10.774615 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Oct 09 15:36:10 crc kubenswrapper[4719]: I1009 15:36:10.828197 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 09 15:36:10 crc kubenswrapper[4719]: W1009 15:36:10.847610 4719 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9629af3_81da_4d90_a2a8_735ac9bdaeb2.slice/crio-7158d6eebb95ba8094925d1ebc7f0978249f5d817475c3bb2d88e5bed9caf198 WatchSource:0}: Error finding container 7158d6eebb95ba8094925d1ebc7f0978249f5d817475c3bb2d88e5bed9caf198: Status 404 returned error can't find the container with id 7158d6eebb95ba8094925d1ebc7f0978249f5d817475c3bb2d88e5bed9caf198 Oct 09 15:36:10 crc kubenswrapper[4719]: W1009 15:36:10.876644 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podead85b36_611c_47d5_8eb2_cddfecffaa77.slice/crio-fa982c6adae80a1f68e07700f8c68415d4ea78798f07e896925444da024e7eb3 WatchSource:0}: Error finding container fa982c6adae80a1f68e07700f8c68415d4ea78798f07e896925444da024e7eb3: Status 404 returned error can't find the container with id fa982c6adae80a1f68e07700f8c68415d4ea78798f07e896925444da024e7eb3 Oct 09 15:36:10 crc kubenswrapper[4719]: I1009 15:36:10.889376 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.163:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 09 15:36:10 crc kubenswrapper[4719]: I1009 15:36:10.890407 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.163:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 09 15:36:11 crc kubenswrapper[4719]: I1009 15:36:11.171399 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="938e3d88-39e3-4f8a-8920-0fdfcf98d5e5" path="/var/lib/kubelet/pods/938e3d88-39e3-4f8a-8920-0fdfcf98d5e5/volumes" Oct 09 15:36:11 crc kubenswrapper[4719]: I1009 
15:36:11.172425 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c" path="/var/lib/kubelet/pods/9443b3b8-aa2e-45b2-bea6-b5a8eea89c0c/volumes" Oct 09 15:36:11 crc kubenswrapper[4719]: I1009 15:36:11.297890 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d4b4465c7-xmnmh" event={"ID":"6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0","Type":"ContainerStarted","Data":"d8b8706ff6e4dce5089ae88b4d5c4c18a1e7a5a458e508a9d46bebba8b725611"} Oct 09 15:36:11 crc kubenswrapper[4719]: I1009 15:36:11.298032 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d4b4465c7-xmnmh" Oct 09 15:36:11 crc kubenswrapper[4719]: I1009 15:36:11.301176 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gfc75" event={"ID":"08e04378-245e-4a13-b1de-f11cf96579ef","Type":"ContainerStarted","Data":"d6277e8a831c6da4e6159e08cafb95492961642c0693a9ebcd2a6bee847db126"} Oct 09 15:36:11 crc kubenswrapper[4719]: I1009 15:36:11.303866 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a","Type":"ContainerStarted","Data":"9410096f029088d510296b7fd6f73e792d5010bebbcf5a4b606890e48fccba01"} Oct 09 15:36:11 crc kubenswrapper[4719]: I1009 15:36:11.305309 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-69dbc5fbc7-t286g" event={"ID":"ad01be3d-57da-4019-8059-f0a78501266b","Type":"ContainerStarted","Data":"e1dea77239670f60e0d2305cf4f32191985cb91ddfd35e9a5ddbbe723c4b0983"} Oct 09 15:36:11 crc kubenswrapper[4719]: I1009 15:36:11.306220 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"75999b62-ce1b-4a9b-8507-c8af12441083","Type":"ContainerStarted","Data":"ed4278e0cbed707901ccadffa63f799e939d112caf148579263c4d77f79e2389"} Oct 09 15:36:11 crc kubenswrapper[4719]: I1009 15:36:11.307005 4719 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5db5d6b746-l6xlx" event={"ID":"494a5aaa-f833-4429-bb35-d745fcdf4ad1","Type":"ContainerStarted","Data":"55d231b97b1e03968dd036d6020088fbf4ac76718235144e0bc4a69cf02a8480"} Oct 09 15:36:11 crc kubenswrapper[4719]: I1009 15:36:11.307778 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"b9629af3-81da-4d90-a2a8-735ac9bdaeb2","Type":"ContainerStarted","Data":"7158d6eebb95ba8094925d1ebc7f0978249f5d817475c3bb2d88e5bed9caf198"} Oct 09 15:36:11 crc kubenswrapper[4719]: I1009 15:36:11.312322 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"ead85b36-611c-47d5-8eb2-cddfecffaa77","Type":"ContainerStarted","Data":"fa982c6adae80a1f68e07700f8c68415d4ea78798f07e896925444da024e7eb3"} Oct 09 15:36:11 crc kubenswrapper[4719]: I1009 15:36:11.316373 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d4b4465c7-xmnmh" podStartSLOduration=29.316357904 podStartE2EDuration="29.316357904s" podCreationTimestamp="2025-10-09 15:35:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:36:11.312610344 +0000 UTC m=+1076.822321629" watchObservedRunningTime="2025-10-09 15:36:11.316357904 +0000 UTC m=+1076.826069189" Oct 09 15:36:11 crc kubenswrapper[4719]: I1009 15:36:11.316694 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"08ea2228-1919-46f9-be38-4111d34aa3bc","Type":"ContainerStarted","Data":"aea408ab20854e5a1044c820af9978351f77a0ff739b4cc6cbaf49060989778c"} Oct 09 15:36:11 crc kubenswrapper[4719]: I1009 15:36:11.331263 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-gfc75" podStartSLOduration=24.113165168 podStartE2EDuration="42.331246996s" 
podCreationTimestamp="2025-10-09 15:35:29 +0000 UTC" firstStartedPulling="2025-10-09 15:35:40.667327532 +0000 UTC m=+1046.177038817" lastFinishedPulling="2025-10-09 15:35:58.88540936 +0000 UTC m=+1064.395120645" observedRunningTime="2025-10-09 15:36:11.328741196 +0000 UTC m=+1076.838452501" watchObservedRunningTime="2025-10-09 15:36:11.331246996 +0000 UTC m=+1076.840958281" Oct 09 15:36:12 crc kubenswrapper[4719]: I1009 15:36:12.382803 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7457564986-k28cv" podUID="d16f8bb5-9ca5-4042-ae67-756c79d00217" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.161:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.161:8443: connect: connection refused" Oct 09 15:36:13 crc kubenswrapper[4719]: I1009 15:36:13.339172 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-69dbc5fbc7-t286g" event={"ID":"ad01be3d-57da-4019-8059-f0a78501266b","Type":"ContainerStarted","Data":"f410a889895a26c9da4a0d8a905f30105211aff2bc0c893648285b379359eaed"} Oct 09 15:36:13 crc kubenswrapper[4719]: I1009 15:36:13.339515 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-69dbc5fbc7-t286g" Oct 09 15:36:13 crc kubenswrapper[4719]: I1009 15:36:13.342585 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"75999b62-ce1b-4a9b-8507-c8af12441083","Type":"ContainerStarted","Data":"34ade78fde967d1fd91220099d799f1a4c62436fb1434170671876fc94b49948"} Oct 09 15:36:13 crc kubenswrapper[4719]: I1009 15:36:13.348246 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"87773a88-c22e-421e-8d36-d1d48d484dd9","Type":"ContainerStarted","Data":"7fbd64f6742148ed0849f0f6ee366f87b7b4453d00a80e327f56e9c1a6e4fade"} Oct 09 15:36:13 crc kubenswrapper[4719]: I1009 15:36:13.354486 4719 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/placement-5db5d6b746-l6xlx" event={"ID":"494a5aaa-f833-4429-bb35-d745fcdf4ad1","Type":"ContainerStarted","Data":"cb22f0a0b834862e5620f164498df28d812eda5585713ade7b4aa71f3798f159"} Oct 09 15:36:13 crc kubenswrapper[4719]: I1009 15:36:13.354534 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5db5d6b746-l6xlx" event={"ID":"494a5aaa-f833-4429-bb35-d745fcdf4ad1","Type":"ContainerStarted","Data":"6e7ada6fa25d1c83b205c86ec6704f50ee4f4537acdb3a4a762a646fd85f57eb"} Oct 09 15:36:13 crc kubenswrapper[4719]: I1009 15:36:13.354614 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5db5d6b746-l6xlx" Oct 09 15:36:13 crc kubenswrapper[4719]: I1009 15:36:13.354652 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5db5d6b746-l6xlx" Oct 09 15:36:13 crc kubenswrapper[4719]: I1009 15:36:13.357523 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-69dbc5fbc7-t286g" podStartSLOduration=17.357507462 podStartE2EDuration="17.357507462s" podCreationTimestamp="2025-10-09 15:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:36:13.355735595 +0000 UTC m=+1078.865446880" watchObservedRunningTime="2025-10-09 15:36:13.357507462 +0000 UTC m=+1078.867218747" Oct 09 15:36:13 crc kubenswrapper[4719]: I1009 15:36:13.366594 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"b9629af3-81da-4d90-a2a8-735ac9bdaeb2","Type":"ContainerStarted","Data":"5df3677eb7c7ccb6e9e9b466b4b0a74a8b4fefe85140a705408e75f75363d72b"} Oct 09 15:36:13 crc kubenswrapper[4719]: I1009 15:36:13.374158 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" 
event={"ID":"ead85b36-611c-47d5-8eb2-cddfecffaa77","Type":"ContainerStarted","Data":"42617227296ae41ce7d3e8c75198987db70c570775226a3bd38dbc8c2ac31733"} Oct 09 15:36:13 crc kubenswrapper[4719]: I1009 15:36:13.374197 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"ead85b36-611c-47d5-8eb2-cddfecffaa77","Type":"ContainerStarted","Data":"9e8bfdc5afbe2f768d19acfa301a54d29e8b93737caa0a99fea8c532dbe97d17"} Oct 09 15:36:13 crc kubenswrapper[4719]: I1009 15:36:13.374673 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 09 15:36:13 crc kubenswrapper[4719]: I1009 15:36:13.376333 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"08ea2228-1919-46f9-be38-4111d34aa3bc","Type":"ContainerStarted","Data":"42745f3380988d9b63da764bcb030aa806763c179458fc3441184ccb4b6beffe"} Oct 09 15:36:13 crc kubenswrapper[4719]: I1009 15:36:13.389839 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=17.389757595 podStartE2EDuration="17.389757595s" podCreationTimestamp="2025-10-09 15:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:36:13.384251421 +0000 UTC m=+1078.893962716" watchObservedRunningTime="2025-10-09 15:36:13.389757595 +0000 UTC m=+1078.899468880" Oct 09 15:36:13 crc kubenswrapper[4719]: I1009 15:36:13.429274 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5db5d6b746-l6xlx" podStartSLOduration=17.429249049 podStartE2EDuration="17.429249049s" podCreationTimestamp="2025-10-09 15:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:36:13.402708667 +0000 UTC m=+1078.912419952" 
watchObservedRunningTime="2025-10-09 15:36:13.429249049 +0000 UTC m=+1078.938960344" Oct 09 15:36:13 crc kubenswrapper[4719]: I1009 15:36:13.473377 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=4.473344599 podStartE2EDuration="4.473344599s" podCreationTimestamp="2025-10-09 15:36:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:36:13.471071407 +0000 UTC m=+1078.980782702" watchObservedRunningTime="2025-10-09 15:36:13.473344599 +0000 UTC m=+1078.983055884" Oct 09 15:36:13 crc kubenswrapper[4719]: I1009 15:36:13.474544 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=4.474535337 podStartE2EDuration="4.474535337s" podCreationTimestamp="2025-10-09 15:36:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:36:13.448976186 +0000 UTC m=+1078.958687481" watchObservedRunningTime="2025-10-09 15:36:13.474535337 +0000 UTC m=+1078.984246632" Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.326905 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-574c54d6bf-d7655" Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.408857 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"87773a88-c22e-421e-8d36-d1d48d484dd9","Type":"ContainerStarted","Data":"53ae5d4c5c1400f93ac89e63b8e946a262ec7e795cd040c2840dad93c5fef4f2"} Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.409006 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="87773a88-c22e-421e-8d36-d1d48d484dd9" containerName="glance-log" containerID="cri-o://7fbd64f6742148ed0849f0f6ee366f87b7b4453d00a80e327f56e9c1a6e4fade" gracePeriod=30 Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.409798 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="87773a88-c22e-421e-8d36-d1d48d484dd9" containerName="glance-httpd" containerID="cri-o://53ae5d4c5c1400f93ac89e63b8e946a262ec7e795cd040c2840dad93c5fef4f2" gracePeriod=30 Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.428093 4719 generic.go:334] "Generic (PLEG): container finished" podID="dbae14e7-c975-42ac-bad6-b5ad764a239b" containerID="3931bfae713f63c29b202cd397f9e76dacc3296a74bf8e4cc3fef6c369b966f9" exitCode=137 Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.428130 4719 generic.go:334] "Generic (PLEG): container finished" podID="dbae14e7-c975-42ac-bad6-b5ad764a239b" containerID="cce4edabb2875d37201fab578ddc34b5432828ea0f2793e131b78bd6c8d8a05f" exitCode=137 Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.428196 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-574c54d6bf-d7655" event={"ID":"dbae14e7-c975-42ac-bad6-b5ad764a239b","Type":"ContainerDied","Data":"3931bfae713f63c29b202cd397f9e76dacc3296a74bf8e4cc3fef6c369b966f9"} Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.428222 4719 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-574c54d6bf-d7655" event={"ID":"dbae14e7-c975-42ac-bad6-b5ad764a239b","Type":"ContainerDied","Data":"cce4edabb2875d37201fab578ddc34b5432828ea0f2793e131b78bd6c8d8a05f"} Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.428231 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-574c54d6bf-d7655" event={"ID":"dbae14e7-c975-42ac-bad6-b5ad764a239b","Type":"ContainerDied","Data":"c713e76362374e7e2fb5691e5ab537368f9a1cf55c6ca3c504852415758d6a92"} Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.428244 4719 scope.go:117] "RemoveContainer" containerID="3931bfae713f63c29b202cd397f9e76dacc3296a74bf8e4cc3fef6c369b966f9" Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.428392 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-574c54d6bf-d7655" Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.453514 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=32.45350077 podStartE2EDuration="32.45350077s" podCreationTimestamp="2025-10-09 15:35:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:36:14.452075375 +0000 UTC m=+1079.961786660" watchObservedRunningTime="2025-10-09 15:36:14.45350077 +0000 UTC m=+1079.963212055" Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.481319 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"08ea2228-1919-46f9-be38-4111d34aa3bc","Type":"ContainerStarted","Data":"fe72c2b996fdfbf94d0a167f99b32cc52ce7fb82eec9f276b8220416818c6645"} Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.481582 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" 
podUID="08ea2228-1919-46f9-be38-4111d34aa3bc" containerName="glance-log" containerID="cri-o://42745f3380988d9b63da764bcb030aa806763c179458fc3441184ccb4b6beffe" gracePeriod=30 Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.482162 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="08ea2228-1919-46f9-be38-4111d34aa3bc" containerName="glance-httpd" containerID="cri-o://fe72c2b996fdfbf94d0a167f99b32cc52ce7fb82eec9f276b8220416818c6645" gracePeriod=30 Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.504218 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbae14e7-c975-42ac-bad6-b5ad764a239b-logs\") pod \"dbae14e7-c975-42ac-bad6-b5ad764a239b\" (UID: \"dbae14e7-c975-42ac-bad6-b5ad764a239b\") " Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.504254 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbae14e7-c975-42ac-bad6-b5ad764a239b-scripts\") pod \"dbae14e7-c975-42ac-bad6-b5ad764a239b\" (UID: \"dbae14e7-c975-42ac-bad6-b5ad764a239b\") " Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.504651 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jvpw\" (UniqueName: \"kubernetes.io/projected/dbae14e7-c975-42ac-bad6-b5ad764a239b-kube-api-access-7jvpw\") pod \"dbae14e7-c975-42ac-bad6-b5ad764a239b\" (UID: \"dbae14e7-c975-42ac-bad6-b5ad764a239b\") " Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.504713 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dbae14e7-c975-42ac-bad6-b5ad764a239b-horizon-secret-key\") pod \"dbae14e7-c975-42ac-bad6-b5ad764a239b\" (UID: \"dbae14e7-c975-42ac-bad6-b5ad764a239b\") " Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.504756 
4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dbae14e7-c975-42ac-bad6-b5ad764a239b-config-data\") pod \"dbae14e7-c975-42ac-bad6-b5ad764a239b\" (UID: \"dbae14e7-c975-42ac-bad6-b5ad764a239b\") " Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.506001 4719 generic.go:334] "Generic (PLEG): container finished" podID="19fb157d-30e1-4432-9d62-0fa70d464148" containerID="0a1f9c31e8eae1dcfb066516d44c1b096c008dc82f57f5c126e5525a1576f68a" exitCode=137 Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.506032 4719 generic.go:334] "Generic (PLEG): container finished" podID="19fb157d-30e1-4432-9d62-0fa70d464148" containerID="1105f4f36eebf376e3165896576deef6174007b5847a89c5cf52e43cc44a1749" exitCode=137 Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.506865 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b66884797-vvjz5" event={"ID":"19fb157d-30e1-4432-9d62-0fa70d464148","Type":"ContainerDied","Data":"0a1f9c31e8eae1dcfb066516d44c1b096c008dc82f57f5c126e5525a1576f68a"} Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.506895 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b66884797-vvjz5" event={"ID":"19fb157d-30e1-4432-9d62-0fa70d464148","Type":"ContainerDied","Data":"1105f4f36eebf376e3165896576deef6174007b5847a89c5cf52e43cc44a1749"} Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.508004 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbae14e7-c975-42ac-bad6-b5ad764a239b-logs" (OuterVolumeSpecName: "logs") pod "dbae14e7-c975-42ac-bad6-b5ad764a239b" (UID: "dbae14e7-c975-42ac-bad6-b5ad764a239b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.516613 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbae14e7-c975-42ac-bad6-b5ad764a239b-kube-api-access-7jvpw" (OuterVolumeSpecName: "kube-api-access-7jvpw") pod "dbae14e7-c975-42ac-bad6-b5ad764a239b" (UID: "dbae14e7-c975-42ac-bad6-b5ad764a239b"). InnerVolumeSpecName "kube-api-access-7jvpw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.527122 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbae14e7-c975-42ac-bad6-b5ad764a239b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "dbae14e7-c975-42ac-bad6-b5ad764a239b" (UID: "dbae14e7-c975-42ac-bad6-b5ad764a239b"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.543112 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=32.543089315 podStartE2EDuration="32.543089315s" podCreationTimestamp="2025-10-09 15:35:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:36:14.523823173 +0000 UTC m=+1080.033534458" watchObservedRunningTime="2025-10-09 15:36:14.543089315 +0000 UTC m=+1080.052800600" Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.572062 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbae14e7-c975-42ac-bad6-b5ad764a239b-config-data" (OuterVolumeSpecName: "config-data") pod "dbae14e7-c975-42ac-bad6-b5ad764a239b" (UID: "dbae14e7-c975-42ac-bad6-b5ad764a239b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.606936 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbae14e7-c975-42ac-bad6-b5ad764a239b-scripts" (OuterVolumeSpecName: "scripts") pod "dbae14e7-c975-42ac-bad6-b5ad764a239b" (UID: "dbae14e7-c975-42ac-bad6-b5ad764a239b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.610344 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jvpw\" (UniqueName: \"kubernetes.io/projected/dbae14e7-c975-42ac-bad6-b5ad764a239b-kube-api-access-7jvpw\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.610382 4719 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dbae14e7-c975-42ac-bad6-b5ad764a239b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.610394 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dbae14e7-c975-42ac-bad6-b5ad764a239b-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.610402 4719 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbae14e7-c975-42ac-bad6-b5ad764a239b-logs\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.610411 4719 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbae14e7-c975-42ac-bad6-b5ad764a239b-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.668127 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Oct 09 15:36:14 crc 
kubenswrapper[4719]: I1009 15:36:14.675954 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.676008 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.693610 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.776512 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.776588 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.786425 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-574c54d6bf-d7655"] Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.800865 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-574c54d6bf-d7655"] Oct 09 15:36:14 crc kubenswrapper[4719]: I1009 15:36:14.858153 4719 scope.go:117] "RemoveContainer" containerID="cce4edabb2875d37201fab578ddc34b5432828ea0f2793e131b78bd6c8d8a05f" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.096135 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-b66884797-vvjz5" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.111797 4719 scope.go:117] "RemoveContainer" containerID="3931bfae713f63c29b202cd397f9e76dacc3296a74bf8e4cc3fef6c369b966f9" Oct 09 15:36:15 crc kubenswrapper[4719]: E1009 15:36:15.112310 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3931bfae713f63c29b202cd397f9e76dacc3296a74bf8e4cc3fef6c369b966f9\": container with ID starting with 3931bfae713f63c29b202cd397f9e76dacc3296a74bf8e4cc3fef6c369b966f9 not found: ID does not exist" containerID="3931bfae713f63c29b202cd397f9e76dacc3296a74bf8e4cc3fef6c369b966f9" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.112470 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3931bfae713f63c29b202cd397f9e76dacc3296a74bf8e4cc3fef6c369b966f9"} err="failed to get container status \"3931bfae713f63c29b202cd397f9e76dacc3296a74bf8e4cc3fef6c369b966f9\": rpc error: code = NotFound desc = could not find container \"3931bfae713f63c29b202cd397f9e76dacc3296a74bf8e4cc3fef6c369b966f9\": container with ID starting with 3931bfae713f63c29b202cd397f9e76dacc3296a74bf8e4cc3fef6c369b966f9 not found: ID does not exist" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.112511 4719 scope.go:117] "RemoveContainer" containerID="cce4edabb2875d37201fab578ddc34b5432828ea0f2793e131b78bd6c8d8a05f" Oct 09 15:36:15 crc kubenswrapper[4719]: E1009 15:36:15.113049 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cce4edabb2875d37201fab578ddc34b5432828ea0f2793e131b78bd6c8d8a05f\": container with ID starting with cce4edabb2875d37201fab578ddc34b5432828ea0f2793e131b78bd6c8d8a05f not found: ID does not exist" containerID="cce4edabb2875d37201fab578ddc34b5432828ea0f2793e131b78bd6c8d8a05f" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 
15:36:15.113081 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cce4edabb2875d37201fab578ddc34b5432828ea0f2793e131b78bd6c8d8a05f"} err="failed to get container status \"cce4edabb2875d37201fab578ddc34b5432828ea0f2793e131b78bd6c8d8a05f\": rpc error: code = NotFound desc = could not find container \"cce4edabb2875d37201fab578ddc34b5432828ea0f2793e131b78bd6c8d8a05f\": container with ID starting with cce4edabb2875d37201fab578ddc34b5432828ea0f2793e131b78bd6c8d8a05f not found: ID does not exist" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.113102 4719 scope.go:117] "RemoveContainer" containerID="3931bfae713f63c29b202cd397f9e76dacc3296a74bf8e4cc3fef6c369b966f9" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.113334 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3931bfae713f63c29b202cd397f9e76dacc3296a74bf8e4cc3fef6c369b966f9"} err="failed to get container status \"3931bfae713f63c29b202cd397f9e76dacc3296a74bf8e4cc3fef6c369b966f9\": rpc error: code = NotFound desc = could not find container \"3931bfae713f63c29b202cd397f9e76dacc3296a74bf8e4cc3fef6c369b966f9\": container with ID starting with 3931bfae713f63c29b202cd397f9e76dacc3296a74bf8e4cc3fef6c369b966f9 not found: ID does not exist" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.113378 4719 scope.go:117] "RemoveContainer" containerID="cce4edabb2875d37201fab578ddc34b5432828ea0f2793e131b78bd6c8d8a05f" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.113599 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cce4edabb2875d37201fab578ddc34b5432828ea0f2793e131b78bd6c8d8a05f"} err="failed to get container status \"cce4edabb2875d37201fab578ddc34b5432828ea0f2793e131b78bd6c8d8a05f\": rpc error: code = NotFound desc = could not find container \"cce4edabb2875d37201fab578ddc34b5432828ea0f2793e131b78bd6c8d8a05f\": container with ID starting with 
cce4edabb2875d37201fab578ddc34b5432828ea0f2793e131b78bd6c8d8a05f not found: ID does not exist" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.201064 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbae14e7-c975-42ac-bad6-b5ad764a239b" path="/var/lib/kubelet/pods/dbae14e7-c975-42ac-bad6-b5ad764a239b/volumes" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.228948 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/19fb157d-30e1-4432-9d62-0fa70d464148-horizon-secret-key\") pod \"19fb157d-30e1-4432-9d62-0fa70d464148\" (UID: \"19fb157d-30e1-4432-9d62-0fa70d464148\") " Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.229731 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls8s2\" (UniqueName: \"kubernetes.io/projected/19fb157d-30e1-4432-9d62-0fa70d464148-kube-api-access-ls8s2\") pod \"19fb157d-30e1-4432-9d62-0fa70d464148\" (UID: \"19fb157d-30e1-4432-9d62-0fa70d464148\") " Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.229844 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19fb157d-30e1-4432-9d62-0fa70d464148-config-data\") pod \"19fb157d-30e1-4432-9d62-0fa70d464148\" (UID: \"19fb157d-30e1-4432-9d62-0fa70d464148\") " Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.229992 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19fb157d-30e1-4432-9d62-0fa70d464148-scripts\") pod \"19fb157d-30e1-4432-9d62-0fa70d464148\" (UID: \"19fb157d-30e1-4432-9d62-0fa70d464148\") " Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.230070 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19fb157d-30e1-4432-9d62-0fa70d464148-logs\") 
pod \"19fb157d-30e1-4432-9d62-0fa70d464148\" (UID: \"19fb157d-30e1-4432-9d62-0fa70d464148\") " Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.237233 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19fb157d-30e1-4432-9d62-0fa70d464148-logs" (OuterVolumeSpecName: "logs") pod "19fb157d-30e1-4432-9d62-0fa70d464148" (UID: "19fb157d-30e1-4432-9d62-0fa70d464148"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.240529 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19fb157d-30e1-4432-9d62-0fa70d464148-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "19fb157d-30e1-4432-9d62-0fa70d464148" (UID: "19fb157d-30e1-4432-9d62-0fa70d464148"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.242325 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19fb157d-30e1-4432-9d62-0fa70d464148-kube-api-access-ls8s2" (OuterVolumeSpecName: "kube-api-access-ls8s2") pod "19fb157d-30e1-4432-9d62-0fa70d464148" (UID: "19fb157d-30e1-4432-9d62-0fa70d464148"). InnerVolumeSpecName "kube-api-access-ls8s2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.265524 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19fb157d-30e1-4432-9d62-0fa70d464148-config-data" (OuterVolumeSpecName: "config-data") pod "19fb157d-30e1-4432-9d62-0fa70d464148" (UID: "19fb157d-30e1-4432-9d62-0fa70d464148"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.279962 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19fb157d-30e1-4432-9d62-0fa70d464148-scripts" (OuterVolumeSpecName: "scripts") pod "19fb157d-30e1-4432-9d62-0fa70d464148" (UID: "19fb157d-30e1-4432-9d62-0fa70d464148"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.332849 4719 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/19fb157d-30e1-4432-9d62-0fa70d464148-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.333160 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls8s2\" (UniqueName: \"kubernetes.io/projected/19fb157d-30e1-4432-9d62-0fa70d464148-kube-api-access-ls8s2\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.333172 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19fb157d-30e1-4432-9d62-0fa70d464148-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.333181 4719 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19fb157d-30e1-4432-9d62-0fa70d464148-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.333189 4719 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19fb157d-30e1-4432-9d62-0fa70d464148-logs\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.536589 4719 generic.go:334] "Generic (PLEG): container finished" podID="08ea2228-1919-46f9-be38-4111d34aa3bc" 
containerID="fe72c2b996fdfbf94d0a167f99b32cc52ce7fb82eec9f276b8220416818c6645" exitCode=0 Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.536629 4719 generic.go:334] "Generic (PLEG): container finished" podID="08ea2228-1919-46f9-be38-4111d34aa3bc" containerID="42745f3380988d9b63da764bcb030aa806763c179458fc3441184ccb4b6beffe" exitCode=143 Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.536681 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"08ea2228-1919-46f9-be38-4111d34aa3bc","Type":"ContainerDied","Data":"fe72c2b996fdfbf94d0a167f99b32cc52ce7fb82eec9f276b8220416818c6645"} Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.536695 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.536713 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"08ea2228-1919-46f9-be38-4111d34aa3bc","Type":"ContainerDied","Data":"42745f3380988d9b63da764bcb030aa806763c179458fc3441184ccb4b6beffe"} Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.557400 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b66884797-vvjz5" event={"ID":"19fb157d-30e1-4432-9d62-0fa70d464148","Type":"ContainerDied","Data":"d1de847ef5d248cfd346e67f64d43dc6b837f858f0e102c551c824b00cec39c7"} Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.557479 4719 scope.go:117] "RemoveContainer" containerID="0a1f9c31e8eae1dcfb066516d44c1b096c008dc82f57f5c126e5525a1576f68a" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.557760 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-b66884797-vvjz5" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.587814 4719 generic.go:334] "Generic (PLEG): container finished" podID="87773a88-c22e-421e-8d36-d1d48d484dd9" containerID="53ae5d4c5c1400f93ac89e63b8e946a262ec7e795cd040c2840dad93c5fef4f2" exitCode=0 Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.587844 4719 generic.go:334] "Generic (PLEG): container finished" podID="87773a88-c22e-421e-8d36-d1d48d484dd9" containerID="7fbd64f6742148ed0849f0f6ee366f87b7b4453d00a80e327f56e9c1a6e4fade" exitCode=143 Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.587911 4719 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.588775 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"87773a88-c22e-421e-8d36-d1d48d484dd9","Type":"ContainerDied","Data":"53ae5d4c5c1400f93ac89e63b8e946a262ec7e795cd040c2840dad93c5fef4f2"} Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.588803 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"87773a88-c22e-421e-8d36-d1d48d484dd9","Type":"ContainerDied","Data":"7fbd64f6742148ed0849f0f6ee366f87b7b4453d00a80e327f56e9c1a6e4fade"} Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.637702 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08ea2228-1919-46f9-be38-4111d34aa3bc-scripts\") pod \"08ea2228-1919-46f9-be38-4111d34aa3bc\" (UID: \"08ea2228-1919-46f9-be38-4111d34aa3bc\") " Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.637863 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"08ea2228-1919-46f9-be38-4111d34aa3bc\" (UID: \"08ea2228-1919-46f9-be38-4111d34aa3bc\") 
" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.637902 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08ea2228-1919-46f9-be38-4111d34aa3bc-config-data\") pod \"08ea2228-1919-46f9-be38-4111d34aa3bc\" (UID: \"08ea2228-1919-46f9-be38-4111d34aa3bc\") " Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.637923 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08ea2228-1919-46f9-be38-4111d34aa3bc-logs\") pod \"08ea2228-1919-46f9-be38-4111d34aa3bc\" (UID: \"08ea2228-1919-46f9-be38-4111d34aa3bc\") " Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.637944 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/08ea2228-1919-46f9-be38-4111d34aa3bc-httpd-run\") pod \"08ea2228-1919-46f9-be38-4111d34aa3bc\" (UID: \"08ea2228-1919-46f9-be38-4111d34aa3bc\") " Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.637992 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hg6l8\" (UniqueName: \"kubernetes.io/projected/08ea2228-1919-46f9-be38-4111d34aa3bc-kube-api-access-hg6l8\") pod \"08ea2228-1919-46f9-be38-4111d34aa3bc\" (UID: \"08ea2228-1919-46f9-be38-4111d34aa3bc\") " Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.638035 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08ea2228-1919-46f9-be38-4111d34aa3bc-combined-ca-bundle\") pod \"08ea2228-1919-46f9-be38-4111d34aa3bc\" (UID: \"08ea2228-1919-46f9-be38-4111d34aa3bc\") " Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.653313 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08ea2228-1919-46f9-be38-4111d34aa3bc-logs" (OuterVolumeSpecName: "logs") pod 
"08ea2228-1919-46f9-be38-4111d34aa3bc" (UID: "08ea2228-1919-46f9-be38-4111d34aa3bc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.653546 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08ea2228-1919-46f9-be38-4111d34aa3bc-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "08ea2228-1919-46f9-be38-4111d34aa3bc" (UID: "08ea2228-1919-46f9-be38-4111d34aa3bc"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.654720 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "08ea2228-1919-46f9-be38-4111d34aa3bc" (UID: "08ea2228-1919-46f9-be38-4111d34aa3bc"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.654822 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08ea2228-1919-46f9-be38-4111d34aa3bc-scripts" (OuterVolumeSpecName: "scripts") pod "08ea2228-1919-46f9-be38-4111d34aa3bc" (UID: "08ea2228-1919-46f9-be38-4111d34aa3bc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.657102 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08ea2228-1919-46f9-be38-4111d34aa3bc-kube-api-access-hg6l8" (OuterVolumeSpecName: "kube-api-access-hg6l8") pod "08ea2228-1919-46f9-be38-4111d34aa3bc" (UID: "08ea2228-1919-46f9-be38-4111d34aa3bc"). InnerVolumeSpecName "kube-api-access-hg6l8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.674241 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-b66884797-vvjz5"] Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.699628 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-b66884797-vvjz5"] Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.723103 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08ea2228-1919-46f9-be38-4111d34aa3bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08ea2228-1919-46f9-be38-4111d34aa3bc" (UID: "08ea2228-1919-46f9-be38-4111d34aa3bc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.739970 4719 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08ea2228-1919-46f9-be38-4111d34aa3bc-logs\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.740000 4719 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/08ea2228-1919-46f9-be38-4111d34aa3bc-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.740012 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hg6l8\" (UniqueName: \"kubernetes.io/projected/08ea2228-1919-46f9-be38-4111d34aa3bc-kube-api-access-hg6l8\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.740021 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08ea2228-1919-46f9-be38-4111d34aa3bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.740029 4719 reconciler_common.go:293] "Volume detached for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08ea2228-1919-46f9-be38-4111d34aa3bc-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.740048 4719 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.797915 4719 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.800881 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08ea2228-1919-46f9-be38-4111d34aa3bc-config-data" (OuterVolumeSpecName: "config-data") pod "08ea2228-1919-46f9-be38-4111d34aa3bc" (UID: "08ea2228-1919-46f9-be38-4111d34aa3bc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.843020 4719 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.843060 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08ea2228-1919-46f9-be38-4111d34aa3bc-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.885506 4719 scope.go:117] "RemoveContainer" containerID="1105f4f36eebf376e3165896576deef6174007b5847a89c5cf52e43cc44a1749" Oct 09 15:36:15 crc kubenswrapper[4719]: I1009 15:36:15.947574 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.046608 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"87773a88-c22e-421e-8d36-d1d48d484dd9\" (UID: \"87773a88-c22e-421e-8d36-d1d48d484dd9\") " Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.047075 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87773a88-c22e-421e-8d36-d1d48d484dd9-combined-ca-bundle\") pod \"87773a88-c22e-421e-8d36-d1d48d484dd9\" (UID: \"87773a88-c22e-421e-8d36-d1d48d484dd9\") " Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.047112 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87773a88-c22e-421e-8d36-d1d48d484dd9-config-data\") pod \"87773a88-c22e-421e-8d36-d1d48d484dd9\" (UID: \"87773a88-c22e-421e-8d36-d1d48d484dd9\") " Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.047172 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fwqx\" (UniqueName: \"kubernetes.io/projected/87773a88-c22e-421e-8d36-d1d48d484dd9-kube-api-access-6fwqx\") pod \"87773a88-c22e-421e-8d36-d1d48d484dd9\" (UID: \"87773a88-c22e-421e-8d36-d1d48d484dd9\") " Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.047208 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/87773a88-c22e-421e-8d36-d1d48d484dd9-httpd-run\") pod \"87773a88-c22e-421e-8d36-d1d48d484dd9\" (UID: \"87773a88-c22e-421e-8d36-d1d48d484dd9\") " Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.047266 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/87773a88-c22e-421e-8d36-d1d48d484dd9-logs\") pod \"87773a88-c22e-421e-8d36-d1d48d484dd9\" (UID: \"87773a88-c22e-421e-8d36-d1d48d484dd9\") " Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.047305 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87773a88-c22e-421e-8d36-d1d48d484dd9-scripts\") pod \"87773a88-c22e-421e-8d36-d1d48d484dd9\" (UID: \"87773a88-c22e-421e-8d36-d1d48d484dd9\") " Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.047810 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87773a88-c22e-421e-8d36-d1d48d484dd9-logs" (OuterVolumeSpecName: "logs") pod "87773a88-c22e-421e-8d36-d1d48d484dd9" (UID: "87773a88-c22e-421e-8d36-d1d48d484dd9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.047939 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87773a88-c22e-421e-8d36-d1d48d484dd9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "87773a88-c22e-421e-8d36-d1d48d484dd9" (UID: "87773a88-c22e-421e-8d36-d1d48d484dd9"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.052552 4719 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/87773a88-c22e-421e-8d36-d1d48d484dd9-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.052586 4719 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87773a88-c22e-421e-8d36-d1d48d484dd9-logs\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.053148 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87773a88-c22e-421e-8d36-d1d48d484dd9-scripts" (OuterVolumeSpecName: "scripts") pod "87773a88-c22e-421e-8d36-d1d48d484dd9" (UID: "87773a88-c22e-421e-8d36-d1d48d484dd9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.060285 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87773a88-c22e-421e-8d36-d1d48d484dd9-kube-api-access-6fwqx" (OuterVolumeSpecName: "kube-api-access-6fwqx") pod "87773a88-c22e-421e-8d36-d1d48d484dd9" (UID: "87773a88-c22e-421e-8d36-d1d48d484dd9"). InnerVolumeSpecName "kube-api-access-6fwqx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.060559 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "87773a88-c22e-421e-8d36-d1d48d484dd9" (UID: "87773a88-c22e-421e-8d36-d1d48d484dd9"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.106496 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87773a88-c22e-421e-8d36-d1d48d484dd9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "87773a88-c22e-421e-8d36-d1d48d484dd9" (UID: "87773a88-c22e-421e-8d36-d1d48d484dd9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.118466 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87773a88-c22e-421e-8d36-d1d48d484dd9-config-data" (OuterVolumeSpecName: "config-data") pod "87773a88-c22e-421e-8d36-d1d48d484dd9" (UID: "87773a88-c22e-421e-8d36-d1d48d484dd9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.154408 4719 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87773a88-c22e-421e-8d36-d1d48d484dd9-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.154475 4719 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.154671 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87773a88-c22e-421e-8d36-d1d48d484dd9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.154685 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87773a88-c22e-421e-8d36-d1d48d484dd9-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:16 crc 
kubenswrapper[4719]: I1009 15:36:16.154698 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fwqx\" (UniqueName: \"kubernetes.io/projected/87773a88-c22e-421e-8d36-d1d48d484dd9-kube-api-access-6fwqx\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.186772 4719 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.256371 4719 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.532995 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.597519 4719 generic.go:334] "Generic (PLEG): container finished" podID="a2205fae-acbe-4123-936d-ad78cd542565" containerID="d2dcc9015f72d339f90b0e01e06a39cf15ee638599453598e6e0077f826a49f9" exitCode=0 Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.597572 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jk6nr" event={"ID":"a2205fae-acbe-4123-936d-ad78cd542565","Type":"ContainerDied","Data":"d2dcc9015f72d339f90b0e01e06a39cf15ee638599453598e6e0077f826a49f9"} Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.599846 4719 generic.go:334] "Generic (PLEG): container finished" podID="08e04378-245e-4a13-b1de-f11cf96579ef" containerID="d6277e8a831c6da4e6159e08cafb95492961642c0693a9ebcd2a6bee847db126" exitCode=0 Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.599865 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gfc75" 
event={"ID":"08e04378-245e-4a13-b1de-f11cf96579ef","Type":"ContainerDied","Data":"d6277e8a831c6da4e6159e08cafb95492961642c0693a9ebcd2a6bee847db126"} Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.618977 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"08ea2228-1919-46f9-be38-4111d34aa3bc","Type":"ContainerDied","Data":"aea408ab20854e5a1044c820af9978351f77a0ff739b4cc6cbaf49060989778c"} Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.619023 4719 scope.go:117] "RemoveContainer" containerID="fe72c2b996fdfbf94d0a167f99b32cc52ce7fb82eec9f276b8220416818c6645" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.619124 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.634990 4719 generic.go:334] "Generic (PLEG): container finished" podID="75999b62-ce1b-4a9b-8507-c8af12441083" containerID="34ade78fde967d1fd91220099d799f1a4c62436fb1434170671876fc94b49948" exitCode=1 Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.635023 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"75999b62-ce1b-4a9b-8507-c8af12441083","Type":"ContainerDied","Data":"34ade78fde967d1fd91220099d799f1a4c62436fb1434170671876fc94b49948"} Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.635972 4719 scope.go:117] "RemoveContainer" containerID="34ade78fde967d1fd91220099d799f1a4c62436fb1434170671876fc94b49948" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.640070 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.640070 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"87773a88-c22e-421e-8d36-d1d48d484dd9","Type":"ContainerDied","Data":"e99f4c2e8079b8ecdc9938d4b6289cdff9763330e97d681ad8888affcff4979d"} Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.692194 4719 scope.go:117] "RemoveContainer" containerID="42745f3380988d9b63da764bcb030aa806763c179458fc3441184ccb4b6beffe" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.695414 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.709575 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.748825 4719 scope.go:117] "RemoveContainer" containerID="53ae5d4c5c1400f93ac89e63b8e946a262ec7e795cd040c2840dad93c5fef4f2" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.749794 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 15:36:16 crc kubenswrapper[4719]: E1009 15:36:16.750268 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbae14e7-c975-42ac-bad6-b5ad764a239b" containerName="horizon" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.750282 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbae14e7-c975-42ac-bad6-b5ad764a239b" containerName="horizon" Oct 09 15:36:16 crc kubenswrapper[4719]: E1009 15:36:16.750307 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87773a88-c22e-421e-8d36-d1d48d484dd9" containerName="glance-httpd" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.750315 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="87773a88-c22e-421e-8d36-d1d48d484dd9" 
containerName="glance-httpd" Oct 09 15:36:16 crc kubenswrapper[4719]: E1009 15:36:16.750338 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19fb157d-30e1-4432-9d62-0fa70d464148" containerName="horizon" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.750359 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="19fb157d-30e1-4432-9d62-0fa70d464148" containerName="horizon" Oct 09 15:36:16 crc kubenswrapper[4719]: E1009 15:36:16.750374 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87773a88-c22e-421e-8d36-d1d48d484dd9" containerName="glance-log" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.750381 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="87773a88-c22e-421e-8d36-d1d48d484dd9" containerName="glance-log" Oct 09 15:36:16 crc kubenswrapper[4719]: E1009 15:36:16.750391 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbae14e7-c975-42ac-bad6-b5ad764a239b" containerName="horizon-log" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.750398 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbae14e7-c975-42ac-bad6-b5ad764a239b" containerName="horizon-log" Oct 09 15:36:16 crc kubenswrapper[4719]: E1009 15:36:16.750408 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08ea2228-1919-46f9-be38-4111d34aa3bc" containerName="glance-log" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.750414 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="08ea2228-1919-46f9-be38-4111d34aa3bc" containerName="glance-log" Oct 09 15:36:16 crc kubenswrapper[4719]: E1009 15:36:16.750436 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08ea2228-1919-46f9-be38-4111d34aa3bc" containerName="glance-httpd" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.750443 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="08ea2228-1919-46f9-be38-4111d34aa3bc" containerName="glance-httpd" Oct 09 15:36:16 crc 
kubenswrapper[4719]: E1009 15:36:16.750456 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19fb157d-30e1-4432-9d62-0fa70d464148" containerName="horizon-log" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.750465 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="19fb157d-30e1-4432-9d62-0fa70d464148" containerName="horizon-log" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.750695 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="08ea2228-1919-46f9-be38-4111d34aa3bc" containerName="glance-httpd" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.750713 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="87773a88-c22e-421e-8d36-d1d48d484dd9" containerName="glance-httpd" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.750726 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="87773a88-c22e-421e-8d36-d1d48d484dd9" containerName="glance-log" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.750743 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="19fb157d-30e1-4432-9d62-0fa70d464148" containerName="horizon" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.750759 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="19fb157d-30e1-4432-9d62-0fa70d464148" containerName="horizon-log" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.750770 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="08ea2228-1919-46f9-be38-4111d34aa3bc" containerName="glance-log" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.750783 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbae14e7-c975-42ac-bad6-b5ad764a239b" containerName="horizon-log" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.750795 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbae14e7-c975-42ac-bad6-b5ad764a239b" containerName="horizon" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 
15:36:16.752099 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.755602 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.756492 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-zsmzp" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.756700 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.756830 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.774599 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.781301 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.787829 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.793189 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.793257 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.794813 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.797467 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.799620 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.804543 4719 scope.go:117] "RemoveContainer" containerID="7fbd64f6742148ed0849f0f6ee366f87b7b4453d00a80e327f56e9c1a6e4fade" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.804749 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.805265 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.875966 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b11b82a-b620-435a-9274-fb3419c35a72-scripts\") pod \"glance-default-external-api-0\" (UID: \"0b11b82a-b620-435a-9274-fb3419c35a72\") " pod="openstack/glance-default-external-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.876318 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"0b11b82a-b620-435a-9274-fb3419c35a72\") " pod="openstack/glance-default-external-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.876365 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0023a9b5-248d-48cc-8560-b109ff59fc04-logs\") pod \"glance-default-internal-api-0\" (UID: \"0023a9b5-248d-48cc-8560-b109ff59fc04\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 
15:36:16.876417 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0023a9b5-248d-48cc-8560-b109ff59fc04-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0023a9b5-248d-48cc-8560-b109ff59fc04\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.876443 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkjqh\" (UniqueName: \"kubernetes.io/projected/0023a9b5-248d-48cc-8560-b109ff59fc04-kube-api-access-bkjqh\") pod \"glance-default-internal-api-0\" (UID: \"0023a9b5-248d-48cc-8560-b109ff59fc04\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.876488 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0023a9b5-248d-48cc-8560-b109ff59fc04-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0023a9b5-248d-48cc-8560-b109ff59fc04\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.876515 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0023a9b5-248d-48cc-8560-b109ff59fc04-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0023a9b5-248d-48cc-8560-b109ff59fc04\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.876543 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b11b82a-b620-435a-9274-fb3419c35a72-logs\") pod \"glance-default-external-api-0\" (UID: \"0b11b82a-b620-435a-9274-fb3419c35a72\") " pod="openstack/glance-default-external-api-0" Oct 09 15:36:16 
crc kubenswrapper[4719]: I1009 15:36:16.876572 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z568v\" (UniqueName: \"kubernetes.io/projected/0b11b82a-b620-435a-9274-fb3419c35a72-kube-api-access-z568v\") pod \"glance-default-external-api-0\" (UID: \"0b11b82a-b620-435a-9274-fb3419c35a72\") " pod="openstack/glance-default-external-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.876607 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b11b82a-b620-435a-9274-fb3419c35a72-config-data\") pod \"glance-default-external-api-0\" (UID: \"0b11b82a-b620-435a-9274-fb3419c35a72\") " pod="openstack/glance-default-external-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.876642 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b11b82a-b620-435a-9274-fb3419c35a72-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0b11b82a-b620-435a-9274-fb3419c35a72\") " pod="openstack/glance-default-external-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.876671 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b11b82a-b620-435a-9274-fb3419c35a72-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0b11b82a-b620-435a-9274-fb3419c35a72\") " pod="openstack/glance-default-external-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.876703 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0b11b82a-b620-435a-9274-fb3419c35a72-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0b11b82a-b620-435a-9274-fb3419c35a72\") " 
pod="openstack/glance-default-external-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.876743 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0023a9b5-248d-48cc-8560-b109ff59fc04-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0023a9b5-248d-48cc-8560-b109ff59fc04\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.876775 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0023a9b5-248d-48cc-8560-b109ff59fc04-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0023a9b5-248d-48cc-8560-b109ff59fc04\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.876801 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"0023a9b5-248d-48cc-8560-b109ff59fc04\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.978587 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b11b82a-b620-435a-9274-fb3419c35a72-scripts\") pod \"glance-default-external-api-0\" (UID: \"0b11b82a-b620-435a-9274-fb3419c35a72\") " pod="openstack/glance-default-external-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.978632 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"0b11b82a-b620-435a-9274-fb3419c35a72\") " pod="openstack/glance-default-external-api-0" Oct 09 15:36:16 
crc kubenswrapper[4719]: I1009 15:36:16.978654 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0023a9b5-248d-48cc-8560-b109ff59fc04-logs\") pod \"glance-default-internal-api-0\" (UID: \"0023a9b5-248d-48cc-8560-b109ff59fc04\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.978698 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0023a9b5-248d-48cc-8560-b109ff59fc04-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0023a9b5-248d-48cc-8560-b109ff59fc04\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.978716 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkjqh\" (UniqueName: \"kubernetes.io/projected/0023a9b5-248d-48cc-8560-b109ff59fc04-kube-api-access-bkjqh\") pod \"glance-default-internal-api-0\" (UID: \"0023a9b5-248d-48cc-8560-b109ff59fc04\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.978766 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0023a9b5-248d-48cc-8560-b109ff59fc04-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0023a9b5-248d-48cc-8560-b109ff59fc04\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.978784 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0023a9b5-248d-48cc-8560-b109ff59fc04-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0023a9b5-248d-48cc-8560-b109ff59fc04\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.978806 4719 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b11b82a-b620-435a-9274-fb3419c35a72-logs\") pod \"glance-default-external-api-0\" (UID: \"0b11b82a-b620-435a-9274-fb3419c35a72\") " pod="openstack/glance-default-external-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.978829 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z568v\" (UniqueName: \"kubernetes.io/projected/0b11b82a-b620-435a-9274-fb3419c35a72-kube-api-access-z568v\") pod \"glance-default-external-api-0\" (UID: \"0b11b82a-b620-435a-9274-fb3419c35a72\") " pod="openstack/glance-default-external-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.978857 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b11b82a-b620-435a-9274-fb3419c35a72-config-data\") pod \"glance-default-external-api-0\" (UID: \"0b11b82a-b620-435a-9274-fb3419c35a72\") " pod="openstack/glance-default-external-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.978884 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b11b82a-b620-435a-9274-fb3419c35a72-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0b11b82a-b620-435a-9274-fb3419c35a72\") " pod="openstack/glance-default-external-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.978907 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b11b82a-b620-435a-9274-fb3419c35a72-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0b11b82a-b620-435a-9274-fb3419c35a72\") " pod="openstack/glance-default-external-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.978931 4719 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0b11b82a-b620-435a-9274-fb3419c35a72-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0b11b82a-b620-435a-9274-fb3419c35a72\") " pod="openstack/glance-default-external-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.978980 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0023a9b5-248d-48cc-8560-b109ff59fc04-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0023a9b5-248d-48cc-8560-b109ff59fc04\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.979035 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0023a9b5-248d-48cc-8560-b109ff59fc04-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0023a9b5-248d-48cc-8560-b109ff59fc04\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.979055 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"0023a9b5-248d-48cc-8560-b109ff59fc04\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.979278 4719 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"0023a9b5-248d-48cc-8560-b109ff59fc04\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.980166 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/0b11b82a-b620-435a-9274-fb3419c35a72-logs\") pod \"glance-default-external-api-0\" (UID: \"0b11b82a-b620-435a-9274-fb3419c35a72\") " pod="openstack/glance-default-external-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.980401 4719 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"0b11b82a-b620-435a-9274-fb3419c35a72\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.980703 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0b11b82a-b620-435a-9274-fb3419c35a72-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0b11b82a-b620-435a-9274-fb3419c35a72\") " pod="openstack/glance-default-external-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.981061 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0023a9b5-248d-48cc-8560-b109ff59fc04-logs\") pod \"glance-default-internal-api-0\" (UID: \"0023a9b5-248d-48cc-8560-b109ff59fc04\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.981875 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0023a9b5-248d-48cc-8560-b109ff59fc04-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0023a9b5-248d-48cc-8560-b109ff59fc04\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.988071 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0023a9b5-248d-48cc-8560-b109ff59fc04-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"0023a9b5-248d-48cc-8560-b109ff59fc04\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.989224 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b11b82a-b620-435a-9274-fb3419c35a72-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0b11b82a-b620-435a-9274-fb3419c35a72\") " pod="openstack/glance-default-external-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.993338 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b11b82a-b620-435a-9274-fb3419c35a72-config-data\") pod \"glance-default-external-api-0\" (UID: \"0b11b82a-b620-435a-9274-fb3419c35a72\") " pod="openstack/glance-default-external-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.994260 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0023a9b5-248d-48cc-8560-b109ff59fc04-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0023a9b5-248d-48cc-8560-b109ff59fc04\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.997028 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b11b82a-b620-435a-9274-fb3419c35a72-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0b11b82a-b620-435a-9274-fb3419c35a72\") " pod="openstack/glance-default-external-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.997893 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b11b82a-b620-435a-9274-fb3419c35a72-scripts\") pod \"glance-default-external-api-0\" (UID: \"0b11b82a-b620-435a-9274-fb3419c35a72\") " 
pod="openstack/glance-default-external-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.999328 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0023a9b5-248d-48cc-8560-b109ff59fc04-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0023a9b5-248d-48cc-8560-b109ff59fc04\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.999342 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z568v\" (UniqueName: \"kubernetes.io/projected/0b11b82a-b620-435a-9274-fb3419c35a72-kube-api-access-z568v\") pod \"glance-default-external-api-0\" (UID: \"0b11b82a-b620-435a-9274-fb3419c35a72\") " pod="openstack/glance-default-external-api-0" Oct 09 15:36:16 crc kubenswrapper[4719]: I1009 15:36:16.999335 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0023a9b5-248d-48cc-8560-b109ff59fc04-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0023a9b5-248d-48cc-8560-b109ff59fc04\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:36:17 crc kubenswrapper[4719]: I1009 15:36:17.000063 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkjqh\" (UniqueName: \"kubernetes.io/projected/0023a9b5-248d-48cc-8560-b109ff59fc04-kube-api-access-bkjqh\") pod \"glance-default-internal-api-0\" (UID: \"0023a9b5-248d-48cc-8560-b109ff59fc04\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:36:17 crc kubenswrapper[4719]: I1009 15:36:17.024285 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"0b11b82a-b620-435a-9274-fb3419c35a72\") " pod="openstack/glance-default-external-api-0" Oct 09 15:36:17 crc kubenswrapper[4719]: I1009 
15:36:17.025500 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"0023a9b5-248d-48cc-8560-b109ff59fc04\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:36:17 crc kubenswrapper[4719]: I1009 15:36:17.081589 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 15:36:17 crc kubenswrapper[4719]: I1009 15:36:17.173816 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 15:36:17 crc kubenswrapper[4719]: I1009 15:36:17.184705 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08ea2228-1919-46f9-be38-4111d34aa3bc" path="/var/lib/kubelet/pods/08ea2228-1919-46f9-be38-4111d34aa3bc/volumes" Oct 09 15:36:17 crc kubenswrapper[4719]: I1009 15:36:17.186744 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19fb157d-30e1-4432-9d62-0fa70d464148" path="/var/lib/kubelet/pods/19fb157d-30e1-4432-9d62-0fa70d464148/volumes" Oct 09 15:36:17 crc kubenswrapper[4719]: I1009 15:36:17.194757 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87773a88-c22e-421e-8d36-d1d48d484dd9" path="/var/lib/kubelet/pods/87773a88-c22e-421e-8d36-d1d48d484dd9/volumes" Oct 09 15:36:17 crc kubenswrapper[4719]: I1009 15:36:17.652199 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"75999b62-ce1b-4a9b-8507-c8af12441083","Type":"ContainerStarted","Data":"97c064a8c41ba55bab5a23f641b5b2c165c7994c2526df1d2692173bb8e72c4b"} Oct 09 15:36:17 crc kubenswrapper[4719]: I1009 15:36:17.654798 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 15:36:17 crc kubenswrapper[4719]: I1009 15:36:17.781789 4719 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 15:36:18 crc kubenswrapper[4719]: I1009 15:36:18.269977 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d4b4465c7-xmnmh" Oct 09 15:36:18 crc kubenswrapper[4719]: I1009 15:36:18.331876 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bd4bb89d9-649p9"] Oct 09 15:36:18 crc kubenswrapper[4719]: I1009 15:36:18.332159 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7bd4bb89d9-649p9" podUID="732560f4-c5be-46ab-9266-d59d4fe1a07d" containerName="dnsmasq-dns" containerID="cri-o://6e47fa6062535038986c21a5044357e21d36839618c053e1c5ec902baa5f6aaf" gracePeriod=10 Oct 09 15:36:18 crc kubenswrapper[4719]: I1009 15:36:18.667997 4719 generic.go:334] "Generic (PLEG): container finished" podID="732560f4-c5be-46ab-9266-d59d4fe1a07d" containerID="6e47fa6062535038986c21a5044357e21d36839618c053e1c5ec902baa5f6aaf" exitCode=0 Oct 09 15:36:18 crc kubenswrapper[4719]: I1009 15:36:18.668041 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd4bb89d9-649p9" event={"ID":"732560f4-c5be-46ab-9266-d59d4fe1a07d","Type":"ContainerDied","Data":"6e47fa6062535038986c21a5044357e21d36839618c053e1c5ec902baa5f6aaf"} Oct 09 15:36:19 crc kubenswrapper[4719]: I1009 15:36:19.024107 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7bd4bb89d9-649p9" podUID="732560f4-c5be-46ab-9266-d59d4fe1a07d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: connect: connection refused" Oct 09 15:36:19 crc kubenswrapper[4719]: I1009 15:36:19.668278 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Oct 09 15:36:19 crc kubenswrapper[4719]: I1009 15:36:19.691738 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/watcher-api-0" Oct 09 15:36:19 crc kubenswrapper[4719]: I1009 15:36:19.699223 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Oct 09 15:36:19 crc kubenswrapper[4719]: I1009 15:36:19.714707 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Oct 09 15:36:19 crc kubenswrapper[4719]: E1009 15:36:19.876635 4719 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75999b62_ce1b_4a9b_8507_c8af12441083.slice/crio-97c064a8c41ba55bab5a23f641b5b2c165c7994c2526df1d2692173bb8e72c4b.scope\": RecentStats: unable to find data in memory cache]" Oct 09 15:36:20 crc kubenswrapper[4719]: I1009 15:36:20.709326 4719 generic.go:334] "Generic (PLEG): container finished" podID="75999b62-ce1b-4a9b-8507-c8af12441083" containerID="97c064a8c41ba55bab5a23f641b5b2c165c7994c2526df1d2692173bb8e72c4b" exitCode=1 Oct 09 15:36:20 crc kubenswrapper[4719]: I1009 15:36:20.709389 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"75999b62-ce1b-4a9b-8507-c8af12441083","Type":"ContainerDied","Data":"97c064a8c41ba55bab5a23f641b5b2c165c7994c2526df1d2692173bb8e72c4b"} Oct 09 15:36:20 crc kubenswrapper[4719]: I1009 15:36:20.709867 4719 scope.go:117] "RemoveContainer" containerID="34ade78fde967d1fd91220099d799f1a4c62436fb1434170671876fc94b49948" Oct 09 15:36:20 crc kubenswrapper[4719]: I1009 15:36:20.710239 4719 scope.go:117] "RemoveContainer" containerID="97c064a8c41ba55bab5a23f641b5b2c165c7994c2526df1d2692173bb8e72c4b" Oct 09 15:36:20 crc kubenswrapper[4719]: E1009 15:36:20.710501 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine 
pod=watcher-decision-engine-0_openstack(75999b62-ce1b-4a9b-8507-c8af12441083)\"" pod="openstack/watcher-decision-engine-0" podUID="75999b62-ce1b-4a9b-8507-c8af12441083" Oct 09 15:36:20 crc kubenswrapper[4719]: I1009 15:36:20.718210 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Oct 09 15:36:20 crc kubenswrapper[4719]: I1009 15:36:20.741410 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Oct 09 15:36:21 crc kubenswrapper[4719]: I1009 15:36:21.721710 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gfc75" event={"ID":"08e04378-245e-4a13-b1de-f11cf96579ef","Type":"ContainerDied","Data":"06be8d9bc5f128936eec343d64db74252ddf326eecc6c46a776759ecc3f0fb30"} Oct 09 15:36:21 crc kubenswrapper[4719]: I1009 15:36:21.721749 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06be8d9bc5f128936eec343d64db74252ddf326eecc6c46a776759ecc3f0fb30" Oct 09 15:36:21 crc kubenswrapper[4719]: I1009 15:36:21.725121 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jk6nr" event={"ID":"a2205fae-acbe-4123-936d-ad78cd542565","Type":"ContainerDied","Data":"dbee069014d65bc419765e74aa622ff8ad46a071f900e757d0c6e881d80a2f46"} Oct 09 15:36:21 crc kubenswrapper[4719]: I1009 15:36:21.725175 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbee069014d65bc419765e74aa622ff8ad46a071f900e757d0c6e881d80a2f46" Oct 09 15:36:21 crc kubenswrapper[4719]: I1009 15:36:21.726681 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0023a9b5-248d-48cc-8560-b109ff59fc04","Type":"ContainerStarted","Data":"1c101a108339ad464f44da0bd31fd5587908ecb0cf8f4001dfc1e0070adee85d"} Oct 09 15:36:21 crc kubenswrapper[4719]: I1009 15:36:21.796318 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-gfc75" Oct 09 15:36:21 crc kubenswrapper[4719]: I1009 15:36:21.806433 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-jk6nr" Oct 09 15:36:21 crc kubenswrapper[4719]: I1009 15:36:21.893447 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cc4cj\" (UniqueName: \"kubernetes.io/projected/a2205fae-acbe-4123-936d-ad78cd542565-kube-api-access-cc4cj\") pod \"a2205fae-acbe-4123-936d-ad78cd542565\" (UID: \"a2205fae-acbe-4123-936d-ad78cd542565\") " Oct 09 15:36:21 crc kubenswrapper[4719]: I1009 15:36:21.893554 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a2205fae-acbe-4123-936d-ad78cd542565-config\") pod \"a2205fae-acbe-4123-936d-ad78cd542565\" (UID: \"a2205fae-acbe-4123-936d-ad78cd542565\") " Oct 09 15:36:21 crc kubenswrapper[4719]: I1009 15:36:21.893632 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nx4ht\" (UniqueName: \"kubernetes.io/projected/08e04378-245e-4a13-b1de-f11cf96579ef-kube-api-access-nx4ht\") pod \"08e04378-245e-4a13-b1de-f11cf96579ef\" (UID: \"08e04378-245e-4a13-b1de-f11cf96579ef\") " Oct 09 15:36:21 crc kubenswrapper[4719]: I1009 15:36:21.893659 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/08e04378-245e-4a13-b1de-f11cf96579ef-db-sync-config-data\") pod \"08e04378-245e-4a13-b1de-f11cf96579ef\" (UID: \"08e04378-245e-4a13-b1de-f11cf96579ef\") " Oct 09 15:36:21 crc kubenswrapper[4719]: I1009 15:36:21.893680 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08e04378-245e-4a13-b1de-f11cf96579ef-combined-ca-bundle\") pod \"08e04378-245e-4a13-b1de-f11cf96579ef\" (UID: 
\"08e04378-245e-4a13-b1de-f11cf96579ef\") " Oct 09 15:36:21 crc kubenswrapper[4719]: I1009 15:36:21.893727 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2205fae-acbe-4123-936d-ad78cd542565-combined-ca-bundle\") pod \"a2205fae-acbe-4123-936d-ad78cd542565\" (UID: \"a2205fae-acbe-4123-936d-ad78cd542565\") " Oct 09 15:36:21 crc kubenswrapper[4719]: I1009 15:36:21.898559 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08e04378-245e-4a13-b1de-f11cf96579ef-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "08e04378-245e-4a13-b1de-f11cf96579ef" (UID: "08e04378-245e-4a13-b1de-f11cf96579ef"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:36:21 crc kubenswrapper[4719]: I1009 15:36:21.899232 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08e04378-245e-4a13-b1de-f11cf96579ef-kube-api-access-nx4ht" (OuterVolumeSpecName: "kube-api-access-nx4ht") pod "08e04378-245e-4a13-b1de-f11cf96579ef" (UID: "08e04378-245e-4a13-b1de-f11cf96579ef"). InnerVolumeSpecName "kube-api-access-nx4ht". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:36:21 crc kubenswrapper[4719]: I1009 15:36:21.899849 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2205fae-acbe-4123-936d-ad78cd542565-kube-api-access-cc4cj" (OuterVolumeSpecName: "kube-api-access-cc4cj") pod "a2205fae-acbe-4123-936d-ad78cd542565" (UID: "a2205fae-acbe-4123-936d-ad78cd542565"). InnerVolumeSpecName "kube-api-access-cc4cj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:36:21 crc kubenswrapper[4719]: I1009 15:36:21.929241 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08e04378-245e-4a13-b1de-f11cf96579ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08e04378-245e-4a13-b1de-f11cf96579ef" (UID: "08e04378-245e-4a13-b1de-f11cf96579ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:36:21 crc kubenswrapper[4719]: I1009 15:36:21.930370 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2205fae-acbe-4123-936d-ad78cd542565-config" (OuterVolumeSpecName: "config") pod "a2205fae-acbe-4123-936d-ad78cd542565" (UID: "a2205fae-acbe-4123-936d-ad78cd542565"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:36:21 crc kubenswrapper[4719]: I1009 15:36:21.938907 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2205fae-acbe-4123-936d-ad78cd542565-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a2205fae-acbe-4123-936d-ad78cd542565" (UID: "a2205fae-acbe-4123-936d-ad78cd542565"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:36:21 crc kubenswrapper[4719]: I1009 15:36:21.996296 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nx4ht\" (UniqueName: \"kubernetes.io/projected/08e04378-245e-4a13-b1de-f11cf96579ef-kube-api-access-nx4ht\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:21 crc kubenswrapper[4719]: I1009 15:36:21.996541 4719 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/08e04378-245e-4a13-b1de-f11cf96579ef-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:21 crc kubenswrapper[4719]: I1009 15:36:21.996663 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08e04378-245e-4a13-b1de-f11cf96579ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:21 crc kubenswrapper[4719]: I1009 15:36:21.996738 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2205fae-acbe-4123-936d-ad78cd542565-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:21 crc kubenswrapper[4719]: I1009 15:36:21.996809 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cc4cj\" (UniqueName: \"kubernetes.io/projected/a2205fae-acbe-4123-936d-ad78cd542565-kube-api-access-cc4cj\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:21 crc kubenswrapper[4719]: I1009 15:36:21.996917 4719 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a2205fae-acbe-4123-936d-ad78cd542565-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:22 crc kubenswrapper[4719]: I1009 15:36:22.386031 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7457564986-k28cv" podUID="d16f8bb5-9ca5-4042-ae67-756c79d00217" containerName="horizon" probeResult="failure" output="Get 
\"https://10.217.0.161:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.161:8443: connect: connection refused" Oct 09 15:36:22 crc kubenswrapper[4719]: I1009 15:36:22.679725 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bd4bb89d9-649p9" Oct 09 15:36:22 crc kubenswrapper[4719]: I1009 15:36:22.770669 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0b11b82a-b620-435a-9274-fb3419c35a72","Type":"ContainerStarted","Data":"488295aac0e63fc321f1a177d8890d16f0c359fef26097a80b69ff56bc60554b"} Oct 09 15:36:22 crc kubenswrapper[4719]: I1009 15:36:22.774102 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd4bb89d9-649p9" event={"ID":"732560f4-c5be-46ab-9266-d59d4fe1a07d","Type":"ContainerDied","Data":"1259378a83e9331de5a7ba5d07ef0bbcb12f764567c74f67cbde44e3b000038d"} Oct 09 15:36:22 crc kubenswrapper[4719]: I1009 15:36:22.774152 4719 scope.go:117] "RemoveContainer" containerID="6e47fa6062535038986c21a5044357e21d36839618c053e1c5ec902baa5f6aaf" Oct 09 15:36:22 crc kubenswrapper[4719]: I1009 15:36:22.774250 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bd4bb89d9-649p9" Oct 09 15:36:22 crc kubenswrapper[4719]: E1009 15:36:22.777238 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="590f0bbf-4518-4aa6-a71f-1f28b5f4e02a" Oct 09 15:36:22 crc kubenswrapper[4719]: I1009 15:36:22.790303 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-gfc75" Oct 09 15:36:22 crc kubenswrapper[4719]: I1009 15:36:22.792597 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-jk6nr" Oct 09 15:36:22 crc kubenswrapper[4719]: I1009 15:36:22.800561 4719 scope.go:117] "RemoveContainer" containerID="1a75b05f41b40aa4f726787413be2797ea0145036c5b1bab14a954912931dd23" Oct 09 15:36:22 crc kubenswrapper[4719]: I1009 15:36:22.821194 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/732560f4-c5be-46ab-9266-d59d4fe1a07d-ovsdbserver-nb\") pod \"732560f4-c5be-46ab-9266-d59d4fe1a07d\" (UID: \"732560f4-c5be-46ab-9266-d59d4fe1a07d\") " Oct 09 15:36:22 crc kubenswrapper[4719]: I1009 15:36:22.821292 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/732560f4-c5be-46ab-9266-d59d4fe1a07d-dns-swift-storage-0\") pod \"732560f4-c5be-46ab-9266-d59d4fe1a07d\" (UID: \"732560f4-c5be-46ab-9266-d59d4fe1a07d\") " Oct 09 15:36:22 crc kubenswrapper[4719]: I1009 15:36:22.821322 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/732560f4-c5be-46ab-9266-d59d4fe1a07d-config\") pod \"732560f4-c5be-46ab-9266-d59d4fe1a07d\" (UID: \"732560f4-c5be-46ab-9266-d59d4fe1a07d\") " Oct 09 15:36:22 crc kubenswrapper[4719]: I1009 15:36:22.821364 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/732560f4-c5be-46ab-9266-d59d4fe1a07d-ovsdbserver-sb\") pod \"732560f4-c5be-46ab-9266-d59d4fe1a07d\" (UID: \"732560f4-c5be-46ab-9266-d59d4fe1a07d\") " Oct 09 15:36:22 crc kubenswrapper[4719]: I1009 15:36:22.821440 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/732560f4-c5be-46ab-9266-d59d4fe1a07d-dns-svc\") pod \"732560f4-c5be-46ab-9266-d59d4fe1a07d\" (UID: \"732560f4-c5be-46ab-9266-d59d4fe1a07d\") " Oct 09 
15:36:22 crc kubenswrapper[4719]: I1009 15:36:22.821573 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjrhl\" (UniqueName: \"kubernetes.io/projected/732560f4-c5be-46ab-9266-d59d4fe1a07d-kube-api-access-qjrhl\") pod \"732560f4-c5be-46ab-9266-d59d4fe1a07d\" (UID: \"732560f4-c5be-46ab-9266-d59d4fe1a07d\") " Oct 09 15:36:22 crc kubenswrapper[4719]: I1009 15:36:22.865115 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/732560f4-c5be-46ab-9266-d59d4fe1a07d-kube-api-access-qjrhl" (OuterVolumeSpecName: "kube-api-access-qjrhl") pod "732560f4-c5be-46ab-9266-d59d4fe1a07d" (UID: "732560f4-c5be-46ab-9266-d59d4fe1a07d"). InnerVolumeSpecName "kube-api-access-qjrhl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:36:22 crc kubenswrapper[4719]: I1009 15:36:22.901397 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/732560f4-c5be-46ab-9266-d59d4fe1a07d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "732560f4-c5be-46ab-9266-d59d4fe1a07d" (UID: "732560f4-c5be-46ab-9266-d59d4fe1a07d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:36:22 crc kubenswrapper[4719]: I1009 15:36:22.901469 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/732560f4-c5be-46ab-9266-d59d4fe1a07d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "732560f4-c5be-46ab-9266-d59d4fe1a07d" (UID: "732560f4-c5be-46ab-9266-d59d4fe1a07d"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:36:22 crc kubenswrapper[4719]: I1009 15:36:22.911503 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/732560f4-c5be-46ab-9266-d59d4fe1a07d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "732560f4-c5be-46ab-9266-d59d4fe1a07d" (UID: "732560f4-c5be-46ab-9266-d59d4fe1a07d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:36:22 crc kubenswrapper[4719]: I1009 15:36:22.931441 4719 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/732560f4-c5be-46ab-9266-d59d4fe1a07d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:22 crc kubenswrapper[4719]: I1009 15:36:22.931486 4719 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/732560f4-c5be-46ab-9266-d59d4fe1a07d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:22 crc kubenswrapper[4719]: I1009 15:36:22.931501 4719 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/732560f4-c5be-46ab-9266-d59d4fe1a07d-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:22 crc kubenswrapper[4719]: I1009 15:36:22.931512 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjrhl\" (UniqueName: \"kubernetes.io/projected/732560f4-c5be-46ab-9266-d59d4fe1a07d-kube-api-access-qjrhl\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.022070 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/732560f4-c5be-46ab-9266-d59d4fe1a07d-config" (OuterVolumeSpecName: "config") pod "732560f4-c5be-46ab-9266-d59d4fe1a07d" (UID: "732560f4-c5be-46ab-9266-d59d4fe1a07d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.040690 4719 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/732560f4-c5be-46ab-9266-d59d4fe1a07d-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.053004 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/732560f4-c5be-46ab-9266-d59d4fe1a07d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "732560f4-c5be-46ab-9266-d59d4fe1a07d" (UID: "732560f4-c5be-46ab-9266-d59d4fe1a07d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.059494 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7ccff5b764-rskpw"] Oct 09 15:36:23 crc kubenswrapper[4719]: E1009 15:36:23.060052 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2205fae-acbe-4123-936d-ad78cd542565" containerName="neutron-db-sync" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.060073 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2205fae-acbe-4123-936d-ad78cd542565" containerName="neutron-db-sync" Oct 09 15:36:23 crc kubenswrapper[4719]: E1009 15:36:23.060091 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="732560f4-c5be-46ab-9266-d59d4fe1a07d" containerName="init" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.060107 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="732560f4-c5be-46ab-9266-d59d4fe1a07d" containerName="init" Oct 09 15:36:23 crc kubenswrapper[4719]: E1009 15:36:23.060122 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08e04378-245e-4a13-b1de-f11cf96579ef" containerName="barbican-db-sync" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.060130 4719 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="08e04378-245e-4a13-b1de-f11cf96579ef" containerName="barbican-db-sync" Oct 09 15:36:23 crc kubenswrapper[4719]: E1009 15:36:23.060155 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="732560f4-c5be-46ab-9266-d59d4fe1a07d" containerName="dnsmasq-dns" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.060163 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="732560f4-c5be-46ab-9266-d59d4fe1a07d" containerName="dnsmasq-dns" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.060530 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="08e04378-245e-4a13-b1de-f11cf96579ef" containerName="barbican-db-sync" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.060565 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2205fae-acbe-4123-936d-ad78cd542565" containerName="neutron-db-sync" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.060578 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="732560f4-c5be-46ab-9266-d59d4fe1a07d" containerName="dnsmasq-dns" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.061761 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7ccff5b764-rskpw" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.067027 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.067909 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-75wcg" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.070826 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.075110 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-68857c4d7f-ns5gc"] Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.077595 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-68857c4d7f-ns5gc" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.083210 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.084204 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-68857c4d7f-ns5gc"] Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.092486 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7ccff5b764-rskpw"] Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.142323 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/276c7ea1-10eb-4a7d-9eb1-50c62518b5b4-logs\") pod \"barbican-keystone-listener-7ccff5b764-rskpw\" (UID: \"276c7ea1-10eb-4a7d-9eb1-50c62518b5b4\") " pod="openstack/barbican-keystone-listener-7ccff5b764-rskpw" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.146941 4719 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs9l4\" (UniqueName: \"kubernetes.io/projected/276c7ea1-10eb-4a7d-9eb1-50c62518b5b4-kube-api-access-xs9l4\") pod \"barbican-keystone-listener-7ccff5b764-rskpw\" (UID: \"276c7ea1-10eb-4a7d-9eb1-50c62518b5b4\") " pod="openstack/barbican-keystone-listener-7ccff5b764-rskpw" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.147131 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/276c7ea1-10eb-4a7d-9eb1-50c62518b5b4-config-data\") pod \"barbican-keystone-listener-7ccff5b764-rskpw\" (UID: \"276c7ea1-10eb-4a7d-9eb1-50c62518b5b4\") " pod="openstack/barbican-keystone-listener-7ccff5b764-rskpw" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.147235 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/276c7ea1-10eb-4a7d-9eb1-50c62518b5b4-combined-ca-bundle\") pod \"barbican-keystone-listener-7ccff5b764-rskpw\" (UID: \"276c7ea1-10eb-4a7d-9eb1-50c62518b5b4\") " pod="openstack/barbican-keystone-listener-7ccff5b764-rskpw" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.147270 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/276c7ea1-10eb-4a7d-9eb1-50c62518b5b4-config-data-custom\") pod \"barbican-keystone-listener-7ccff5b764-rskpw\" (UID: \"276c7ea1-10eb-4a7d-9eb1-50c62518b5b4\") " pod="openstack/barbican-keystone-listener-7ccff5b764-rskpw" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.147676 4719 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/732560f4-c5be-46ab-9266-d59d4fe1a07d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 
15:36:23.159073 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68884fc79c-l59v5"] Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.163940 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68884fc79c-l59v5" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.258474 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/276c7ea1-10eb-4a7d-9eb1-50c62518b5b4-logs\") pod \"barbican-keystone-listener-7ccff5b764-rskpw\" (UID: \"276c7ea1-10eb-4a7d-9eb1-50c62518b5b4\") " pod="openstack/barbican-keystone-listener-7ccff5b764-rskpw" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.258609 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs9l4\" (UniqueName: \"kubernetes.io/projected/276c7ea1-10eb-4a7d-9eb1-50c62518b5b4-kube-api-access-xs9l4\") pod \"barbican-keystone-listener-7ccff5b764-rskpw\" (UID: \"276c7ea1-10eb-4a7d-9eb1-50c62518b5b4\") " pod="openstack/barbican-keystone-listener-7ccff5b764-rskpw" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.258675 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df7732ab-4355-4061-9994-87f7ac7e4dd9-config\") pod \"dnsmasq-dns-68884fc79c-l59v5\" (UID: \"df7732ab-4355-4061-9994-87f7ac7e4dd9\") " pod="openstack/dnsmasq-dns-68884fc79c-l59v5" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.258709 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84cbb\" (UniqueName: \"kubernetes.io/projected/e305acce-34be-4503-b643-b60e4201ecfa-kube-api-access-84cbb\") pod \"barbican-worker-68857c4d7f-ns5gc\" (UID: \"e305acce-34be-4503-b643-b60e4201ecfa\") " pod="openstack/barbican-worker-68857c4d7f-ns5gc" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 
15:36:23.258751 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e305acce-34be-4503-b643-b60e4201ecfa-config-data\") pod \"barbican-worker-68857c4d7f-ns5gc\" (UID: \"e305acce-34be-4503-b643-b60e4201ecfa\") " pod="openstack/barbican-worker-68857c4d7f-ns5gc" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.258810 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e305acce-34be-4503-b643-b60e4201ecfa-config-data-custom\") pod \"barbican-worker-68857c4d7f-ns5gc\" (UID: \"e305acce-34be-4503-b643-b60e4201ecfa\") " pod="openstack/barbican-worker-68857c4d7f-ns5gc" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.258843 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e305acce-34be-4503-b643-b60e4201ecfa-combined-ca-bundle\") pod \"barbican-worker-68857c4d7f-ns5gc\" (UID: \"e305acce-34be-4503-b643-b60e4201ecfa\") " pod="openstack/barbican-worker-68857c4d7f-ns5gc" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.258884 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/276c7ea1-10eb-4a7d-9eb1-50c62518b5b4-config-data\") pod \"barbican-keystone-listener-7ccff5b764-rskpw\" (UID: \"276c7ea1-10eb-4a7d-9eb1-50c62518b5b4\") " pod="openstack/barbican-keystone-listener-7ccff5b764-rskpw" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.259010 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzc55\" (UniqueName: \"kubernetes.io/projected/df7732ab-4355-4061-9994-87f7ac7e4dd9-kube-api-access-qzc55\") pod \"dnsmasq-dns-68884fc79c-l59v5\" (UID: \"df7732ab-4355-4061-9994-87f7ac7e4dd9\") " 
pod="openstack/dnsmasq-dns-68884fc79c-l59v5" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.259045 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/276c7ea1-10eb-4a7d-9eb1-50c62518b5b4-combined-ca-bundle\") pod \"barbican-keystone-listener-7ccff5b764-rskpw\" (UID: \"276c7ea1-10eb-4a7d-9eb1-50c62518b5b4\") " pod="openstack/barbican-keystone-listener-7ccff5b764-rskpw" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.259086 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/276c7ea1-10eb-4a7d-9eb1-50c62518b5b4-config-data-custom\") pod \"barbican-keystone-listener-7ccff5b764-rskpw\" (UID: \"276c7ea1-10eb-4a7d-9eb1-50c62518b5b4\") " pod="openstack/barbican-keystone-listener-7ccff5b764-rskpw" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.259115 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df7732ab-4355-4061-9994-87f7ac7e4dd9-dns-swift-storage-0\") pod \"dnsmasq-dns-68884fc79c-l59v5\" (UID: \"df7732ab-4355-4061-9994-87f7ac7e4dd9\") " pod="openstack/dnsmasq-dns-68884fc79c-l59v5" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.259416 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df7732ab-4355-4061-9994-87f7ac7e4dd9-dns-svc\") pod \"dnsmasq-dns-68884fc79c-l59v5\" (UID: \"df7732ab-4355-4061-9994-87f7ac7e4dd9\") " pod="openstack/dnsmasq-dns-68884fc79c-l59v5" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.259536 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df7732ab-4355-4061-9994-87f7ac7e4dd9-ovsdbserver-sb\") pod 
\"dnsmasq-dns-68884fc79c-l59v5\" (UID: \"df7732ab-4355-4061-9994-87f7ac7e4dd9\") " pod="openstack/dnsmasq-dns-68884fc79c-l59v5" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.259662 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df7732ab-4355-4061-9994-87f7ac7e4dd9-ovsdbserver-nb\") pod \"dnsmasq-dns-68884fc79c-l59v5\" (UID: \"df7732ab-4355-4061-9994-87f7ac7e4dd9\") " pod="openstack/dnsmasq-dns-68884fc79c-l59v5" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.259811 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e305acce-34be-4503-b643-b60e4201ecfa-logs\") pod \"barbican-worker-68857c4d7f-ns5gc\" (UID: \"e305acce-34be-4503-b643-b60e4201ecfa\") " pod="openstack/barbican-worker-68857c4d7f-ns5gc" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.259868 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/276c7ea1-10eb-4a7d-9eb1-50c62518b5b4-logs\") pod \"barbican-keystone-listener-7ccff5b764-rskpw\" (UID: \"276c7ea1-10eb-4a7d-9eb1-50c62518b5b4\") " pod="openstack/barbican-keystone-listener-7ccff5b764-rskpw" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.279001 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/276c7ea1-10eb-4a7d-9eb1-50c62518b5b4-config-data-custom\") pod \"barbican-keystone-listener-7ccff5b764-rskpw\" (UID: \"276c7ea1-10eb-4a7d-9eb1-50c62518b5b4\") " pod="openstack/barbican-keystone-listener-7ccff5b764-rskpw" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.290538 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68884fc79c-l59v5"] Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.311736 4719 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xs9l4\" (UniqueName: \"kubernetes.io/projected/276c7ea1-10eb-4a7d-9eb1-50c62518b5b4-kube-api-access-xs9l4\") pod \"barbican-keystone-listener-7ccff5b764-rskpw\" (UID: \"276c7ea1-10eb-4a7d-9eb1-50c62518b5b4\") " pod="openstack/barbican-keystone-listener-7ccff5b764-rskpw" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.313324 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/276c7ea1-10eb-4a7d-9eb1-50c62518b5b4-config-data\") pod \"barbican-keystone-listener-7ccff5b764-rskpw\" (UID: \"276c7ea1-10eb-4a7d-9eb1-50c62518b5b4\") " pod="openstack/barbican-keystone-listener-7ccff5b764-rskpw" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.314251 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/276c7ea1-10eb-4a7d-9eb1-50c62518b5b4-combined-ca-bundle\") pod \"barbican-keystone-listener-7ccff5b764-rskpw\" (UID: \"276c7ea1-10eb-4a7d-9eb1-50c62518b5b4\") " pod="openstack/barbican-keystone-listener-7ccff5b764-rskpw" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.362724 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e305acce-34be-4503-b643-b60e4201ecfa-logs\") pod \"barbican-worker-68857c4d7f-ns5gc\" (UID: \"e305acce-34be-4503-b643-b60e4201ecfa\") " pod="openstack/barbican-worker-68857c4d7f-ns5gc" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.363084 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df7732ab-4355-4061-9994-87f7ac7e4dd9-config\") pod \"dnsmasq-dns-68884fc79c-l59v5\" (UID: \"df7732ab-4355-4061-9994-87f7ac7e4dd9\") " pod="openstack/dnsmasq-dns-68884fc79c-l59v5" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.363105 4719 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-84cbb\" (UniqueName: \"kubernetes.io/projected/e305acce-34be-4503-b643-b60e4201ecfa-kube-api-access-84cbb\") pod \"barbican-worker-68857c4d7f-ns5gc\" (UID: \"e305acce-34be-4503-b643-b60e4201ecfa\") " pod="openstack/barbican-worker-68857c4d7f-ns5gc" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.363132 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e305acce-34be-4503-b643-b60e4201ecfa-config-data\") pod \"barbican-worker-68857c4d7f-ns5gc\" (UID: \"e305acce-34be-4503-b643-b60e4201ecfa\") " pod="openstack/barbican-worker-68857c4d7f-ns5gc" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.363173 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e305acce-34be-4503-b643-b60e4201ecfa-config-data-custom\") pod \"barbican-worker-68857c4d7f-ns5gc\" (UID: \"e305acce-34be-4503-b643-b60e4201ecfa\") " pod="openstack/barbican-worker-68857c4d7f-ns5gc" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.363196 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e305acce-34be-4503-b643-b60e4201ecfa-combined-ca-bundle\") pod \"barbican-worker-68857c4d7f-ns5gc\" (UID: \"e305acce-34be-4503-b643-b60e4201ecfa\") " pod="openstack/barbican-worker-68857c4d7f-ns5gc" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.363248 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzc55\" (UniqueName: \"kubernetes.io/projected/df7732ab-4355-4061-9994-87f7ac7e4dd9-kube-api-access-qzc55\") pod \"dnsmasq-dns-68884fc79c-l59v5\" (UID: \"df7732ab-4355-4061-9994-87f7ac7e4dd9\") " pod="openstack/dnsmasq-dns-68884fc79c-l59v5" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.363274 4719 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df7732ab-4355-4061-9994-87f7ac7e4dd9-dns-swift-storage-0\") pod \"dnsmasq-dns-68884fc79c-l59v5\" (UID: \"df7732ab-4355-4061-9994-87f7ac7e4dd9\") " pod="openstack/dnsmasq-dns-68884fc79c-l59v5" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.363299 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df7732ab-4355-4061-9994-87f7ac7e4dd9-dns-svc\") pod \"dnsmasq-dns-68884fc79c-l59v5\" (UID: \"df7732ab-4355-4061-9994-87f7ac7e4dd9\") " pod="openstack/dnsmasq-dns-68884fc79c-l59v5" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.363319 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df7732ab-4355-4061-9994-87f7ac7e4dd9-ovsdbserver-sb\") pod \"dnsmasq-dns-68884fc79c-l59v5\" (UID: \"df7732ab-4355-4061-9994-87f7ac7e4dd9\") " pod="openstack/dnsmasq-dns-68884fc79c-l59v5" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.363363 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df7732ab-4355-4061-9994-87f7ac7e4dd9-ovsdbserver-nb\") pod \"dnsmasq-dns-68884fc79c-l59v5\" (UID: \"df7732ab-4355-4061-9994-87f7ac7e4dd9\") " pod="openstack/dnsmasq-dns-68884fc79c-l59v5" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.363414 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e305acce-34be-4503-b643-b60e4201ecfa-logs\") pod \"barbican-worker-68857c4d7f-ns5gc\" (UID: \"e305acce-34be-4503-b643-b60e4201ecfa\") " pod="openstack/barbican-worker-68857c4d7f-ns5gc" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.365001 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df7732ab-4355-4061-9994-87f7ac7e4dd9-dns-swift-storage-0\") pod \"dnsmasq-dns-68884fc79c-l59v5\" (UID: \"df7732ab-4355-4061-9994-87f7ac7e4dd9\") " pod="openstack/dnsmasq-dns-68884fc79c-l59v5" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.367983 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df7732ab-4355-4061-9994-87f7ac7e4dd9-ovsdbserver-nb\") pod \"dnsmasq-dns-68884fc79c-l59v5\" (UID: \"df7732ab-4355-4061-9994-87f7ac7e4dd9\") " pod="openstack/dnsmasq-dns-68884fc79c-l59v5" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.369055 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df7732ab-4355-4061-9994-87f7ac7e4dd9-ovsdbserver-sb\") pod \"dnsmasq-dns-68884fc79c-l59v5\" (UID: \"df7732ab-4355-4061-9994-87f7ac7e4dd9\") " pod="openstack/dnsmasq-dns-68884fc79c-l59v5" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.372339 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df7732ab-4355-4061-9994-87f7ac7e4dd9-dns-svc\") pod \"dnsmasq-dns-68884fc79c-l59v5\" (UID: \"df7732ab-4355-4061-9994-87f7ac7e4dd9\") " pod="openstack/dnsmasq-dns-68884fc79c-l59v5" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.372639 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df7732ab-4355-4061-9994-87f7ac7e4dd9-config\") pod \"dnsmasq-dns-68884fc79c-l59v5\" (UID: \"df7732ab-4355-4061-9994-87f7ac7e4dd9\") " pod="openstack/dnsmasq-dns-68884fc79c-l59v5" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.377632 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/e305acce-34be-4503-b643-b60e4201ecfa-config-data-custom\") pod \"barbican-worker-68857c4d7f-ns5gc\" (UID: \"e305acce-34be-4503-b643-b60e4201ecfa\") " pod="openstack/barbican-worker-68857c4d7f-ns5gc" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.378840 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e305acce-34be-4503-b643-b60e4201ecfa-combined-ca-bundle\") pod \"barbican-worker-68857c4d7f-ns5gc\" (UID: \"e305acce-34be-4503-b643-b60e4201ecfa\") " pod="openstack/barbican-worker-68857c4d7f-ns5gc" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.383036 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e305acce-34be-4503-b643-b60e4201ecfa-config-data\") pod \"barbican-worker-68857c4d7f-ns5gc\" (UID: \"e305acce-34be-4503-b643-b60e4201ecfa\") " pod="openstack/barbican-worker-68857c4d7f-ns5gc" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.384677 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7b877f86f8-fptrv"] Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.387580 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7b877f86f8-fptrv" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.398675 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84cbb\" (UniqueName: \"kubernetes.io/projected/e305acce-34be-4503-b643-b60e4201ecfa-kube-api-access-84cbb\") pod \"barbican-worker-68857c4d7f-ns5gc\" (UID: \"e305acce-34be-4503-b643-b60e4201ecfa\") " pod="openstack/barbican-worker-68857c4d7f-ns5gc" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.399146 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.400169 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzc55\" (UniqueName: \"kubernetes.io/projected/df7732ab-4355-4061-9994-87f7ac7e4dd9-kube-api-access-qzc55\") pod \"dnsmasq-dns-68884fc79c-l59v5\" (UID: \"df7732ab-4355-4061-9994-87f7ac7e4dd9\") " pod="openstack/dnsmasq-dns-68884fc79c-l59v5" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.404524 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68884fc79c-l59v5"] Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.405379 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68884fc79c-l59v5" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.426719 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bd4bb89d9-649p9"] Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.433088 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7ccff5b764-rskpw" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.449405 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-68857c4d7f-ns5gc" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.450455 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7b877f86f8-fptrv"] Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.459903 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bd4bb89d9-649p9"] Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.464789 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6w78\" (UniqueName: \"kubernetes.io/projected/726781c9-066e-4914-bc4f-e2fc8fcef741-kube-api-access-c6w78\") pod \"barbican-api-7b877f86f8-fptrv\" (UID: \"726781c9-066e-4914-bc4f-e2fc8fcef741\") " pod="openstack/barbican-api-7b877f86f8-fptrv" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.464824 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/726781c9-066e-4914-bc4f-e2fc8fcef741-logs\") pod \"barbican-api-7b877f86f8-fptrv\" (UID: \"726781c9-066e-4914-bc4f-e2fc8fcef741\") " pod="openstack/barbican-api-7b877f86f8-fptrv" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.464858 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/726781c9-066e-4914-bc4f-e2fc8fcef741-config-data\") pod \"barbican-api-7b877f86f8-fptrv\" (UID: \"726781c9-066e-4914-bc4f-e2fc8fcef741\") " pod="openstack/barbican-api-7b877f86f8-fptrv" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.464904 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/726781c9-066e-4914-bc4f-e2fc8fcef741-combined-ca-bundle\") pod \"barbican-api-7b877f86f8-fptrv\" (UID: \"726781c9-066e-4914-bc4f-e2fc8fcef741\") " 
pod="openstack/barbican-api-7b877f86f8-fptrv" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.464982 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/726781c9-066e-4914-bc4f-e2fc8fcef741-config-data-custom\") pod \"barbican-api-7b877f86f8-fptrv\" (UID: \"726781c9-066e-4914-bc4f-e2fc8fcef741\") " pod="openstack/barbican-api-7b877f86f8-fptrv" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.466558 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d78b7c8c7-8l5xz"] Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.468296 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d78b7c8c7-8l5xz" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.473937 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d78b7c8c7-8l5xz"] Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.491463 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-58ddf56cd8-cl6cc"] Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.504110 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-58ddf56cd8-cl6cc" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.508285 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.508537 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.508731 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.508869 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-vth95" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.514084 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-58ddf56cd8-cl6cc"] Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.566075 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/726781c9-066e-4914-bc4f-e2fc8fcef741-config-data-custom\") pod \"barbican-api-7b877f86f8-fptrv\" (UID: \"726781c9-066e-4914-bc4f-e2fc8fcef741\") " pod="openstack/barbican-api-7b877f86f8-fptrv" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.566123 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdkrb\" (UniqueName: \"kubernetes.io/projected/2f1b276e-3d3d-42c4-a107-53af7102e33e-kube-api-access-mdkrb\") pod \"neutron-58ddf56cd8-cl6cc\" (UID: \"2f1b276e-3d3d-42c4-a107-53af7102e33e\") " pod="openstack/neutron-58ddf56cd8-cl6cc" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.566152 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a01d050-b1bc-4b48-a783-64a36c24ad6e-config\") pod \"dnsmasq-dns-7d78b7c8c7-8l5xz\" 
(UID: \"9a01d050-b1bc-4b48-a783-64a36c24ad6e\") " pod="openstack/dnsmasq-dns-7d78b7c8c7-8l5xz" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.566186 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a01d050-b1bc-4b48-a783-64a36c24ad6e-dns-svc\") pod \"dnsmasq-dns-7d78b7c8c7-8l5xz\" (UID: \"9a01d050-b1bc-4b48-a783-64a36c24ad6e\") " pod="openstack/dnsmasq-dns-7d78b7c8c7-8l5xz" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.566209 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6w78\" (UniqueName: \"kubernetes.io/projected/726781c9-066e-4914-bc4f-e2fc8fcef741-kube-api-access-c6w78\") pod \"barbican-api-7b877f86f8-fptrv\" (UID: \"726781c9-066e-4914-bc4f-e2fc8fcef741\") " pod="openstack/barbican-api-7b877f86f8-fptrv" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.566225 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2f1b276e-3d3d-42c4-a107-53af7102e33e-httpd-config\") pod \"neutron-58ddf56cd8-cl6cc\" (UID: \"2f1b276e-3d3d-42c4-a107-53af7102e33e\") " pod="openstack/neutron-58ddf56cd8-cl6cc" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.566255 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/726781c9-066e-4914-bc4f-e2fc8fcef741-logs\") pod \"barbican-api-7b877f86f8-fptrv\" (UID: \"726781c9-066e-4914-bc4f-e2fc8fcef741\") " pod="openstack/barbican-api-7b877f86f8-fptrv" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.566277 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f1b276e-3d3d-42c4-a107-53af7102e33e-combined-ca-bundle\") pod \"neutron-58ddf56cd8-cl6cc\" (UID: 
\"2f1b276e-3d3d-42c4-a107-53af7102e33e\") " pod="openstack/neutron-58ddf56cd8-cl6cc" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.566305 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/726781c9-066e-4914-bc4f-e2fc8fcef741-config-data\") pod \"barbican-api-7b877f86f8-fptrv\" (UID: \"726781c9-066e-4914-bc4f-e2fc8fcef741\") " pod="openstack/barbican-api-7b877f86f8-fptrv" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.566332 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2f1b276e-3d3d-42c4-a107-53af7102e33e-config\") pod \"neutron-58ddf56cd8-cl6cc\" (UID: \"2f1b276e-3d3d-42c4-a107-53af7102e33e\") " pod="openstack/neutron-58ddf56cd8-cl6cc" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.566370 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a01d050-b1bc-4b48-a783-64a36c24ad6e-ovsdbserver-sb\") pod \"dnsmasq-dns-7d78b7c8c7-8l5xz\" (UID: \"9a01d050-b1bc-4b48-a783-64a36c24ad6e\") " pod="openstack/dnsmasq-dns-7d78b7c8c7-8l5xz" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.566397 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/726781c9-066e-4914-bc4f-e2fc8fcef741-combined-ca-bundle\") pod \"barbican-api-7b877f86f8-fptrv\" (UID: \"726781c9-066e-4914-bc4f-e2fc8fcef741\") " pod="openstack/barbican-api-7b877f86f8-fptrv" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.566423 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f1b276e-3d3d-42c4-a107-53af7102e33e-ovndb-tls-certs\") pod \"neutron-58ddf56cd8-cl6cc\" (UID: 
\"2f1b276e-3d3d-42c4-a107-53af7102e33e\") " pod="openstack/neutron-58ddf56cd8-cl6cc" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.566451 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5bnd\" (UniqueName: \"kubernetes.io/projected/9a01d050-b1bc-4b48-a783-64a36c24ad6e-kube-api-access-z5bnd\") pod \"dnsmasq-dns-7d78b7c8c7-8l5xz\" (UID: \"9a01d050-b1bc-4b48-a783-64a36c24ad6e\") " pod="openstack/dnsmasq-dns-7d78b7c8c7-8l5xz" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.566479 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a01d050-b1bc-4b48-a783-64a36c24ad6e-dns-swift-storage-0\") pod \"dnsmasq-dns-7d78b7c8c7-8l5xz\" (UID: \"9a01d050-b1bc-4b48-a783-64a36c24ad6e\") " pod="openstack/dnsmasq-dns-7d78b7c8c7-8l5xz" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.566496 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a01d050-b1bc-4b48-a783-64a36c24ad6e-ovsdbserver-nb\") pod \"dnsmasq-dns-7d78b7c8c7-8l5xz\" (UID: \"9a01d050-b1bc-4b48-a783-64a36c24ad6e\") " pod="openstack/dnsmasq-dns-7d78b7c8c7-8l5xz" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.569404 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/726781c9-066e-4914-bc4f-e2fc8fcef741-logs\") pod \"barbican-api-7b877f86f8-fptrv\" (UID: \"726781c9-066e-4914-bc4f-e2fc8fcef741\") " pod="openstack/barbican-api-7b877f86f8-fptrv" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.571825 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/726781c9-066e-4914-bc4f-e2fc8fcef741-config-data-custom\") pod \"barbican-api-7b877f86f8-fptrv\" (UID: 
\"726781c9-066e-4914-bc4f-e2fc8fcef741\") " pod="openstack/barbican-api-7b877f86f8-fptrv" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.575566 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/726781c9-066e-4914-bc4f-e2fc8fcef741-combined-ca-bundle\") pod \"barbican-api-7b877f86f8-fptrv\" (UID: \"726781c9-066e-4914-bc4f-e2fc8fcef741\") " pod="openstack/barbican-api-7b877f86f8-fptrv" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.582822 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6w78\" (UniqueName: \"kubernetes.io/projected/726781c9-066e-4914-bc4f-e2fc8fcef741-kube-api-access-c6w78\") pod \"barbican-api-7b877f86f8-fptrv\" (UID: \"726781c9-066e-4914-bc4f-e2fc8fcef741\") " pod="openstack/barbican-api-7b877f86f8-fptrv" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.609705 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/726781c9-066e-4914-bc4f-e2fc8fcef741-config-data\") pod \"barbican-api-7b877f86f8-fptrv\" (UID: \"726781c9-066e-4914-bc4f-e2fc8fcef741\") " pod="openstack/barbican-api-7b877f86f8-fptrv" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.670877 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2f1b276e-3d3d-42c4-a107-53af7102e33e-config\") pod \"neutron-58ddf56cd8-cl6cc\" (UID: \"2f1b276e-3d3d-42c4-a107-53af7102e33e\") " pod="openstack/neutron-58ddf56cd8-cl6cc" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.670943 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a01d050-b1bc-4b48-a783-64a36c24ad6e-ovsdbserver-sb\") pod \"dnsmasq-dns-7d78b7c8c7-8l5xz\" (UID: \"9a01d050-b1bc-4b48-a783-64a36c24ad6e\") " pod="openstack/dnsmasq-dns-7d78b7c8c7-8l5xz" 
Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.670993 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f1b276e-3d3d-42c4-a107-53af7102e33e-ovndb-tls-certs\") pod \"neutron-58ddf56cd8-cl6cc\" (UID: \"2f1b276e-3d3d-42c4-a107-53af7102e33e\") " pod="openstack/neutron-58ddf56cd8-cl6cc" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.671021 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5bnd\" (UniqueName: \"kubernetes.io/projected/9a01d050-b1bc-4b48-a783-64a36c24ad6e-kube-api-access-z5bnd\") pod \"dnsmasq-dns-7d78b7c8c7-8l5xz\" (UID: \"9a01d050-b1bc-4b48-a783-64a36c24ad6e\") " pod="openstack/dnsmasq-dns-7d78b7c8c7-8l5xz" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.671052 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a01d050-b1bc-4b48-a783-64a36c24ad6e-dns-swift-storage-0\") pod \"dnsmasq-dns-7d78b7c8c7-8l5xz\" (UID: \"9a01d050-b1bc-4b48-a783-64a36c24ad6e\") " pod="openstack/dnsmasq-dns-7d78b7c8c7-8l5xz" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.671095 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a01d050-b1bc-4b48-a783-64a36c24ad6e-ovsdbserver-nb\") pod \"dnsmasq-dns-7d78b7c8c7-8l5xz\" (UID: \"9a01d050-b1bc-4b48-a783-64a36c24ad6e\") " pod="openstack/dnsmasq-dns-7d78b7c8c7-8l5xz" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.671139 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdkrb\" (UniqueName: \"kubernetes.io/projected/2f1b276e-3d3d-42c4-a107-53af7102e33e-kube-api-access-mdkrb\") pod \"neutron-58ddf56cd8-cl6cc\" (UID: \"2f1b276e-3d3d-42c4-a107-53af7102e33e\") " pod="openstack/neutron-58ddf56cd8-cl6cc" Oct 09 15:36:23 crc 
kubenswrapper[4719]: I1009 15:36:23.671179 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a01d050-b1bc-4b48-a783-64a36c24ad6e-config\") pod \"dnsmasq-dns-7d78b7c8c7-8l5xz\" (UID: \"9a01d050-b1bc-4b48-a783-64a36c24ad6e\") " pod="openstack/dnsmasq-dns-7d78b7c8c7-8l5xz" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.671209 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a01d050-b1bc-4b48-a783-64a36c24ad6e-dns-svc\") pod \"dnsmasq-dns-7d78b7c8c7-8l5xz\" (UID: \"9a01d050-b1bc-4b48-a783-64a36c24ad6e\") " pod="openstack/dnsmasq-dns-7d78b7c8c7-8l5xz" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.671251 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2f1b276e-3d3d-42c4-a107-53af7102e33e-httpd-config\") pod \"neutron-58ddf56cd8-cl6cc\" (UID: \"2f1b276e-3d3d-42c4-a107-53af7102e33e\") " pod="openstack/neutron-58ddf56cd8-cl6cc" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.671272 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f1b276e-3d3d-42c4-a107-53af7102e33e-combined-ca-bundle\") pod \"neutron-58ddf56cd8-cl6cc\" (UID: \"2f1b276e-3d3d-42c4-a107-53af7102e33e\") " pod="openstack/neutron-58ddf56cd8-cl6cc" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.672547 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a01d050-b1bc-4b48-a783-64a36c24ad6e-dns-swift-storage-0\") pod \"dnsmasq-dns-7d78b7c8c7-8l5xz\" (UID: \"9a01d050-b1bc-4b48-a783-64a36c24ad6e\") " pod="openstack/dnsmasq-dns-7d78b7c8c7-8l5xz" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.672724 4719 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a01d050-b1bc-4b48-a783-64a36c24ad6e-ovsdbserver-nb\") pod \"dnsmasq-dns-7d78b7c8c7-8l5xz\" (UID: \"9a01d050-b1bc-4b48-a783-64a36c24ad6e\") " pod="openstack/dnsmasq-dns-7d78b7c8c7-8l5xz" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.672762 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a01d050-b1bc-4b48-a783-64a36c24ad6e-config\") pod \"dnsmasq-dns-7d78b7c8c7-8l5xz\" (UID: \"9a01d050-b1bc-4b48-a783-64a36c24ad6e\") " pod="openstack/dnsmasq-dns-7d78b7c8c7-8l5xz" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.673252 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a01d050-b1bc-4b48-a783-64a36c24ad6e-dns-svc\") pod \"dnsmasq-dns-7d78b7c8c7-8l5xz\" (UID: \"9a01d050-b1bc-4b48-a783-64a36c24ad6e\") " pod="openstack/dnsmasq-dns-7d78b7c8c7-8l5xz" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.674070 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a01d050-b1bc-4b48-a783-64a36c24ad6e-ovsdbserver-sb\") pod \"dnsmasq-dns-7d78b7c8c7-8l5xz\" (UID: \"9a01d050-b1bc-4b48-a783-64a36c24ad6e\") " pod="openstack/dnsmasq-dns-7d78b7c8c7-8l5xz" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.677751 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f1b276e-3d3d-42c4-a107-53af7102e33e-combined-ca-bundle\") pod \"neutron-58ddf56cd8-cl6cc\" (UID: \"2f1b276e-3d3d-42c4-a107-53af7102e33e\") " pod="openstack/neutron-58ddf56cd8-cl6cc" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.678324 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2f1b276e-3d3d-42c4-a107-53af7102e33e-config\") pod 
\"neutron-58ddf56cd8-cl6cc\" (UID: \"2f1b276e-3d3d-42c4-a107-53af7102e33e\") " pod="openstack/neutron-58ddf56cd8-cl6cc" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.684452 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2f1b276e-3d3d-42c4-a107-53af7102e33e-httpd-config\") pod \"neutron-58ddf56cd8-cl6cc\" (UID: \"2f1b276e-3d3d-42c4-a107-53af7102e33e\") " pod="openstack/neutron-58ddf56cd8-cl6cc" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.686207 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f1b276e-3d3d-42c4-a107-53af7102e33e-ovndb-tls-certs\") pod \"neutron-58ddf56cd8-cl6cc\" (UID: \"2f1b276e-3d3d-42c4-a107-53af7102e33e\") " pod="openstack/neutron-58ddf56cd8-cl6cc" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.713787 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdkrb\" (UniqueName: \"kubernetes.io/projected/2f1b276e-3d3d-42c4-a107-53af7102e33e-kube-api-access-mdkrb\") pod \"neutron-58ddf56cd8-cl6cc\" (UID: \"2f1b276e-3d3d-42c4-a107-53af7102e33e\") " pod="openstack/neutron-58ddf56cd8-cl6cc" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.722287 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5bnd\" (UniqueName: \"kubernetes.io/projected/9a01d050-b1bc-4b48-a783-64a36c24ad6e-kube-api-access-z5bnd\") pod \"dnsmasq-dns-7d78b7c8c7-8l5xz\" (UID: \"9a01d050-b1bc-4b48-a783-64a36c24ad6e\") " pod="openstack/dnsmasq-dns-7d78b7c8c7-8l5xz" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.736181 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7b877f86f8-fptrv" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.805023 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d78b7c8c7-8l5xz" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.826332 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0023a9b5-248d-48cc-8560-b109ff59fc04","Type":"ContainerStarted","Data":"e41e62ba79ff1042f03aff39173579f2a3716ffd22a7956025ae06a45239d4aa"} Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.828835 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-58ddf56cd8-cl6cc" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.831485 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0b11b82a-b620-435a-9274-fb3419c35a72","Type":"ContainerStarted","Data":"128a964196cdbb9e408fa42332db38c9f560920577b0c0d847b4db5d8f54737f"} Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.837760 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a","Type":"ContainerStarted","Data":"222f69f2668e6b227d56fac0fc6789e58800e994b1e7d5f2ca3a77ed0f8b7e3c"} Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.837896 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="590f0bbf-4518-4aa6-a71f-1f28b5f4e02a" containerName="ceilometer-notification-agent" containerID="cri-o://0bb9bc0ec575175f114ad0572c1c1f37cede57bc3aa752d335d6140570036942" gracePeriod=30 Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.837928 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.837948 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="590f0bbf-4518-4aa6-a71f-1f28b5f4e02a" containerName="proxy-httpd" 
containerID="cri-o://222f69f2668e6b227d56fac0fc6789e58800e994b1e7d5f2ca3a77ed0f8b7e3c" gracePeriod=30 Oct 09 15:36:23 crc kubenswrapper[4719]: I1009 15:36:23.837972 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="590f0bbf-4518-4aa6-a71f-1f28b5f4e02a" containerName="sg-core" containerID="cri-o://9410096f029088d510296b7fd6f73e792d5010bebbcf5a4b606890e48fccba01" gracePeriod=30 Oct 09 15:36:24 crc kubenswrapper[4719]: I1009 15:36:24.050693 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-68857c4d7f-ns5gc"] Oct 09 15:36:24 crc kubenswrapper[4719]: I1009 15:36:24.204177 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7ccff5b764-rskpw"] Oct 09 15:36:24 crc kubenswrapper[4719]: I1009 15:36:24.211872 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68884fc79c-l59v5"] Oct 09 15:36:24 crc kubenswrapper[4719]: W1009 15:36:24.250564 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod276c7ea1_10eb_4a7d_9eb1_50c62518b5b4.slice/crio-4e6767e6b8719b6603953caeeb777b99e496853a17055786d460ac1bdbdfcee7 WatchSource:0}: Error finding container 4e6767e6b8719b6603953caeeb777b99e496853a17055786d460ac1bdbdfcee7: Status 404 returned error can't find the container with id 4e6767e6b8719b6603953caeeb777b99e496853a17055786d460ac1bdbdfcee7 Oct 09 15:36:24 crc kubenswrapper[4719]: W1009 15:36:24.251140 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf7732ab_4355_4061_9994_87f7ac7e4dd9.slice/crio-99ddf264138128a2d1ea5756be04e2ff930ea630be55d60d2041a29b160a859e WatchSource:0}: Error finding container 99ddf264138128a2d1ea5756be04e2ff930ea630be55d60d2041a29b160a859e: Status 404 returned error can't find the container with id 
99ddf264138128a2d1ea5756be04e2ff930ea630be55d60d2041a29b160a859e Oct 09 15:36:24 crc kubenswrapper[4719]: I1009 15:36:24.470746 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d78b7c8c7-8l5xz"] Oct 09 15:36:24 crc kubenswrapper[4719]: I1009 15:36:24.480055 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7b877f86f8-fptrv"] Oct 09 15:36:24 crc kubenswrapper[4719]: W1009 15:36:24.488028 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod726781c9_066e_4914_bc4f_e2fc8fcef741.slice/crio-e858f51aa82145e7850992c94dcddfe6eaaafeddf302559505e31daebd77c4d4 WatchSource:0}: Error finding container e858f51aa82145e7850992c94dcddfe6eaaafeddf302559505e31daebd77c4d4: Status 404 returned error can't find the container with id e858f51aa82145e7850992c94dcddfe6eaaafeddf302559505e31daebd77c4d4 Oct 09 15:36:24 crc kubenswrapper[4719]: I1009 15:36:24.809714 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-58ddf56cd8-cl6cc"] Oct 09 15:36:24 crc kubenswrapper[4719]: W1009 15:36:24.815520 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f1b276e_3d3d_42c4_a107_53af7102e33e.slice/crio-ead8180d17db68bc8a17000c3b60a41f357807c20491a92afe7b81d03ddc215f WatchSource:0}: Error finding container ead8180d17db68bc8a17000c3b60a41f357807c20491a92afe7b81d03ddc215f: Status 404 returned error can't find the container with id ead8180d17db68bc8a17000c3b60a41f357807c20491a92afe7b81d03ddc215f Oct 09 15:36:24 crc kubenswrapper[4719]: I1009 15:36:24.906689 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d78b7c8c7-8l5xz" event={"ID":"9a01d050-b1bc-4b48-a783-64a36c24ad6e","Type":"ContainerStarted","Data":"d493e4d2a8a78615e781ce934302464da9fed656147684829148d9876058506c"} Oct 09 15:36:24 crc kubenswrapper[4719]: I1009 
15:36:24.955226 4719 generic.go:334] "Generic (PLEG): container finished" podID="590f0bbf-4518-4aa6-a71f-1f28b5f4e02a" containerID="222f69f2668e6b227d56fac0fc6789e58800e994b1e7d5f2ca3a77ed0f8b7e3c" exitCode=0 Oct 09 15:36:24 crc kubenswrapper[4719]: I1009 15:36:24.955472 4719 generic.go:334] "Generic (PLEG): container finished" podID="590f0bbf-4518-4aa6-a71f-1f28b5f4e02a" containerID="9410096f029088d510296b7fd6f73e792d5010bebbcf5a4b606890e48fccba01" exitCode=2 Oct 09 15:36:24 crc kubenswrapper[4719]: I1009 15:36:24.955588 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a","Type":"ContainerDied","Data":"222f69f2668e6b227d56fac0fc6789e58800e994b1e7d5f2ca3a77ed0f8b7e3c"} Oct 09 15:36:24 crc kubenswrapper[4719]: I1009 15:36:24.955675 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a","Type":"ContainerDied","Data":"9410096f029088d510296b7fd6f73e792d5010bebbcf5a4b606890e48fccba01"} Oct 09 15:36:24 crc kubenswrapper[4719]: I1009 15:36:24.968253 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-68857c4d7f-ns5gc" event={"ID":"e305acce-34be-4503-b643-b60e4201ecfa","Type":"ContainerStarted","Data":"3f6a81cdaf8f6a106434217799611e9fc67e9e779667f450e855f0eae97ce185"} Oct 09 15:36:24 crc kubenswrapper[4719]: I1009 15:36:24.969910 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b877f86f8-fptrv" event={"ID":"726781c9-066e-4914-bc4f-e2fc8fcef741","Type":"ContainerStarted","Data":"e858f51aa82145e7850992c94dcddfe6eaaafeddf302559505e31daebd77c4d4"} Oct 09 15:36:24 crc kubenswrapper[4719]: I1009 15:36:24.974873 4719 generic.go:334] "Generic (PLEG): container finished" podID="df7732ab-4355-4061-9994-87f7ac7e4dd9" containerID="ec6518a87040c131606f12df82fa5858982b2e63c47387b171518c644374efec" exitCode=0 Oct 09 15:36:24 crc kubenswrapper[4719]: I1009 
15:36:24.975104 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68884fc79c-l59v5" event={"ID":"df7732ab-4355-4061-9994-87f7ac7e4dd9","Type":"ContainerDied","Data":"ec6518a87040c131606f12df82fa5858982b2e63c47387b171518c644374efec"} Oct 09 15:36:24 crc kubenswrapper[4719]: I1009 15:36:24.975216 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68884fc79c-l59v5" event={"ID":"df7732ab-4355-4061-9994-87f7ac7e4dd9","Type":"ContainerStarted","Data":"99ddf264138128a2d1ea5756be04e2ff930ea630be55d60d2041a29b160a859e"} Oct 09 15:36:25 crc kubenswrapper[4719]: I1009 15:36:25.037202 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-58ddf56cd8-cl6cc" event={"ID":"2f1b276e-3d3d-42c4-a107-53af7102e33e","Type":"ContainerStarted","Data":"ead8180d17db68bc8a17000c3b60a41f357807c20491a92afe7b81d03ddc215f"} Oct 09 15:36:25 crc kubenswrapper[4719]: I1009 15:36:25.049756 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7ccff5b764-rskpw" event={"ID":"276c7ea1-10eb-4a7d-9eb1-50c62518b5b4","Type":"ContainerStarted","Data":"4e6767e6b8719b6603953caeeb777b99e496853a17055786d460ac1bdbdfcee7"} Oct 09 15:36:25 crc kubenswrapper[4719]: I1009 15:36:25.087299 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0023a9b5-248d-48cc-8560-b109ff59fc04","Type":"ContainerStarted","Data":"2bb2cd4b4e048596f8edfc6c6071d249253ba2061983dd8ac534910c2da4e411"} Oct 09 15:36:25 crc kubenswrapper[4719]: I1009 15:36:25.113949 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0b11b82a-b620-435a-9274-fb3419c35a72","Type":"ContainerStarted","Data":"6411b42af443ab4bf4d25c6db21d5e174e6dfbf1dcf8c7e84532a0d83d159fec"} Oct 09 15:36:25 crc kubenswrapper[4719]: I1009 15:36:25.141283 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/glance-default-internal-api-0" podStartSLOduration=9.141265085 podStartE2EDuration="9.141265085s" podCreationTimestamp="2025-10-09 15:36:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:36:25.133695204 +0000 UTC m=+1090.643406489" watchObservedRunningTime="2025-10-09 15:36:25.141265085 +0000 UTC m=+1090.650976360" Oct 09 15:36:25 crc kubenswrapper[4719]: I1009 15:36:25.158084 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=9.158065839 podStartE2EDuration="9.158065839s" podCreationTimestamp="2025-10-09 15:36:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:36:25.152721598 +0000 UTC m=+1090.662432883" watchObservedRunningTime="2025-10-09 15:36:25.158065839 +0000 UTC m=+1090.667777124" Oct 09 15:36:25 crc kubenswrapper[4719]: I1009 15:36:25.216992 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="732560f4-c5be-46ab-9266-d59d4fe1a07d" path="/var/lib/kubelet/pods/732560f4-c5be-46ab-9266-d59d4fe1a07d/volumes" Oct 09 15:36:25 crc kubenswrapper[4719]: I1009 15:36:25.878121 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-57c55b4b47-8npb9"] Oct 09 15:36:25 crc kubenswrapper[4719]: I1009 15:36:25.881613 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-57c55b4b47-8npb9" Oct 09 15:36:25 crc kubenswrapper[4719]: I1009 15:36:25.891161 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 09 15:36:25 crc kubenswrapper[4719]: I1009 15:36:25.891435 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 09 15:36:25 crc kubenswrapper[4719]: I1009 15:36:25.902758 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-57c55b4b47-8npb9"] Oct 09 15:36:25 crc kubenswrapper[4719]: I1009 15:36:25.994125 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f4a6c362-de01-454a-a0d8-7ea4c677720c-config\") pod \"neutron-57c55b4b47-8npb9\" (UID: \"f4a6c362-de01-454a-a0d8-7ea4c677720c\") " pod="openstack/neutron-57c55b4b47-8npb9" Oct 09 15:36:25 crc kubenswrapper[4719]: I1009 15:36:25.994229 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6pwj\" (UniqueName: \"kubernetes.io/projected/f4a6c362-de01-454a-a0d8-7ea4c677720c-kube-api-access-l6pwj\") pod \"neutron-57c55b4b47-8npb9\" (UID: \"f4a6c362-de01-454a-a0d8-7ea4c677720c\") " pod="openstack/neutron-57c55b4b47-8npb9" Oct 09 15:36:25 crc kubenswrapper[4719]: I1009 15:36:25.994312 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f4a6c362-de01-454a-a0d8-7ea4c677720c-httpd-config\") pod \"neutron-57c55b4b47-8npb9\" (UID: \"f4a6c362-de01-454a-a0d8-7ea4c677720c\") " pod="openstack/neutron-57c55b4b47-8npb9" Oct 09 15:36:25 crc kubenswrapper[4719]: I1009 15:36:25.994336 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f4a6c362-de01-454a-a0d8-7ea4c677720c-ovndb-tls-certs\") pod \"neutron-57c55b4b47-8npb9\" (UID: \"f4a6c362-de01-454a-a0d8-7ea4c677720c\") " pod="openstack/neutron-57c55b4b47-8npb9" Oct 09 15:36:25 crc kubenswrapper[4719]: I1009 15:36:25.994379 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4a6c362-de01-454a-a0d8-7ea4c677720c-internal-tls-certs\") pod \"neutron-57c55b4b47-8npb9\" (UID: \"f4a6c362-de01-454a-a0d8-7ea4c677720c\") " pod="openstack/neutron-57c55b4b47-8npb9" Oct 09 15:36:25 crc kubenswrapper[4719]: I1009 15:36:25.994433 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4a6c362-de01-454a-a0d8-7ea4c677720c-public-tls-certs\") pod \"neutron-57c55b4b47-8npb9\" (UID: \"f4a6c362-de01-454a-a0d8-7ea4c677720c\") " pod="openstack/neutron-57c55b4b47-8npb9" Oct 09 15:36:25 crc kubenswrapper[4719]: I1009 15:36:25.994489 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4a6c362-de01-454a-a0d8-7ea4c677720c-combined-ca-bundle\") pod \"neutron-57c55b4b47-8npb9\" (UID: \"f4a6c362-de01-454a-a0d8-7ea4c677720c\") " pod="openstack/neutron-57c55b4b47-8npb9" Oct 09 15:36:26 crc kubenswrapper[4719]: I1009 15:36:26.096618 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4a6c362-de01-454a-a0d8-7ea4c677720c-combined-ca-bundle\") pod \"neutron-57c55b4b47-8npb9\" (UID: \"f4a6c362-de01-454a-a0d8-7ea4c677720c\") " pod="openstack/neutron-57c55b4b47-8npb9" Oct 09 15:36:26 crc kubenswrapper[4719]: I1009 15:36:26.096766 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/f4a6c362-de01-454a-a0d8-7ea4c677720c-config\") pod \"neutron-57c55b4b47-8npb9\" (UID: \"f4a6c362-de01-454a-a0d8-7ea4c677720c\") " pod="openstack/neutron-57c55b4b47-8npb9" Oct 09 15:36:26 crc kubenswrapper[4719]: I1009 15:36:26.096793 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6pwj\" (UniqueName: \"kubernetes.io/projected/f4a6c362-de01-454a-a0d8-7ea4c677720c-kube-api-access-l6pwj\") pod \"neutron-57c55b4b47-8npb9\" (UID: \"f4a6c362-de01-454a-a0d8-7ea4c677720c\") " pod="openstack/neutron-57c55b4b47-8npb9" Oct 09 15:36:26 crc kubenswrapper[4719]: I1009 15:36:26.096848 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f4a6c362-de01-454a-a0d8-7ea4c677720c-httpd-config\") pod \"neutron-57c55b4b47-8npb9\" (UID: \"f4a6c362-de01-454a-a0d8-7ea4c677720c\") " pod="openstack/neutron-57c55b4b47-8npb9" Oct 09 15:36:26 crc kubenswrapper[4719]: I1009 15:36:26.096864 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4a6c362-de01-454a-a0d8-7ea4c677720c-ovndb-tls-certs\") pod \"neutron-57c55b4b47-8npb9\" (UID: \"f4a6c362-de01-454a-a0d8-7ea4c677720c\") " pod="openstack/neutron-57c55b4b47-8npb9" Oct 09 15:36:26 crc kubenswrapper[4719]: I1009 15:36:26.096878 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4a6c362-de01-454a-a0d8-7ea4c677720c-internal-tls-certs\") pod \"neutron-57c55b4b47-8npb9\" (UID: \"f4a6c362-de01-454a-a0d8-7ea4c677720c\") " pod="openstack/neutron-57c55b4b47-8npb9" Oct 09 15:36:26 crc kubenswrapper[4719]: I1009 15:36:26.096911 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4a6c362-de01-454a-a0d8-7ea4c677720c-public-tls-certs\") pod 
\"neutron-57c55b4b47-8npb9\" (UID: \"f4a6c362-de01-454a-a0d8-7ea4c677720c\") " pod="openstack/neutron-57c55b4b47-8npb9" Oct 09 15:36:26 crc kubenswrapper[4719]: I1009 15:36:26.104213 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4a6c362-de01-454a-a0d8-7ea4c677720c-public-tls-certs\") pod \"neutron-57c55b4b47-8npb9\" (UID: \"f4a6c362-de01-454a-a0d8-7ea4c677720c\") " pod="openstack/neutron-57c55b4b47-8npb9" Oct 09 15:36:26 crc kubenswrapper[4719]: I1009 15:36:26.104836 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4a6c362-de01-454a-a0d8-7ea4c677720c-ovndb-tls-certs\") pod \"neutron-57c55b4b47-8npb9\" (UID: \"f4a6c362-de01-454a-a0d8-7ea4c677720c\") " pod="openstack/neutron-57c55b4b47-8npb9" Oct 09 15:36:26 crc kubenswrapper[4719]: I1009 15:36:26.105680 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4a6c362-de01-454a-a0d8-7ea4c677720c-combined-ca-bundle\") pod \"neutron-57c55b4b47-8npb9\" (UID: \"f4a6c362-de01-454a-a0d8-7ea4c677720c\") " pod="openstack/neutron-57c55b4b47-8npb9" Oct 09 15:36:26 crc kubenswrapper[4719]: I1009 15:36:26.107677 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f4a6c362-de01-454a-a0d8-7ea4c677720c-httpd-config\") pod \"neutron-57c55b4b47-8npb9\" (UID: \"f4a6c362-de01-454a-a0d8-7ea4c677720c\") " pod="openstack/neutron-57c55b4b47-8npb9" Oct 09 15:36:26 crc kubenswrapper[4719]: I1009 15:36:26.108664 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f4a6c362-de01-454a-a0d8-7ea4c677720c-config\") pod \"neutron-57c55b4b47-8npb9\" (UID: \"f4a6c362-de01-454a-a0d8-7ea4c677720c\") " pod="openstack/neutron-57c55b4b47-8npb9" Oct 09 15:36:26 crc 
kubenswrapper[4719]: I1009 15:36:26.115956 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4a6c362-de01-454a-a0d8-7ea4c677720c-internal-tls-certs\") pod \"neutron-57c55b4b47-8npb9\" (UID: \"f4a6c362-de01-454a-a0d8-7ea4c677720c\") " pod="openstack/neutron-57c55b4b47-8npb9" Oct 09 15:36:26 crc kubenswrapper[4719]: I1009 15:36:26.120275 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6pwj\" (UniqueName: \"kubernetes.io/projected/f4a6c362-de01-454a-a0d8-7ea4c677720c-kube-api-access-l6pwj\") pod \"neutron-57c55b4b47-8npb9\" (UID: \"f4a6c362-de01-454a-a0d8-7ea4c677720c\") " pod="openstack/neutron-57c55b4b47-8npb9" Oct 09 15:36:26 crc kubenswrapper[4719]: I1009 15:36:26.125291 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b877f86f8-fptrv" event={"ID":"726781c9-066e-4914-bc4f-e2fc8fcef741","Type":"ContainerStarted","Data":"7286c4c73599853a7da2ac66ebab13882f34c6393b4d517865183fbde12f3e53"} Oct 09 15:36:26 crc kubenswrapper[4719]: I1009 15:36:26.126624 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-58ddf56cd8-cl6cc" event={"ID":"2f1b276e-3d3d-42c4-a107-53af7102e33e","Type":"ContainerStarted","Data":"8a9a8bd7731aff4e8aa55610e08f7f7d6f7bbfdbd48a34009512be7d9e828436"} Oct 09 15:36:26 crc kubenswrapper[4719]: I1009 15:36:26.127793 4719 generic.go:334] "Generic (PLEG): container finished" podID="9a01d050-b1bc-4b48-a783-64a36c24ad6e" containerID="c835ad72851233f80e8a0b0eb5ed2019a1226f90d7dd9de3d34edb26f6e4d377" exitCode=0 Oct 09 15:36:26 crc kubenswrapper[4719]: I1009 15:36:26.127833 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d78b7c8c7-8l5xz" event={"ID":"9a01d050-b1bc-4b48-a783-64a36c24ad6e","Type":"ContainerDied","Data":"c835ad72851233f80e8a0b0eb5ed2019a1226f90d7dd9de3d34edb26f6e4d377"} Oct 09 15:36:26 crc kubenswrapper[4719]: I1009 
15:36:26.212804 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-57c55b4b47-8npb9" Oct 09 15:36:26 crc kubenswrapper[4719]: I1009 15:36:26.792655 4719 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 09 15:36:26 crc kubenswrapper[4719]: I1009 15:36:26.792993 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 09 15:36:26 crc kubenswrapper[4719]: I1009 15:36:26.793004 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 09 15:36:26 crc kubenswrapper[4719]: I1009 15:36:26.793016 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Oct 09 15:36:26 crc kubenswrapper[4719]: I1009 15:36:26.793743 4719 scope.go:117] "RemoveContainer" containerID="97c064a8c41ba55bab5a23f641b5b2c165c7994c2526df1d2692173bb8e72c4b" Oct 09 15:36:26 crc kubenswrapper[4719]: E1009 15:36:26.794009 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(75999b62-ce1b-4a9b-8507-c8af12441083)\"" pod="openstack/watcher-decision-engine-0" podUID="75999b62-ce1b-4a9b-8507-c8af12441083" Oct 09 15:36:27 crc kubenswrapper[4719]: I1009 15:36:27.086285 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 09 15:36:27 crc kubenswrapper[4719]: I1009 15:36:27.086338 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 09 15:36:27 crc kubenswrapper[4719]: I1009 15:36:27.183389 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-external-api-0" Oct 09 15:36:27 crc kubenswrapper[4719]: I1009 15:36:27.183467 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 09 15:36:27 crc kubenswrapper[4719]: I1009 15:36:27.183479 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 09 15:36:27 crc kubenswrapper[4719]: I1009 15:36:27.183495 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 09 15:36:27 crc kubenswrapper[4719]: I1009 15:36:27.239732 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 09 15:36:27 crc kubenswrapper[4719]: I1009 15:36:27.249961 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 09 15:36:28 crc kubenswrapper[4719]: I1009 15:36:28.174703 4719 generic.go:334] "Generic (PLEG): container finished" podID="590f0bbf-4518-4aa6-a71f-1f28b5f4e02a" containerID="0bb9bc0ec575175f114ad0572c1c1f37cede57bc3aa752d335d6140570036942" exitCode=0 Oct 09 15:36:28 crc kubenswrapper[4719]: I1009 15:36:28.176746 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a","Type":"ContainerDied","Data":"0bb9bc0ec575175f114ad0572c1c1f37cede57bc3aa752d335d6140570036942"} Oct 09 15:36:28 crc kubenswrapper[4719]: I1009 15:36:28.176780 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 09 15:36:28 crc kubenswrapper[4719]: I1009 15:36:28.176795 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 09 15:36:28 crc kubenswrapper[4719]: I1009 15:36:28.176804 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/glance-default-external-api-0" Oct 09 15:36:28 crc kubenswrapper[4719]: I1009 15:36:28.176812 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 09 15:36:28 crc kubenswrapper[4719]: I1009 15:36:28.557038 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5db5d6b746-l6xlx" Oct 09 15:36:28 crc kubenswrapper[4719]: I1009 15:36:28.558095 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5db5d6b746-l6xlx" Oct 09 15:36:29 crc kubenswrapper[4719]: I1009 15:36:29.245857 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68884fc79c-l59v5" event={"ID":"df7732ab-4355-4061-9994-87f7ac7e4dd9","Type":"ContainerDied","Data":"99ddf264138128a2d1ea5756be04e2ff930ea630be55d60d2041a29b160a859e"} Oct 09 15:36:29 crc kubenswrapper[4719]: I1009 15:36:29.246430 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99ddf264138128a2d1ea5756be04e2ff930ea630be55d60d2041a29b160a859e" Oct 09 15:36:29 crc kubenswrapper[4719]: I1009 15:36:29.273279 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68884fc79c-l59v5" Oct 09 15:36:29 crc kubenswrapper[4719]: I1009 15:36:29.365171 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df7732ab-4355-4061-9994-87f7ac7e4dd9-config\") pod \"df7732ab-4355-4061-9994-87f7ac7e4dd9\" (UID: \"df7732ab-4355-4061-9994-87f7ac7e4dd9\") " Oct 09 15:36:29 crc kubenswrapper[4719]: I1009 15:36:29.365231 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df7732ab-4355-4061-9994-87f7ac7e4dd9-dns-svc\") pod \"df7732ab-4355-4061-9994-87f7ac7e4dd9\" (UID: \"df7732ab-4355-4061-9994-87f7ac7e4dd9\") " Oct 09 15:36:29 crc kubenswrapper[4719]: I1009 15:36:29.365306 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzc55\" (UniqueName: \"kubernetes.io/projected/df7732ab-4355-4061-9994-87f7ac7e4dd9-kube-api-access-qzc55\") pod \"df7732ab-4355-4061-9994-87f7ac7e4dd9\" (UID: \"df7732ab-4355-4061-9994-87f7ac7e4dd9\") " Oct 09 15:36:29 crc kubenswrapper[4719]: I1009 15:36:29.365327 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df7732ab-4355-4061-9994-87f7ac7e4dd9-ovsdbserver-sb\") pod \"df7732ab-4355-4061-9994-87f7ac7e4dd9\" (UID: \"df7732ab-4355-4061-9994-87f7ac7e4dd9\") " Oct 09 15:36:29 crc kubenswrapper[4719]: I1009 15:36:29.365429 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df7732ab-4355-4061-9994-87f7ac7e4dd9-dns-swift-storage-0\") pod \"df7732ab-4355-4061-9994-87f7ac7e4dd9\" (UID: \"df7732ab-4355-4061-9994-87f7ac7e4dd9\") " Oct 09 15:36:29 crc kubenswrapper[4719]: I1009 15:36:29.365476 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/df7732ab-4355-4061-9994-87f7ac7e4dd9-ovsdbserver-nb\") pod \"df7732ab-4355-4061-9994-87f7ac7e4dd9\" (UID: \"df7732ab-4355-4061-9994-87f7ac7e4dd9\") " Oct 09 15:36:29 crc kubenswrapper[4719]: I1009 15:36:29.390543 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df7732ab-4355-4061-9994-87f7ac7e4dd9-kube-api-access-qzc55" (OuterVolumeSpecName: "kube-api-access-qzc55") pod "df7732ab-4355-4061-9994-87f7ac7e4dd9" (UID: "df7732ab-4355-4061-9994-87f7ac7e4dd9"). InnerVolumeSpecName "kube-api-access-qzc55". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:36:29 crc kubenswrapper[4719]: I1009 15:36:29.399188 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 15:36:29 crc kubenswrapper[4719]: I1009 15:36:29.467307 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-combined-ca-bundle\") pod \"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a\" (UID: \"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a\") " Oct 09 15:36:29 crc kubenswrapper[4719]: I1009 15:36:29.467382 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-scripts\") pod \"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a\" (UID: \"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a\") " Oct 09 15:36:29 crc kubenswrapper[4719]: I1009 15:36:29.467445 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-run-httpd\") pod \"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a\" (UID: \"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a\") " Oct 09 15:36:29 crc kubenswrapper[4719]: I1009 15:36:29.467525 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-log-httpd\") pod \"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a\" (UID: \"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a\") " Oct 09 15:36:29 crc kubenswrapper[4719]: I1009 15:36:29.467550 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-config-data\") pod \"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a\" (UID: \"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a\") " Oct 09 15:36:29 crc kubenswrapper[4719]: I1009 15:36:29.467696 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-sg-core-conf-yaml\") pod \"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a\" (UID: \"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a\") " Oct 09 15:36:29 crc kubenswrapper[4719]: I1009 15:36:29.467733 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kj48h\" (UniqueName: \"kubernetes.io/projected/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-kube-api-access-kj48h\") pod \"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a\" (UID: \"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a\") " Oct 09 15:36:29 crc kubenswrapper[4719]: I1009 15:36:29.468375 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzc55\" (UniqueName: \"kubernetes.io/projected/df7732ab-4355-4061-9994-87f7ac7e4dd9-kube-api-access-qzc55\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:29 crc kubenswrapper[4719]: I1009 15:36:29.468784 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "590f0bbf-4518-4aa6-a71f-1f28b5f4e02a" (UID: "590f0bbf-4518-4aa6-a71f-1f28b5f4e02a"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:36:29 crc kubenswrapper[4719]: I1009 15:36:29.469156 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "590f0bbf-4518-4aa6-a71f-1f28b5f4e02a" (UID: "590f0bbf-4518-4aa6-a71f-1f28b5f4e02a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:36:29 crc kubenswrapper[4719]: I1009 15:36:29.498574 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-scripts" (OuterVolumeSpecName: "scripts") pod "590f0bbf-4518-4aa6-a71f-1f28b5f4e02a" (UID: "590f0bbf-4518-4aa6-a71f-1f28b5f4e02a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:36:29 crc kubenswrapper[4719]: I1009 15:36:29.544271 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-kube-api-access-kj48h" (OuterVolumeSpecName: "kube-api-access-kj48h") pod "590f0bbf-4518-4aa6-a71f-1f28b5f4e02a" (UID: "590f0bbf-4518-4aa6-a71f-1f28b5f4e02a"). InnerVolumeSpecName "kube-api-access-kj48h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:36:29 crc kubenswrapper[4719]: I1009 15:36:29.570112 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kj48h\" (UniqueName: \"kubernetes.io/projected/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-kube-api-access-kj48h\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:29 crc kubenswrapper[4719]: I1009 15:36:29.570417 4719 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:29 crc kubenswrapper[4719]: I1009 15:36:29.570426 4719 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:29 crc kubenswrapper[4719]: I1009 15:36:29.570433 4719 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:29 crc kubenswrapper[4719]: I1009 15:36:29.610969 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df7732ab-4355-4061-9994-87f7ac7e4dd9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "df7732ab-4355-4061-9994-87f7ac7e4dd9" (UID: "df7732ab-4355-4061-9994-87f7ac7e4dd9"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:36:29 crc kubenswrapper[4719]: I1009 15:36:29.672554 4719 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df7732ab-4355-4061-9994-87f7ac7e4dd9-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:29 crc kubenswrapper[4719]: I1009 15:36:29.692234 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-57c55b4b47-8npb9"] Oct 09 15:36:29 crc kubenswrapper[4719]: I1009 15:36:29.972112 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df7732ab-4355-4061-9994-87f7ac7e4dd9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "df7732ab-4355-4061-9994-87f7ac7e4dd9" (UID: "df7732ab-4355-4061-9994-87f7ac7e4dd9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:36:29 crc kubenswrapper[4719]: I1009 15:36:29.994403 4719 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df7732ab-4355-4061-9994-87f7ac7e4dd9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.015069 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df7732ab-4355-4061-9994-87f7ac7e4dd9-config" (OuterVolumeSpecName: "config") pod "df7732ab-4355-4061-9994-87f7ac7e4dd9" (UID: "df7732ab-4355-4061-9994-87f7ac7e4dd9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.018909 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df7732ab-4355-4061-9994-87f7ac7e4dd9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "df7732ab-4355-4061-9994-87f7ac7e4dd9" (UID: "df7732ab-4355-4061-9994-87f7ac7e4dd9"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.098464 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "590f0bbf-4518-4aa6-a71f-1f28b5f4e02a" (UID: "590f0bbf-4518-4aa6-a71f-1f28b5f4e02a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.098769 4719 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df7732ab-4355-4061-9994-87f7ac7e4dd9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.098793 4719 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.098805 4719 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df7732ab-4355-4061-9994-87f7ac7e4dd9-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.103238 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df7732ab-4355-4061-9994-87f7ac7e4dd9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "df7732ab-4355-4061-9994-87f7ac7e4dd9" (UID: "df7732ab-4355-4061-9994-87f7ac7e4dd9"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.118604 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "590f0bbf-4518-4aa6-a71f-1f28b5f4e02a" (UID: "590f0bbf-4518-4aa6-a71f-1f28b5f4e02a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.149500 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-config-data" (OuterVolumeSpecName: "config-data") pod "590f0bbf-4518-4aa6-a71f-1f28b5f4e02a" (UID: "590f0bbf-4518-4aa6-a71f-1f28b5f4e02a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.202029 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.202100 4719 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df7732ab-4355-4061-9994-87f7ac7e4dd9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.202115 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.354573 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d78b7c8c7-8l5xz" 
event={"ID":"9a01d050-b1bc-4b48-a783-64a36c24ad6e","Type":"ContainerStarted","Data":"3651b71e9e850278ec8a4d9fadb14df578a3764ebad9838f6b067a451a51bfb8"} Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.355193 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d78b7c8c7-8l5xz" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.373871 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57c55b4b47-8npb9" event={"ID":"f4a6c362-de01-454a-a0d8-7ea4c677720c","Type":"ContainerStarted","Data":"1cdf86f7c1fcf36571fe216afc939010d84f1302f32fe58d834e1aa91689c97b"} Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.387924 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-68857c4d7f-ns5gc" event={"ID":"e305acce-34be-4503-b643-b60e4201ecfa","Type":"ContainerStarted","Data":"e962de7617c52fd9c220e3bfee4a8c5caf10c2a1695c11dc5aff7c054fccf53d"} Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.391093 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b877f86f8-fptrv" event={"ID":"726781c9-066e-4914-bc4f-e2fc8fcef741","Type":"ContainerStarted","Data":"1addc745cbf2c4f77006cdc452693a1541a72e90039e4a0c1b9e1e0d96a2c110"} Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.391614 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7b877f86f8-fptrv" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.392086 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7b877f86f8-fptrv" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.395061 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d78b7c8c7-8l5xz" podStartSLOduration=7.395038927 podStartE2EDuration="7.395038927s" podCreationTimestamp="2025-10-09 15:36:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:36:30.381422354 +0000 UTC m=+1095.891133649" watchObservedRunningTime="2025-10-09 15:36:30.395038927 +0000 UTC m=+1095.904750222" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.396253 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"590f0bbf-4518-4aa6-a71f-1f28b5f4e02a","Type":"ContainerDied","Data":"d8f33df0dcfcb12578c1cfa83bf64cc6d0458f822f348ed5a5aac37ca7cc8ee4"} Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.396294 4719 scope.go:117] "RemoveContainer" containerID="222f69f2668e6b227d56fac0fc6789e58800e994b1e7d5f2ca3a77ed0f8b7e3c" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.396452 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.411675 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-58ddf56cd8-cl6cc" event={"ID":"2f1b276e-3d3d-42c4-a107-53af7102e33e","Type":"ContainerStarted","Data":"02b5608bc8584640713c4eedfaf3a3fb38e93cb2d85aed061067b0dff02f0aed"} Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.412666 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-58ddf56cd8-cl6cc" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.427618 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7ccff5b764-rskpw" event={"ID":"276c7ea1-10eb-4a7d-9eb1-50c62518b5b4","Type":"ContainerStarted","Data":"716332be0711dce7eb7e74cb4180e17b13cf588f6ff05fbe746795e754f31d27"} Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.427685 4719 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.427730 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68884fc79c-l59v5" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.429041 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7b877f86f8-fptrv" podStartSLOduration=7.429022066 podStartE2EDuration="7.429022066s" podCreationTimestamp="2025-10-09 15:36:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:36:30.412080868 +0000 UTC m=+1095.921792173" watchObservedRunningTime="2025-10-09 15:36:30.429022066 +0000 UTC m=+1095.938733351" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.450082 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-58ddf56cd8-cl6cc" podStartSLOduration=7.450061863 podStartE2EDuration="7.450061863s" podCreationTimestamp="2025-10-09 15:36:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:36:30.445931313 +0000 UTC m=+1095.955642598" watchObservedRunningTime="2025-10-09 15:36:30.450061863 +0000 UTC m=+1095.959773168" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.475711 4719 scope.go:117] "RemoveContainer" containerID="9410096f029088d510296b7fd6f73e792d5010bebbcf5a4b606890e48fccba01" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.579652 4719 scope.go:117] "RemoveContainer" containerID="0bb9bc0ec575175f114ad0572c1c1f37cede57bc3aa752d335d6140570036942" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.580856 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.619531 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.649422 4719 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ceilometer-0"] Oct 09 15:36:30 crc kubenswrapper[4719]: E1009 15:36:30.649902 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="590f0bbf-4518-4aa6-a71f-1f28b5f4e02a" containerName="ceilometer-notification-agent" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.649921 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="590f0bbf-4518-4aa6-a71f-1f28b5f4e02a" containerName="ceilometer-notification-agent" Oct 09 15:36:30 crc kubenswrapper[4719]: E1009 15:36:30.649936 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="590f0bbf-4518-4aa6-a71f-1f28b5f4e02a" containerName="proxy-httpd" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.649942 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="590f0bbf-4518-4aa6-a71f-1f28b5f4e02a" containerName="proxy-httpd" Oct 09 15:36:30 crc kubenswrapper[4719]: E1009 15:36:30.649954 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df7732ab-4355-4061-9994-87f7ac7e4dd9" containerName="init" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.649959 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="df7732ab-4355-4061-9994-87f7ac7e4dd9" containerName="init" Oct 09 15:36:30 crc kubenswrapper[4719]: E1009 15:36:30.649968 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="590f0bbf-4518-4aa6-a71f-1f28b5f4e02a" containerName="sg-core" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.649974 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="590f0bbf-4518-4aa6-a71f-1f28b5f4e02a" containerName="sg-core" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.650132 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="590f0bbf-4518-4aa6-a71f-1f28b5f4e02a" containerName="proxy-httpd" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.650147 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="590f0bbf-4518-4aa6-a71f-1f28b5f4e02a" 
containerName="ceilometer-notification-agent" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.650157 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="590f0bbf-4518-4aa6-a71f-1f28b5f4e02a" containerName="sg-core" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.650173 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="df7732ab-4355-4061-9994-87f7ac7e4dd9" containerName="init" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.652070 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.654551 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.661086 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.715642 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-config-data\") pod \"ceilometer-0\" (UID: \"9bfb5b45-7c9f-46aa-99ad-4011a81bb196\") " pod="openstack/ceilometer-0" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.715692 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-log-httpd\") pod \"ceilometer-0\" (UID: \"9bfb5b45-7c9f-46aa-99ad-4011a81bb196\") " pod="openstack/ceilometer-0" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.715732 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"9bfb5b45-7c9f-46aa-99ad-4011a81bb196\") " pod="openstack/ceilometer-0" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.715768 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-run-httpd\") pod \"ceilometer-0\" (UID: \"9bfb5b45-7c9f-46aa-99ad-4011a81bb196\") " pod="openstack/ceilometer-0" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.715836 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-scripts\") pod \"ceilometer-0\" (UID: \"9bfb5b45-7c9f-46aa-99ad-4011a81bb196\") " pod="openstack/ceilometer-0" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.715866 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbwsp\" (UniqueName: \"kubernetes.io/projected/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-kube-api-access-zbwsp\") pod \"ceilometer-0\" (UID: \"9bfb5b45-7c9f-46aa-99ad-4011a81bb196\") " pod="openstack/ceilometer-0" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.715884 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9bfb5b45-7c9f-46aa-99ad-4011a81bb196\") " pod="openstack/ceilometer-0" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.755387 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.785407 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68884fc79c-l59v5"] Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.795641 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-68884fc79c-l59v5"] Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.815230 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7b4958cb64-65wft"] Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.816747 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7b4958cb64-65wft" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.818489 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-config-data\") pod \"ceilometer-0\" (UID: \"9bfb5b45-7c9f-46aa-99ad-4011a81bb196\") " pod="openstack/ceilometer-0" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.818531 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-log-httpd\") pod \"ceilometer-0\" (UID: \"9bfb5b45-7c9f-46aa-99ad-4011a81bb196\") " pod="openstack/ceilometer-0" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.818569 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9bfb5b45-7c9f-46aa-99ad-4011a81bb196\") " pod="openstack/ceilometer-0" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.818616 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-run-httpd\") pod \"ceilometer-0\" (UID: \"9bfb5b45-7c9f-46aa-99ad-4011a81bb196\") " pod="openstack/ceilometer-0" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.818683 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-scripts\") pod \"ceilometer-0\" (UID: \"9bfb5b45-7c9f-46aa-99ad-4011a81bb196\") " pod="openstack/ceilometer-0" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.818712 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbwsp\" (UniqueName: \"kubernetes.io/projected/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-kube-api-access-zbwsp\") pod \"ceilometer-0\" (UID: \"9bfb5b45-7c9f-46aa-99ad-4011a81bb196\") " pod="openstack/ceilometer-0" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.818731 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9bfb5b45-7c9f-46aa-99ad-4011a81bb196\") " pod="openstack/ceilometer-0" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.821652 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.821865 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.822794 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-log-httpd\") pod \"ceilometer-0\" (UID: \"9bfb5b45-7c9f-46aa-99ad-4011a81bb196\") " pod="openstack/ceilometer-0" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.823344 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-run-httpd\") pod \"ceilometer-0\" (UID: \"9bfb5b45-7c9f-46aa-99ad-4011a81bb196\") " pod="openstack/ceilometer-0" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.837491 4719 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9bfb5b45-7c9f-46aa-99ad-4011a81bb196\") " pod="openstack/ceilometer-0" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.871435 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7b4958cb64-65wft"] Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.902386 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbwsp\" (UniqueName: \"kubernetes.io/projected/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-kube-api-access-zbwsp\") pod \"ceilometer-0\" (UID: \"9bfb5b45-7c9f-46aa-99ad-4011a81bb196\") " pod="openstack/ceilometer-0" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.937289 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a621c39-47f2-4b25-ac34-cf712d8b27c3-public-tls-certs\") pod \"barbican-api-7b4958cb64-65wft\" (UID: \"0a621c39-47f2-4b25-ac34-cf712d8b27c3\") " pod="openstack/barbican-api-7b4958cb64-65wft" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.937447 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a621c39-47f2-4b25-ac34-cf712d8b27c3-combined-ca-bundle\") pod \"barbican-api-7b4958cb64-65wft\" (UID: \"0a621c39-47f2-4b25-ac34-cf712d8b27c3\") " pod="openstack/barbican-api-7b4958cb64-65wft" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.937566 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmj99\" (UniqueName: \"kubernetes.io/projected/0a621c39-47f2-4b25-ac34-cf712d8b27c3-kube-api-access-kmj99\") pod \"barbican-api-7b4958cb64-65wft\" (UID: 
\"0a621c39-47f2-4b25-ac34-cf712d8b27c3\") " pod="openstack/barbican-api-7b4958cb64-65wft" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.937724 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a621c39-47f2-4b25-ac34-cf712d8b27c3-config-data\") pod \"barbican-api-7b4958cb64-65wft\" (UID: \"0a621c39-47f2-4b25-ac34-cf712d8b27c3\") " pod="openstack/barbican-api-7b4958cb64-65wft" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.937784 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a621c39-47f2-4b25-ac34-cf712d8b27c3-config-data-custom\") pod \"barbican-api-7b4958cb64-65wft\" (UID: \"0a621c39-47f2-4b25-ac34-cf712d8b27c3\") " pod="openstack/barbican-api-7b4958cb64-65wft" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.937813 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a621c39-47f2-4b25-ac34-cf712d8b27c3-logs\") pod \"barbican-api-7b4958cb64-65wft\" (UID: \"0a621c39-47f2-4b25-ac34-cf712d8b27c3\") " pod="openstack/barbican-api-7b4958cb64-65wft" Oct 09 15:36:30 crc kubenswrapper[4719]: I1009 15:36:30.937857 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a621c39-47f2-4b25-ac34-cf712d8b27c3-internal-tls-certs\") pod \"barbican-api-7b4958cb64-65wft\" (UID: \"0a621c39-47f2-4b25-ac34-cf712d8b27c3\") " pod="openstack/barbican-api-7b4958cb64-65wft" Oct 09 15:36:31 crc kubenswrapper[4719]: I1009 15:36:31.050522 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a621c39-47f2-4b25-ac34-cf712d8b27c3-config-data\") pod \"barbican-api-7b4958cb64-65wft\" 
(UID: \"0a621c39-47f2-4b25-ac34-cf712d8b27c3\") " pod="openstack/barbican-api-7b4958cb64-65wft" Oct 09 15:36:31 crc kubenswrapper[4719]: I1009 15:36:31.050569 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a621c39-47f2-4b25-ac34-cf712d8b27c3-config-data-custom\") pod \"barbican-api-7b4958cb64-65wft\" (UID: \"0a621c39-47f2-4b25-ac34-cf712d8b27c3\") " pod="openstack/barbican-api-7b4958cb64-65wft" Oct 09 15:36:31 crc kubenswrapper[4719]: I1009 15:36:31.050589 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a621c39-47f2-4b25-ac34-cf712d8b27c3-logs\") pod \"barbican-api-7b4958cb64-65wft\" (UID: \"0a621c39-47f2-4b25-ac34-cf712d8b27c3\") " pod="openstack/barbican-api-7b4958cb64-65wft" Oct 09 15:36:31 crc kubenswrapper[4719]: I1009 15:36:31.050610 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a621c39-47f2-4b25-ac34-cf712d8b27c3-internal-tls-certs\") pod \"barbican-api-7b4958cb64-65wft\" (UID: \"0a621c39-47f2-4b25-ac34-cf712d8b27c3\") " pod="openstack/barbican-api-7b4958cb64-65wft" Oct 09 15:36:31 crc kubenswrapper[4719]: I1009 15:36:31.050656 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a621c39-47f2-4b25-ac34-cf712d8b27c3-public-tls-certs\") pod \"barbican-api-7b4958cb64-65wft\" (UID: \"0a621c39-47f2-4b25-ac34-cf712d8b27c3\") " pod="openstack/barbican-api-7b4958cb64-65wft" Oct 09 15:36:31 crc kubenswrapper[4719]: I1009 15:36:31.050699 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a621c39-47f2-4b25-ac34-cf712d8b27c3-combined-ca-bundle\") pod \"barbican-api-7b4958cb64-65wft\" (UID: \"0a621c39-47f2-4b25-ac34-cf712d8b27c3\") 
" pod="openstack/barbican-api-7b4958cb64-65wft" Oct 09 15:36:31 crc kubenswrapper[4719]: I1009 15:36:31.050743 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmj99\" (UniqueName: \"kubernetes.io/projected/0a621c39-47f2-4b25-ac34-cf712d8b27c3-kube-api-access-kmj99\") pod \"barbican-api-7b4958cb64-65wft\" (UID: \"0a621c39-47f2-4b25-ac34-cf712d8b27c3\") " pod="openstack/barbican-api-7b4958cb64-65wft" Oct 09 15:36:31 crc kubenswrapper[4719]: I1009 15:36:31.058277 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a621c39-47f2-4b25-ac34-cf712d8b27c3-logs\") pod \"barbican-api-7b4958cb64-65wft\" (UID: \"0a621c39-47f2-4b25-ac34-cf712d8b27c3\") " pod="openstack/barbican-api-7b4958cb64-65wft" Oct 09 15:36:31 crc kubenswrapper[4719]: I1009 15:36:31.062686 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a621c39-47f2-4b25-ac34-cf712d8b27c3-config-data-custom\") pod \"barbican-api-7b4958cb64-65wft\" (UID: \"0a621c39-47f2-4b25-ac34-cf712d8b27c3\") " pod="openstack/barbican-api-7b4958cb64-65wft" Oct 09 15:36:31 crc kubenswrapper[4719]: I1009 15:36:31.064952 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a621c39-47f2-4b25-ac34-cf712d8b27c3-combined-ca-bundle\") pod \"barbican-api-7b4958cb64-65wft\" (UID: \"0a621c39-47f2-4b25-ac34-cf712d8b27c3\") " pod="openstack/barbican-api-7b4958cb64-65wft" Oct 09 15:36:31 crc kubenswrapper[4719]: I1009 15:36:31.067081 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a621c39-47f2-4b25-ac34-cf712d8b27c3-config-data\") pod \"barbican-api-7b4958cb64-65wft\" (UID: \"0a621c39-47f2-4b25-ac34-cf712d8b27c3\") " pod="openstack/barbican-api-7b4958cb64-65wft" Oct 09 15:36:31 crc 
kubenswrapper[4719]: I1009 15:36:31.075027 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9bfb5b45-7c9f-46aa-99ad-4011a81bb196\") " pod="openstack/ceilometer-0" Oct 09 15:36:31 crc kubenswrapper[4719]: I1009 15:36:31.075423 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-config-data\") pod \"ceilometer-0\" (UID: \"9bfb5b45-7c9f-46aa-99ad-4011a81bb196\") " pod="openstack/ceilometer-0" Oct 09 15:36:31 crc kubenswrapper[4719]: I1009 15:36:31.097700 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-scripts\") pod \"ceilometer-0\" (UID: \"9bfb5b45-7c9f-46aa-99ad-4011a81bb196\") " pod="openstack/ceilometer-0" Oct 09 15:36:31 crc kubenswrapper[4719]: I1009 15:36:31.106677 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a621c39-47f2-4b25-ac34-cf712d8b27c3-public-tls-certs\") pod \"barbican-api-7b4958cb64-65wft\" (UID: \"0a621c39-47f2-4b25-ac34-cf712d8b27c3\") " pod="openstack/barbican-api-7b4958cb64-65wft" Oct 09 15:36:31 crc kubenswrapper[4719]: I1009 15:36:31.107194 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a621c39-47f2-4b25-ac34-cf712d8b27c3-internal-tls-certs\") pod \"barbican-api-7b4958cb64-65wft\" (UID: \"0a621c39-47f2-4b25-ac34-cf712d8b27c3\") " pod="openstack/barbican-api-7b4958cb64-65wft" Oct 09 15:36:31 crc kubenswrapper[4719]: I1009 15:36:31.117818 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmj99\" (UniqueName: 
\"kubernetes.io/projected/0a621c39-47f2-4b25-ac34-cf712d8b27c3-kube-api-access-kmj99\") pod \"barbican-api-7b4958cb64-65wft\" (UID: \"0a621c39-47f2-4b25-ac34-cf712d8b27c3\") " pod="openstack/barbican-api-7b4958cb64-65wft" Oct 09 15:36:31 crc kubenswrapper[4719]: I1009 15:36:31.167986 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7b4958cb64-65wft" Oct 09 15:36:31 crc kubenswrapper[4719]: I1009 15:36:31.272300 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="590f0bbf-4518-4aa6-a71f-1f28b5f4e02a" path="/var/lib/kubelet/pods/590f0bbf-4518-4aa6-a71f-1f28b5f4e02a/volumes" Oct 09 15:36:31 crc kubenswrapper[4719]: I1009 15:36:31.273416 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df7732ab-4355-4061-9994-87f7ac7e4dd9" path="/var/lib/kubelet/pods/df7732ab-4355-4061-9994-87f7ac7e4dd9/volumes" Oct 09 15:36:31 crc kubenswrapper[4719]: I1009 15:36:31.276010 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 15:36:31 crc kubenswrapper[4719]: I1009 15:36:31.459672 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-ztgbm" event={"ID":"e899b0de-03a2-44a5-a165-25c988e8489d","Type":"ContainerStarted","Data":"aa349a351ba258207f2c6d303e05f5727dac541b5a9c2d7f825f913a1963c5ea"} Oct 09 15:36:31 crc kubenswrapper[4719]: I1009 15:36:31.506489 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7ccff5b764-rskpw" event={"ID":"276c7ea1-10eb-4a7d-9eb1-50c62518b5b4","Type":"ContainerStarted","Data":"5ca252ca6f48c6daca6aaf40a6b6d4e9cba766392b746d75fcaf273e03838450"} Oct 09 15:36:31 crc kubenswrapper[4719]: I1009 15:36:31.520409 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57c55b4b47-8npb9" event={"ID":"f4a6c362-de01-454a-a0d8-7ea4c677720c","Type":"ContainerStarted","Data":"7a4c0c3cdffb480fe11cd2c367a42b876278cc0d33dac7b2085105f0d0b52cd3"} Oct 09 15:36:31 crc kubenswrapper[4719]: I1009 15:36:31.528216 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-68857c4d7f-ns5gc" event={"ID":"e305acce-34be-4503-b643-b60e4201ecfa","Type":"ContainerStarted","Data":"a0fce183546f5def134e094c95ca27199202d0437434a1f5e673a32b455cea09"} Oct 09 15:36:31 crc kubenswrapper[4719]: I1009 15:36:31.558908 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-ztgbm" podStartSLOduration=15.475213185 podStartE2EDuration="1m2.55888991s" podCreationTimestamp="2025-10-09 15:35:29 +0000 UTC" firstStartedPulling="2025-10-09 15:35:42.004848629 +0000 UTC m=+1047.514559914" lastFinishedPulling="2025-10-09 15:36:29.088525354 +0000 UTC m=+1094.598236639" observedRunningTime="2025-10-09 15:36:31.501840389 +0000 UTC m=+1097.011551694" watchObservedRunningTime="2025-10-09 15:36:31.55888991 +0000 UTC m=+1097.068601195" Oct 09 15:36:31 crc kubenswrapper[4719]: I1009 
15:36:31.621442 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7ccff5b764-rskpw" podStartSLOduration=4.746545624 podStartE2EDuration="9.621417565s" podCreationTimestamp="2025-10-09 15:36:22 +0000 UTC" firstStartedPulling="2025-10-09 15:36:24.25798462 +0000 UTC m=+1089.767695905" lastFinishedPulling="2025-10-09 15:36:29.132856561 +0000 UTC m=+1094.642567846" observedRunningTime="2025-10-09 15:36:31.53684984 +0000 UTC m=+1097.046561125" watchObservedRunningTime="2025-10-09 15:36:31.621417565 +0000 UTC m=+1097.131128850" Oct 09 15:36:31 crc kubenswrapper[4719]: I1009 15:36:31.630676 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-68857c4d7f-ns5gc" podStartSLOduration=4.674391613 podStartE2EDuration="9.630658149s" podCreationTimestamp="2025-10-09 15:36:22 +0000 UTC" firstStartedPulling="2025-10-09 15:36:24.109153044 +0000 UTC m=+1089.618864329" lastFinishedPulling="2025-10-09 15:36:29.06541958 +0000 UTC m=+1094.575130865" observedRunningTime="2025-10-09 15:36:31.5743116 +0000 UTC m=+1097.084022895" watchObservedRunningTime="2025-10-09 15:36:31.630658149 +0000 UTC m=+1097.140369424" Oct 09 15:36:31 crc kubenswrapper[4719]: I1009 15:36:31.872517 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-69dbc5fbc7-t286g" Oct 09 15:36:31 crc kubenswrapper[4719]: I1009 15:36:31.964050 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7b4958cb64-65wft"] Oct 09 15:36:31 crc kubenswrapper[4719]: W1009 15:36:31.972429 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a621c39_47f2_4b25_ac34_cf712d8b27c3.slice/crio-aae8982a1e75eb95b0a04becab5e3528e498f34afcfe28d66fdcd2786f1eb8f6 WatchSource:0}: Error finding container aae8982a1e75eb95b0a04becab5e3528e498f34afcfe28d66fdcd2786f1eb8f6: Status 404 returned 
error can't find the container with id aae8982a1e75eb95b0a04becab5e3528e498f34afcfe28d66fdcd2786f1eb8f6 Oct 09 15:36:32 crc kubenswrapper[4719]: I1009 15:36:32.127467 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 15:36:32 crc kubenswrapper[4719]: W1009 15:36:32.150735 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9bfb5b45_7c9f_46aa_99ad_4011a81bb196.slice/crio-ce00b6775b0901280024ab84e319001b04b547e806bae0d76fc136d868fddacf WatchSource:0}: Error finding container ce00b6775b0901280024ab84e319001b04b547e806bae0d76fc136d868fddacf: Status 404 returned error can't find the container with id ce00b6775b0901280024ab84e319001b04b547e806bae0d76fc136d868fddacf Oct 09 15:36:32 crc kubenswrapper[4719]: I1009 15:36:32.256534 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 09 15:36:32 crc kubenswrapper[4719]: I1009 15:36:32.257901 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 09 15:36:32 crc kubenswrapper[4719]: I1009 15:36:32.265510 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 09 15:36:32 crc kubenswrapper[4719]: I1009 15:36:32.265708 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 09 15:36:32 crc kubenswrapper[4719]: I1009 15:36:32.265726 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-7lsmf" Oct 09 15:36:32 crc kubenswrapper[4719]: I1009 15:36:32.284759 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 09 15:36:32 crc kubenswrapper[4719]: I1009 15:36:32.319518 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/893b05af-4bf3-4c76-940c-3ed1cceb7e18-openstack-config\") pod \"openstackclient\" (UID: \"893b05af-4bf3-4c76-940c-3ed1cceb7e18\") " pod="openstack/openstackclient" Oct 09 15:36:32 crc kubenswrapper[4719]: I1009 15:36:32.319883 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/893b05af-4bf3-4c76-940c-3ed1cceb7e18-openstack-config-secret\") pod \"openstackclient\" (UID: \"893b05af-4bf3-4c76-940c-3ed1cceb7e18\") " pod="openstack/openstackclient" Oct 09 15:36:32 crc kubenswrapper[4719]: I1009 15:36:32.320074 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn9d2\" (UniqueName: \"kubernetes.io/projected/893b05af-4bf3-4c76-940c-3ed1cceb7e18-kube-api-access-rn9d2\") pod \"openstackclient\" (UID: \"893b05af-4bf3-4c76-940c-3ed1cceb7e18\") " pod="openstack/openstackclient" Oct 09 15:36:32 crc kubenswrapper[4719]: I1009 15:36:32.320211 4719 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/893b05af-4bf3-4c76-940c-3ed1cceb7e18-combined-ca-bundle\") pod \"openstackclient\" (UID: \"893b05af-4bf3-4c76-940c-3ed1cceb7e18\") " pod="openstack/openstackclient" Oct 09 15:36:32 crc kubenswrapper[4719]: I1009 15:36:32.382202 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7457564986-k28cv" podUID="d16f8bb5-9ca5-4042-ae67-756c79d00217" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.161:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.161:8443: connect: connection refused" Oct 09 15:36:32 crc kubenswrapper[4719]: I1009 15:36:32.382639 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7457564986-k28cv" Oct 09 15:36:32 crc kubenswrapper[4719]: I1009 15:36:32.422018 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/893b05af-4bf3-4c76-940c-3ed1cceb7e18-openstack-config\") pod \"openstackclient\" (UID: \"893b05af-4bf3-4c76-940c-3ed1cceb7e18\") " pod="openstack/openstackclient" Oct 09 15:36:32 crc kubenswrapper[4719]: I1009 15:36:32.422104 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/893b05af-4bf3-4c76-940c-3ed1cceb7e18-openstack-config-secret\") pod \"openstackclient\" (UID: \"893b05af-4bf3-4c76-940c-3ed1cceb7e18\") " pod="openstack/openstackclient" Oct 09 15:36:32 crc kubenswrapper[4719]: I1009 15:36:32.422195 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn9d2\" (UniqueName: \"kubernetes.io/projected/893b05af-4bf3-4c76-940c-3ed1cceb7e18-kube-api-access-rn9d2\") pod \"openstackclient\" (UID: \"893b05af-4bf3-4c76-940c-3ed1cceb7e18\") " 
pod="openstack/openstackclient" Oct 09 15:36:32 crc kubenswrapper[4719]: I1009 15:36:32.422225 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/893b05af-4bf3-4c76-940c-3ed1cceb7e18-combined-ca-bundle\") pod \"openstackclient\" (UID: \"893b05af-4bf3-4c76-940c-3ed1cceb7e18\") " pod="openstack/openstackclient" Oct 09 15:36:32 crc kubenswrapper[4719]: I1009 15:36:32.424249 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/893b05af-4bf3-4c76-940c-3ed1cceb7e18-openstack-config\") pod \"openstackclient\" (UID: \"893b05af-4bf3-4c76-940c-3ed1cceb7e18\") " pod="openstack/openstackclient" Oct 09 15:36:32 crc kubenswrapper[4719]: I1009 15:36:32.425783 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/893b05af-4bf3-4c76-940c-3ed1cceb7e18-openstack-config-secret\") pod \"openstackclient\" (UID: \"893b05af-4bf3-4c76-940c-3ed1cceb7e18\") " pod="openstack/openstackclient" Oct 09 15:36:32 crc kubenswrapper[4719]: I1009 15:36:32.441057 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/893b05af-4bf3-4c76-940c-3ed1cceb7e18-combined-ca-bundle\") pod \"openstackclient\" (UID: \"893b05af-4bf3-4c76-940c-3ed1cceb7e18\") " pod="openstack/openstackclient" Oct 09 15:36:32 crc kubenswrapper[4719]: I1009 15:36:32.450964 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn9d2\" (UniqueName: \"kubernetes.io/projected/893b05af-4bf3-4c76-940c-3ed1cceb7e18-kube-api-access-rn9d2\") pod \"openstackclient\" (UID: \"893b05af-4bf3-4c76-940c-3ed1cceb7e18\") " pod="openstack/openstackclient" Oct 09 15:36:32 crc kubenswrapper[4719]: I1009 15:36:32.542515 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"9bfb5b45-7c9f-46aa-99ad-4011a81bb196","Type":"ContainerStarted","Data":"ce00b6775b0901280024ab84e319001b04b547e806bae0d76fc136d868fddacf"} Oct 09 15:36:32 crc kubenswrapper[4719]: I1009 15:36:32.544850 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b4958cb64-65wft" event={"ID":"0a621c39-47f2-4b25-ac34-cf712d8b27c3","Type":"ContainerStarted","Data":"625e897c68fd1da4d003e9f689c552e4b540eebd7ad5424a08948989e581d8fc"} Oct 09 15:36:32 crc kubenswrapper[4719]: I1009 15:36:32.544886 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b4958cb64-65wft" event={"ID":"0a621c39-47f2-4b25-ac34-cf712d8b27c3","Type":"ContainerStarted","Data":"aae8982a1e75eb95b0a04becab5e3528e498f34afcfe28d66fdcd2786f1eb8f6"} Oct 09 15:36:32 crc kubenswrapper[4719]: I1009 15:36:32.551903 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57c55b4b47-8npb9" event={"ID":"f4a6c362-de01-454a-a0d8-7ea4c677720c","Type":"ContainerStarted","Data":"2ec48bda176000ce6e8a3eaa5422a6344cbae1833b83580655e4198890c5d3a3"} Oct 09 15:36:32 crc kubenswrapper[4719]: I1009 15:36:32.552644 4719 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 15:36:32 crc kubenswrapper[4719]: I1009 15:36:32.553484 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-57c55b4b47-8npb9" Oct 09 15:36:32 crc kubenswrapper[4719]: I1009 15:36:32.594886 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-57c55b4b47-8npb9" podStartSLOduration=7.594864283 podStartE2EDuration="7.594864283s" podCreationTimestamp="2025-10-09 15:36:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:36:32.589769761 +0000 UTC m=+1098.099481056" watchObservedRunningTime="2025-10-09 15:36:32.594864283 +0000 UTC 
m=+1098.104575558" Oct 09 15:36:32 crc kubenswrapper[4719]: I1009 15:36:32.666245 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 09 15:36:32 crc kubenswrapper[4719]: I1009 15:36:32.667958 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 09 15:36:32 crc kubenswrapper[4719]: I1009 15:36:32.673969 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 09 15:36:32 crc kubenswrapper[4719]: I1009 15:36:32.928389 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 09 15:36:32 crc kubenswrapper[4719]: I1009 15:36:32.928890 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 09 15:36:33 crc kubenswrapper[4719]: I1009 15:36:33.524479 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 09 15:36:33 crc kubenswrapper[4719]: I1009 15:36:33.576784 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"893b05af-4bf3-4c76-940c-3ed1cceb7e18","Type":"ContainerStarted","Data":"a2b6f94d76604cea6f04e6a29283e742b60f0992974c16acc7b5541ffa638aa4"} Oct 09 15:36:33 crc kubenswrapper[4719]: I1009 15:36:33.589765 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9bfb5b45-7c9f-46aa-99ad-4011a81bb196","Type":"ContainerStarted","Data":"e69caa09e16808f6ca6d9bad1a134dbd5ce2e6e06249d20d1636d4f99f3dc051"} Oct 09 15:36:33 crc kubenswrapper[4719]: I1009 15:36:33.589817 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9bfb5b45-7c9f-46aa-99ad-4011a81bb196","Type":"ContainerStarted","Data":"73825614b680bbd5ce91c83fcca82f1194c982a9e5220e5af0c1a7c518e59c9d"} Oct 09 15:36:33 crc 
kubenswrapper[4719]: I1009 15:36:33.596061 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b4958cb64-65wft" event={"ID":"0a621c39-47f2-4b25-ac34-cf712d8b27c3","Type":"ContainerStarted","Data":"9803f4e4bdd3729f2bc07e8e42d3a50f62c48f891999c15551025eae9405fbb7"} Oct 09 15:36:33 crc kubenswrapper[4719]: I1009 15:36:33.596244 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7b4958cb64-65wft" Oct 09 15:36:33 crc kubenswrapper[4719]: I1009 15:36:33.597241 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7b4958cb64-65wft" Oct 09 15:36:33 crc kubenswrapper[4719]: I1009 15:36:33.630282 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7b4958cb64-65wft" podStartSLOduration=3.630259407 podStartE2EDuration="3.630259407s" podCreationTimestamp="2025-10-09 15:36:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:36:33.625895629 +0000 UTC m=+1099.135606924" watchObservedRunningTime="2025-10-09 15:36:33.630259407 +0000 UTC m=+1099.139970692" Oct 09 15:36:33 crc kubenswrapper[4719]: I1009 15:36:33.964590 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7b877f86f8-fptrv" Oct 09 15:36:34 crc kubenswrapper[4719]: I1009 15:36:34.623134 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9bfb5b45-7c9f-46aa-99ad-4011a81bb196","Type":"ContainerStarted","Data":"18c33dd5b6d35bf57afe74f800ff13811b3b8acb9407caced78a862213cd541b"} Oct 09 15:36:36 crc kubenswrapper[4719]: I1009 15:36:36.643140 4719 generic.go:334] "Generic (PLEG): container finished" podID="d16f8bb5-9ca5-4042-ae67-756c79d00217" containerID="b066db9dcb69b83cf18678779a349eae16c407271c43f6f1d333a2cd48b9227c" exitCode=137 Oct 09 15:36:36 crc 
kubenswrapper[4719]: I1009 15:36:36.643695 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7457564986-k28cv" event={"ID":"d16f8bb5-9ca5-4042-ae67-756c79d00217","Type":"ContainerDied","Data":"b066db9dcb69b83cf18678779a349eae16c407271c43f6f1d333a2cd48b9227c"} Oct 09 15:36:36 crc kubenswrapper[4719]: I1009 15:36:36.653696 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9bfb5b45-7c9f-46aa-99ad-4011a81bb196","Type":"ContainerStarted","Data":"6badb6bbb59aa7562a1599a36734b995ec0a0f476f59d31df6c1876731ae2a89"} Oct 09 15:36:36 crc kubenswrapper[4719]: I1009 15:36:36.655189 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 09 15:36:36 crc kubenswrapper[4719]: I1009 15:36:36.692648 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.97809161 podStartE2EDuration="6.69262715s" podCreationTimestamp="2025-10-09 15:36:30 +0000 UTC" firstStartedPulling="2025-10-09 15:36:32.158519458 +0000 UTC m=+1097.668230743" lastFinishedPulling="2025-10-09 15:36:35.873054998 +0000 UTC m=+1101.382766283" observedRunningTime="2025-10-09 15:36:36.679492483 +0000 UTC m=+1102.189203768" watchObservedRunningTime="2025-10-09 15:36:36.69262715 +0000 UTC m=+1102.202338435" Oct 09 15:36:36 crc kubenswrapper[4719]: I1009 15:36:36.948404 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7457564986-k28cv" Oct 09 15:36:37 crc kubenswrapper[4719]: I1009 15:36:37.057460 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d16f8bb5-9ca5-4042-ae67-756c79d00217-horizon-tls-certs\") pod \"d16f8bb5-9ca5-4042-ae67-756c79d00217\" (UID: \"d16f8bb5-9ca5-4042-ae67-756c79d00217\") " Oct 09 15:36:37 crc kubenswrapper[4719]: I1009 15:36:37.057536 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d16f8bb5-9ca5-4042-ae67-756c79d00217-config-data\") pod \"d16f8bb5-9ca5-4042-ae67-756c79d00217\" (UID: \"d16f8bb5-9ca5-4042-ae67-756c79d00217\") " Oct 09 15:36:37 crc kubenswrapper[4719]: I1009 15:36:37.057589 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d16f8bb5-9ca5-4042-ae67-756c79d00217-combined-ca-bundle\") pod \"d16f8bb5-9ca5-4042-ae67-756c79d00217\" (UID: \"d16f8bb5-9ca5-4042-ae67-756c79d00217\") " Oct 09 15:36:37 crc kubenswrapper[4719]: I1009 15:36:37.057630 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d16f8bb5-9ca5-4042-ae67-756c79d00217-horizon-secret-key\") pod \"d16f8bb5-9ca5-4042-ae67-756c79d00217\" (UID: \"d16f8bb5-9ca5-4042-ae67-756c79d00217\") " Oct 09 15:36:37 crc kubenswrapper[4719]: I1009 15:36:37.057707 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d16f8bb5-9ca5-4042-ae67-756c79d00217-logs\") pod \"d16f8bb5-9ca5-4042-ae67-756c79d00217\" (UID: \"d16f8bb5-9ca5-4042-ae67-756c79d00217\") " Oct 09 15:36:37 crc kubenswrapper[4719]: I1009 15:36:37.057743 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/d16f8bb5-9ca5-4042-ae67-756c79d00217-scripts\") pod \"d16f8bb5-9ca5-4042-ae67-756c79d00217\" (UID: \"d16f8bb5-9ca5-4042-ae67-756c79d00217\") " Oct 09 15:36:37 crc kubenswrapper[4719]: I1009 15:36:37.057844 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvf64\" (UniqueName: \"kubernetes.io/projected/d16f8bb5-9ca5-4042-ae67-756c79d00217-kube-api-access-cvf64\") pod \"d16f8bb5-9ca5-4042-ae67-756c79d00217\" (UID: \"d16f8bb5-9ca5-4042-ae67-756c79d00217\") " Oct 09 15:36:37 crc kubenswrapper[4719]: I1009 15:36:37.064502 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d16f8bb5-9ca5-4042-ae67-756c79d00217-logs" (OuterVolumeSpecName: "logs") pod "d16f8bb5-9ca5-4042-ae67-756c79d00217" (UID: "d16f8bb5-9ca5-4042-ae67-756c79d00217"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:36:37 crc kubenswrapper[4719]: I1009 15:36:37.067662 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d16f8bb5-9ca5-4042-ae67-756c79d00217-kube-api-access-cvf64" (OuterVolumeSpecName: "kube-api-access-cvf64") pod "d16f8bb5-9ca5-4042-ae67-756c79d00217" (UID: "d16f8bb5-9ca5-4042-ae67-756c79d00217"). InnerVolumeSpecName "kube-api-access-cvf64". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:36:37 crc kubenswrapper[4719]: I1009 15:36:37.076504 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d16f8bb5-9ca5-4042-ae67-756c79d00217-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "d16f8bb5-9ca5-4042-ae67-756c79d00217" (UID: "d16f8bb5-9ca5-4042-ae67-756c79d00217"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:36:37 crc kubenswrapper[4719]: I1009 15:36:37.115081 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d16f8bb5-9ca5-4042-ae67-756c79d00217-config-data" (OuterVolumeSpecName: "config-data") pod "d16f8bb5-9ca5-4042-ae67-756c79d00217" (UID: "d16f8bb5-9ca5-4042-ae67-756c79d00217"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:36:37 crc kubenswrapper[4719]: I1009 15:36:37.115777 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d16f8bb5-9ca5-4042-ae67-756c79d00217-scripts" (OuterVolumeSpecName: "scripts") pod "d16f8bb5-9ca5-4042-ae67-756c79d00217" (UID: "d16f8bb5-9ca5-4042-ae67-756c79d00217"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:36:37 crc kubenswrapper[4719]: I1009 15:36:37.152394 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d16f8bb5-9ca5-4042-ae67-756c79d00217-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "d16f8bb5-9ca5-4042-ae67-756c79d00217" (UID: "d16f8bb5-9ca5-4042-ae67-756c79d00217"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:36:37 crc kubenswrapper[4719]: I1009 15:36:37.172170 4719 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d16f8bb5-9ca5-4042-ae67-756c79d00217-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:37 crc kubenswrapper[4719]: I1009 15:36:37.172206 4719 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d16f8bb5-9ca5-4042-ae67-756c79d00217-logs\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:37 crc kubenswrapper[4719]: I1009 15:36:37.172218 4719 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d16f8bb5-9ca5-4042-ae67-756c79d00217-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:37 crc kubenswrapper[4719]: I1009 15:36:37.172231 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvf64\" (UniqueName: \"kubernetes.io/projected/d16f8bb5-9ca5-4042-ae67-756c79d00217-kube-api-access-cvf64\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:37 crc kubenswrapper[4719]: I1009 15:36:37.172246 4719 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d16f8bb5-9ca5-4042-ae67-756c79d00217-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:37 crc kubenswrapper[4719]: I1009 15:36:37.172257 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d16f8bb5-9ca5-4042-ae67-756c79d00217-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:37 crc kubenswrapper[4719]: I1009 15:36:37.197837 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d16f8bb5-9ca5-4042-ae67-756c79d00217-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d16f8bb5-9ca5-4042-ae67-756c79d00217" (UID: "d16f8bb5-9ca5-4042-ae67-756c79d00217"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:36:37 crc kubenswrapper[4719]: I1009 15:36:37.279737 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d16f8bb5-9ca5-4042-ae67-756c79d00217-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:37 crc kubenswrapper[4719]: I1009 15:36:37.666052 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7457564986-k28cv" Oct 09 15:36:37 crc kubenswrapper[4719]: I1009 15:36:37.666547 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7457564986-k28cv" event={"ID":"d16f8bb5-9ca5-4042-ae67-756c79d00217","Type":"ContainerDied","Data":"806e4e8aa65d1fa1409cea50489da243031e292e11d4828f43fa4665990bd000"} Oct 09 15:36:37 crc kubenswrapper[4719]: I1009 15:36:37.666576 4719 scope.go:117] "RemoveContainer" containerID="10745d1f988a6d6acb4c497cc78d9b04e38ddb779ac8f9940910e552683c668d" Oct 09 15:36:37 crc kubenswrapper[4719]: I1009 15:36:37.691970 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7457564986-k28cv"] Oct 09 15:36:37 crc kubenswrapper[4719]: I1009 15:36:37.700475 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7457564986-k28cv"] Oct 09 15:36:37 crc kubenswrapper[4719]: I1009 15:36:37.882070 4719 scope.go:117] "RemoveContainer" containerID="b066db9dcb69b83cf18678779a349eae16c407271c43f6f1d333a2cd48b9227c" Oct 09 15:36:38 crc kubenswrapper[4719]: I1009 15:36:38.756813 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7b877f86f8-fptrv" Oct 09 15:36:38 crc kubenswrapper[4719]: I1009 15:36:38.807601 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d78b7c8c7-8l5xz" Oct 09 15:36:38 crc kubenswrapper[4719]: I1009 15:36:38.894340 4719 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/dnsmasq-dns-5d4b4465c7-xmnmh"] Oct 09 15:36:38 crc kubenswrapper[4719]: I1009 15:36:38.894586 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d4b4465c7-xmnmh" podUID="6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0" containerName="dnsmasq-dns" containerID="cri-o://d8b8706ff6e4dce5089ae88b4d5c4c18a1e7a5a458e508a9d46bebba8b725611" gracePeriod=10 Oct 09 15:36:39 crc kubenswrapper[4719]: I1009 15:36:39.188001 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d16f8bb5-9ca5-4042-ae67-756c79d00217" path="/var/lib/kubelet/pods/d16f8bb5-9ca5-4042-ae67-756c79d00217/volumes" Oct 09 15:36:39 crc kubenswrapper[4719]: I1009 15:36:39.651566 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d4b4465c7-xmnmh" Oct 09 15:36:39 crc kubenswrapper[4719]: I1009 15:36:39.706545 4719 generic.go:334] "Generic (PLEG): container finished" podID="6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0" containerID="d8b8706ff6e4dce5089ae88b4d5c4c18a1e7a5a458e508a9d46bebba8b725611" exitCode=0 Oct 09 15:36:39 crc kubenswrapper[4719]: I1009 15:36:39.706591 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d4b4465c7-xmnmh" event={"ID":"6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0","Type":"ContainerDied","Data":"d8b8706ff6e4dce5089ae88b4d5c4c18a1e7a5a458e508a9d46bebba8b725611"} Oct 09 15:36:39 crc kubenswrapper[4719]: I1009 15:36:39.706619 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d4b4465c7-xmnmh" event={"ID":"6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0","Type":"ContainerDied","Data":"e183ff5ab72e1be831938d030ccd662c2692774bec76608f2bc7a1f81801ffab"} Oct 09 15:36:39 crc kubenswrapper[4719]: I1009 15:36:39.706636 4719 scope.go:117] "RemoveContainer" containerID="d8b8706ff6e4dce5089ae88b4d5c4c18a1e7a5a458e508a9d46bebba8b725611" Oct 09 15:36:39 crc kubenswrapper[4719]: I1009 15:36:39.706742 4719 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d4b4465c7-xmnmh" Oct 09 15:36:39 crc kubenswrapper[4719]: I1009 15:36:39.748269 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0-dns-swift-storage-0\") pod \"6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0\" (UID: \"6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0\") " Oct 09 15:36:39 crc kubenswrapper[4719]: I1009 15:36:39.748332 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0-config\") pod \"6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0\" (UID: \"6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0\") " Oct 09 15:36:39 crc kubenswrapper[4719]: I1009 15:36:39.748384 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0-ovsdbserver-sb\") pod \"6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0\" (UID: \"6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0\") " Oct 09 15:36:39 crc kubenswrapper[4719]: I1009 15:36:39.748551 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0-ovsdbserver-nb\") pod \"6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0\" (UID: \"6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0\") " Oct 09 15:36:39 crc kubenswrapper[4719]: I1009 15:36:39.748606 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0-dns-svc\") pod \"6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0\" (UID: \"6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0\") " Oct 09 15:36:39 crc kubenswrapper[4719]: I1009 15:36:39.748690 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-vm9wb\" (UniqueName: \"kubernetes.io/projected/6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0-kube-api-access-vm9wb\") pod \"6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0\" (UID: \"6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0\") " Oct 09 15:36:39 crc kubenswrapper[4719]: I1009 15:36:39.766833 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0-kube-api-access-vm9wb" (OuterVolumeSpecName: "kube-api-access-vm9wb") pod "6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0" (UID: "6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0"). InnerVolumeSpecName "kube-api-access-vm9wb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:36:39 crc kubenswrapper[4719]: I1009 15:36:39.794025 4719 scope.go:117] "RemoveContainer" containerID="83a913a45cd58f2dd977bd09b3e7ae5d581cb6a2864604e7f86296b9f951b410" Oct 09 15:36:39 crc kubenswrapper[4719]: I1009 15:36:39.835932 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0" (UID: "6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:36:39 crc kubenswrapper[4719]: I1009 15:36:39.850686 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vm9wb\" (UniqueName: \"kubernetes.io/projected/6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0-kube-api-access-vm9wb\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:39 crc kubenswrapper[4719]: I1009 15:36:39.850714 4719 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:39 crc kubenswrapper[4719]: I1009 15:36:39.853131 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0" (UID: "6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:36:39 crc kubenswrapper[4719]: I1009 15:36:39.867071 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0" (UID: "6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:36:39 crc kubenswrapper[4719]: I1009 15:36:39.881366 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0" (UID: "6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:36:39 crc kubenswrapper[4719]: I1009 15:36:39.898413 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0-config" (OuterVolumeSpecName: "config") pod "6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0" (UID: "6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:36:39 crc kubenswrapper[4719]: I1009 15:36:39.932513 4719 scope.go:117] "RemoveContainer" containerID="d8b8706ff6e4dce5089ae88b4d5c4c18a1e7a5a458e508a9d46bebba8b725611" Oct 09 15:36:39 crc kubenswrapper[4719]: E1009 15:36:39.932939 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8b8706ff6e4dce5089ae88b4d5c4c18a1e7a5a458e508a9d46bebba8b725611\": container with ID starting with d8b8706ff6e4dce5089ae88b4d5c4c18a1e7a5a458e508a9d46bebba8b725611 not found: ID does not exist" containerID="d8b8706ff6e4dce5089ae88b4d5c4c18a1e7a5a458e508a9d46bebba8b725611" Oct 09 15:36:39 crc kubenswrapper[4719]: I1009 15:36:39.932972 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8b8706ff6e4dce5089ae88b4d5c4c18a1e7a5a458e508a9d46bebba8b725611"} err="failed to get container status \"d8b8706ff6e4dce5089ae88b4d5c4c18a1e7a5a458e508a9d46bebba8b725611\": rpc error: code = NotFound desc = could not find container \"d8b8706ff6e4dce5089ae88b4d5c4c18a1e7a5a458e508a9d46bebba8b725611\": container with ID starting with d8b8706ff6e4dce5089ae88b4d5c4c18a1e7a5a458e508a9d46bebba8b725611 not found: ID does not exist" Oct 09 15:36:39 crc kubenswrapper[4719]: I1009 15:36:39.932992 4719 scope.go:117] "RemoveContainer" containerID="83a913a45cd58f2dd977bd09b3e7ae5d581cb6a2864604e7f86296b9f951b410" Oct 09 15:36:39 crc kubenswrapper[4719]: E1009 15:36:39.933190 4719 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"83a913a45cd58f2dd977bd09b3e7ae5d581cb6a2864604e7f86296b9f951b410\": container with ID starting with 83a913a45cd58f2dd977bd09b3e7ae5d581cb6a2864604e7f86296b9f951b410 not found: ID does not exist" containerID="83a913a45cd58f2dd977bd09b3e7ae5d581cb6a2864604e7f86296b9f951b410" Oct 09 15:36:39 crc kubenswrapper[4719]: I1009 15:36:39.933215 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83a913a45cd58f2dd977bd09b3e7ae5d581cb6a2864604e7f86296b9f951b410"} err="failed to get container status \"83a913a45cd58f2dd977bd09b3e7ae5d581cb6a2864604e7f86296b9f951b410\": rpc error: code = NotFound desc = could not find container \"83a913a45cd58f2dd977bd09b3e7ae5d581cb6a2864604e7f86296b9f951b410\": container with ID starting with 83a913a45cd58f2dd977bd09b3e7ae5d581cb6a2864604e7f86296b9f951b410 not found: ID does not exist" Oct 09 15:36:39 crc kubenswrapper[4719]: I1009 15:36:39.953269 4719 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:39 crc kubenswrapper[4719]: I1009 15:36:39.953315 4719 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:39 crc kubenswrapper[4719]: I1009 15:36:39.953329 4719 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:39 crc kubenswrapper[4719]: I1009 15:36:39.953342 4719 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0-dns-svc\") on node \"crc\" DevicePath 
\"\"" Oct 09 15:36:40 crc kubenswrapper[4719]: I1009 15:36:40.085584 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d4b4465c7-xmnmh"] Oct 09 15:36:40 crc kubenswrapper[4719]: I1009 15:36:40.093946 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d4b4465c7-xmnmh"] Oct 09 15:36:40 crc kubenswrapper[4719]: I1009 15:36:40.161546 4719 scope.go:117] "RemoveContainer" containerID="97c064a8c41ba55bab5a23f641b5b2c165c7994c2526df1d2692173bb8e72c4b" Oct 09 15:36:40 crc kubenswrapper[4719]: I1009 15:36:40.740366 4719 generic.go:334] "Generic (PLEG): container finished" podID="e899b0de-03a2-44a5-a165-25c988e8489d" containerID="aa349a351ba258207f2c6d303e05f5727dac541b5a9c2d7f825f913a1963c5ea" exitCode=0 Oct 09 15:36:40 crc kubenswrapper[4719]: I1009 15:36:40.740648 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-ztgbm" event={"ID":"e899b0de-03a2-44a5-a165-25c988e8489d","Type":"ContainerDied","Data":"aa349a351ba258207f2c6d303e05f5727dac541b5a9c2d7f825f913a1963c5ea"} Oct 09 15:36:40 crc kubenswrapper[4719]: I1009 15:36:40.754721 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"75999b62-ce1b-4a9b-8507-c8af12441083","Type":"ContainerStarted","Data":"1b0d8ce3fa12f7379def00ca2131e79585f6bdff6f8c5cc63816c53109df6822"} Oct 09 15:36:41 crc kubenswrapper[4719]: I1009 15:36:41.170618 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0" path="/var/lib/kubelet/pods/6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0/volumes" Oct 09 15:36:42 crc kubenswrapper[4719]: I1009 15:36:42.255942 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 15:36:42 crc kubenswrapper[4719]: I1009 15:36:42.259572 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9bfb5b45-7c9f-46aa-99ad-4011a81bb196" 
containerName="sg-core" containerID="cri-o://18c33dd5b6d35bf57afe74f800ff13811b3b8acb9407caced78a862213cd541b" gracePeriod=30 Oct 09 15:36:42 crc kubenswrapper[4719]: I1009 15:36:42.259634 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9bfb5b45-7c9f-46aa-99ad-4011a81bb196" containerName="proxy-httpd" containerID="cri-o://6badb6bbb59aa7562a1599a36734b995ec0a0f476f59d31df6c1876731ae2a89" gracePeriod=30 Oct 09 15:36:42 crc kubenswrapper[4719]: I1009 15:36:42.259719 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9bfb5b45-7c9f-46aa-99ad-4011a81bb196" containerName="ceilometer-central-agent" containerID="cri-o://73825614b680bbd5ce91c83fcca82f1194c982a9e5220e5af0c1a7c518e59c9d" gracePeriod=30 Oct 09 15:36:42 crc kubenswrapper[4719]: I1009 15:36:42.259814 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9bfb5b45-7c9f-46aa-99ad-4011a81bb196" containerName="ceilometer-notification-agent" containerID="cri-o://e69caa09e16808f6ca6d9bad1a134dbd5ce2e6e06249d20d1636d4f99f3dc051" gracePeriod=30 Oct 09 15:36:42 crc kubenswrapper[4719]: I1009 15:36:42.269884 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-ztgbm" Oct 09 15:36:42 crc kubenswrapper[4719]: I1009 15:36:42.405335 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e899b0de-03a2-44a5-a165-25c988e8489d-combined-ca-bundle\") pod \"e899b0de-03a2-44a5-a165-25c988e8489d\" (UID: \"e899b0de-03a2-44a5-a165-25c988e8489d\") " Oct 09 15:36:42 crc kubenswrapper[4719]: I1009 15:36:42.405683 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e899b0de-03a2-44a5-a165-25c988e8489d-db-sync-config-data\") pod \"e899b0de-03a2-44a5-a165-25c988e8489d\" (UID: \"e899b0de-03a2-44a5-a165-25c988e8489d\") " Oct 09 15:36:42 crc kubenswrapper[4719]: I1009 15:36:42.405747 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e899b0de-03a2-44a5-a165-25c988e8489d-etc-machine-id\") pod \"e899b0de-03a2-44a5-a165-25c988e8489d\" (UID: \"e899b0de-03a2-44a5-a165-25c988e8489d\") " Oct 09 15:36:42 crc kubenswrapper[4719]: I1009 15:36:42.405823 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e899b0de-03a2-44a5-a165-25c988e8489d-config-data\") pod \"e899b0de-03a2-44a5-a165-25c988e8489d\" (UID: \"e899b0de-03a2-44a5-a165-25c988e8489d\") " Oct 09 15:36:42 crc kubenswrapper[4719]: I1009 15:36:42.405893 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnxsf\" (UniqueName: \"kubernetes.io/projected/e899b0de-03a2-44a5-a165-25c988e8489d-kube-api-access-qnxsf\") pod \"e899b0de-03a2-44a5-a165-25c988e8489d\" (UID: \"e899b0de-03a2-44a5-a165-25c988e8489d\") " Oct 09 15:36:42 crc kubenswrapper[4719]: I1009 15:36:42.405974 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/e899b0de-03a2-44a5-a165-25c988e8489d-scripts\") pod \"e899b0de-03a2-44a5-a165-25c988e8489d\" (UID: \"e899b0de-03a2-44a5-a165-25c988e8489d\") " Oct 09 15:36:42 crc kubenswrapper[4719]: I1009 15:36:42.406088 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e899b0de-03a2-44a5-a165-25c988e8489d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e899b0de-03a2-44a5-a165-25c988e8489d" (UID: "e899b0de-03a2-44a5-a165-25c988e8489d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 15:36:42 crc kubenswrapper[4719]: I1009 15:36:42.406562 4719 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e899b0de-03a2-44a5-a165-25c988e8489d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:42 crc kubenswrapper[4719]: I1009 15:36:42.419683 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e899b0de-03a2-44a5-a165-25c988e8489d-scripts" (OuterVolumeSpecName: "scripts") pod "e899b0de-03a2-44a5-a165-25c988e8489d" (UID: "e899b0de-03a2-44a5-a165-25c988e8489d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:36:42 crc kubenswrapper[4719]: I1009 15:36:42.423628 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e899b0de-03a2-44a5-a165-25c988e8489d-kube-api-access-qnxsf" (OuterVolumeSpecName: "kube-api-access-qnxsf") pod "e899b0de-03a2-44a5-a165-25c988e8489d" (UID: "e899b0de-03a2-44a5-a165-25c988e8489d"). InnerVolumeSpecName "kube-api-access-qnxsf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:36:42 crc kubenswrapper[4719]: I1009 15:36:42.424489 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e899b0de-03a2-44a5-a165-25c988e8489d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e899b0de-03a2-44a5-a165-25c988e8489d" (UID: "e899b0de-03a2-44a5-a165-25c988e8489d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:36:42 crc kubenswrapper[4719]: I1009 15:36:42.439552 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e899b0de-03a2-44a5-a165-25c988e8489d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e899b0de-03a2-44a5-a165-25c988e8489d" (UID: "e899b0de-03a2-44a5-a165-25c988e8489d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:36:42 crc kubenswrapper[4719]: I1009 15:36:42.471465 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e899b0de-03a2-44a5-a165-25c988e8489d-config-data" (OuterVolumeSpecName: "config-data") pod "e899b0de-03a2-44a5-a165-25c988e8489d" (UID: "e899b0de-03a2-44a5-a165-25c988e8489d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:36:42 crc kubenswrapper[4719]: I1009 15:36:42.508120 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e899b0de-03a2-44a5-a165-25c988e8489d-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:42 crc kubenswrapper[4719]: I1009 15:36:42.508336 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnxsf\" (UniqueName: \"kubernetes.io/projected/e899b0de-03a2-44a5-a165-25c988e8489d-kube-api-access-qnxsf\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:42 crc kubenswrapper[4719]: I1009 15:36:42.508441 4719 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e899b0de-03a2-44a5-a165-25c988e8489d-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:42 crc kubenswrapper[4719]: I1009 15:36:42.508512 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e899b0de-03a2-44a5-a165-25c988e8489d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:42 crc kubenswrapper[4719]: I1009 15:36:42.508567 4719 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e899b0de-03a2-44a5-a165-25c988e8489d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:42 crc kubenswrapper[4719]: I1009 15:36:42.715881 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7b4958cb64-65wft" Oct 09 15:36:42 crc kubenswrapper[4719]: I1009 15:36:42.781425 4719 generic.go:334] "Generic (PLEG): container finished" podID="9bfb5b45-7c9f-46aa-99ad-4011a81bb196" containerID="6badb6bbb59aa7562a1599a36734b995ec0a0f476f59d31df6c1876731ae2a89" exitCode=0 Oct 09 15:36:42 crc kubenswrapper[4719]: I1009 15:36:42.781464 4719 generic.go:334] "Generic (PLEG): container finished" podID="9bfb5b45-7c9f-46aa-99ad-4011a81bb196" 
containerID="18c33dd5b6d35bf57afe74f800ff13811b3b8acb9407caced78a862213cd541b" exitCode=2 Oct 09 15:36:42 crc kubenswrapper[4719]: I1009 15:36:42.781473 4719 generic.go:334] "Generic (PLEG): container finished" podID="9bfb5b45-7c9f-46aa-99ad-4011a81bb196" containerID="e69caa09e16808f6ca6d9bad1a134dbd5ce2e6e06249d20d1636d4f99f3dc051" exitCode=0 Oct 09 15:36:42 crc kubenswrapper[4719]: I1009 15:36:42.781528 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9bfb5b45-7c9f-46aa-99ad-4011a81bb196","Type":"ContainerDied","Data":"6badb6bbb59aa7562a1599a36734b995ec0a0f476f59d31df6c1876731ae2a89"} Oct 09 15:36:42 crc kubenswrapper[4719]: I1009 15:36:42.781610 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9bfb5b45-7c9f-46aa-99ad-4011a81bb196","Type":"ContainerDied","Data":"18c33dd5b6d35bf57afe74f800ff13811b3b8acb9407caced78a862213cd541b"} Oct 09 15:36:42 crc kubenswrapper[4719]: I1009 15:36:42.781622 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9bfb5b45-7c9f-46aa-99ad-4011a81bb196","Type":"ContainerDied","Data":"e69caa09e16808f6ca6d9bad1a134dbd5ce2e6e06249d20d1636d4f99f3dc051"} Oct 09 15:36:42 crc kubenswrapper[4719]: I1009 15:36:42.784095 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-ztgbm" event={"ID":"e899b0de-03a2-44a5-a165-25c988e8489d","Type":"ContainerDied","Data":"8e9fd52b958b37cc0d5f9bf89a297dd1eaa3e2a34c04dea47a43b7ff8ed5b5d3"} Oct 09 15:36:42 crc kubenswrapper[4719]: I1009 15:36:42.784222 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e9fd52b958b37cc0d5f9bf89a297dd1eaa3e2a34c04dea47a43b7ff8ed5b5d3" Oct 09 15:36:42 crc kubenswrapper[4719]: I1009 15:36:42.784140 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-ztgbm" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.088340 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 15:36:43 crc kubenswrapper[4719]: E1009 15:36:43.088744 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e899b0de-03a2-44a5-a165-25c988e8489d" containerName="cinder-db-sync" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.088759 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="e899b0de-03a2-44a5-a165-25c988e8489d" containerName="cinder-db-sync" Oct 09 15:36:43 crc kubenswrapper[4719]: E1009 15:36:43.088778 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0" containerName="dnsmasq-dns" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.088785 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0" containerName="dnsmasq-dns" Oct 09 15:36:43 crc kubenswrapper[4719]: E1009 15:36:43.088801 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d16f8bb5-9ca5-4042-ae67-756c79d00217" containerName="horizon" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.088806 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="d16f8bb5-9ca5-4042-ae67-756c79d00217" containerName="horizon" Oct 09 15:36:43 crc kubenswrapper[4719]: E1009 15:36:43.088823 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d16f8bb5-9ca5-4042-ae67-756c79d00217" containerName="horizon-log" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.088828 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="d16f8bb5-9ca5-4042-ae67-756c79d00217" containerName="horizon-log" Oct 09 15:36:43 crc kubenswrapper[4719]: E1009 15:36:43.088835 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0" containerName="init" Oct 09 15:36:43 crc 
kubenswrapper[4719]: I1009 15:36:43.088841 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0" containerName="init" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.089027 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="d16f8bb5-9ca5-4042-ae67-756c79d00217" containerName="horizon" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.089043 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e2921ad-95e7-4789-94e8-1fb6dbe6aaa0" containerName="dnsmasq-dns" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.089055 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="d16f8bb5-9ca5-4042-ae67-756c79d00217" containerName="horizon-log" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.089063 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="e899b0de-03a2-44a5-a165-25c988e8489d" containerName="cinder-db-sync" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.096571 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.098157 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-kjw8r" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.098449 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.099196 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.110169 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.128743 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.198954 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77b756999f-5ptkd"] Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.200699 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77b756999f-5ptkd" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.221070 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-556bc79449-9bdkb"] Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.223305 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-556bc79449-9bdkb" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.223633 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4255094-4bc0-4cc5-bf14-c663dd9e17e7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a4255094-4bc0-4cc5-bf14-c663dd9e17e7\") " pod="openstack/cinder-scheduler-0" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.223686 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4255094-4bc0-4cc5-bf14-c663dd9e17e7-config-data\") pod \"cinder-scheduler-0\" (UID: \"a4255094-4bc0-4cc5-bf14-c663dd9e17e7\") " pod="openstack/cinder-scheduler-0" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.223755 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4255094-4bc0-4cc5-bf14-c663dd9e17e7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a4255094-4bc0-4cc5-bf14-c663dd9e17e7\") " pod="openstack/cinder-scheduler-0" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.223815 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a4255094-4bc0-4cc5-bf14-c663dd9e17e7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a4255094-4bc0-4cc5-bf14-c663dd9e17e7\") " pod="openstack/cinder-scheduler-0" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.223861 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4255094-4bc0-4cc5-bf14-c663dd9e17e7-scripts\") pod \"cinder-scheduler-0\" (UID: \"a4255094-4bc0-4cc5-bf14-c663dd9e17e7\") " pod="openstack/cinder-scheduler-0" Oct 
09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.223909 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdcjb\" (UniqueName: \"kubernetes.io/projected/a4255094-4bc0-4cc5-bf14-c663dd9e17e7-kube-api-access-wdcjb\") pod \"cinder-scheduler-0\" (UID: \"a4255094-4bc0-4cc5-bf14-c663dd9e17e7\") " pod="openstack/cinder-scheduler-0" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.232540 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.233432 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.233690 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.257528 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-556bc79449-9bdkb"] Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.280730 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77b756999f-5ptkd"] Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.325722 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7540714-0deb-4ba6-8709-846457e19966-ovsdbserver-sb\") pod \"dnsmasq-dns-77b756999f-5ptkd\" (UID: \"f7540714-0deb-4ba6-8709-846457e19966\") " pod="openstack/dnsmasq-dns-77b756999f-5ptkd" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.325770 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6cbef595-0a78-4655-85ca-b329f51067af-log-httpd\") pod \"swift-proxy-556bc79449-9bdkb\" (UID: \"6cbef595-0a78-4655-85ca-b329f51067af\") " 
pod="openstack/swift-proxy-556bc79449-9bdkb" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.325855 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4255094-4bc0-4cc5-bf14-c663dd9e17e7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a4255094-4bc0-4cc5-bf14-c663dd9e17e7\") " pod="openstack/cinder-scheduler-0" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.325888 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6cbef595-0a78-4655-85ca-b329f51067af-run-httpd\") pod \"swift-proxy-556bc79449-9bdkb\" (UID: \"6cbef595-0a78-4655-85ca-b329f51067af\") " pod="openstack/swift-proxy-556bc79449-9bdkb" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.325930 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cbef595-0a78-4655-85ca-b329f51067af-config-data\") pod \"swift-proxy-556bc79449-9bdkb\" (UID: \"6cbef595-0a78-4655-85ca-b329f51067af\") " pod="openstack/swift-proxy-556bc79449-9bdkb" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.325968 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a4255094-4bc0-4cc5-bf14-c663dd9e17e7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a4255094-4bc0-4cc5-bf14-c663dd9e17e7\") " pod="openstack/cinder-scheduler-0" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.325986 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6cbef595-0a78-4655-85ca-b329f51067af-etc-swift\") pod \"swift-proxy-556bc79449-9bdkb\" (UID: \"6cbef595-0a78-4655-85ca-b329f51067af\") " pod="openstack/swift-proxy-556bc79449-9bdkb" Oct 09 15:36:43 
crc kubenswrapper[4719]: I1009 15:36:43.326045 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cbef595-0a78-4655-85ca-b329f51067af-public-tls-certs\") pod \"swift-proxy-556bc79449-9bdkb\" (UID: \"6cbef595-0a78-4655-85ca-b329f51067af\") " pod="openstack/swift-proxy-556bc79449-9bdkb" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.326080 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cbef595-0a78-4655-85ca-b329f51067af-combined-ca-bundle\") pod \"swift-proxy-556bc79449-9bdkb\" (UID: \"6cbef595-0a78-4655-85ca-b329f51067af\") " pod="openstack/swift-proxy-556bc79449-9bdkb" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.326120 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4255094-4bc0-4cc5-bf14-c663dd9e17e7-scripts\") pod \"cinder-scheduler-0\" (UID: \"a4255094-4bc0-4cc5-bf14-c663dd9e17e7\") " pod="openstack/cinder-scheduler-0" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.326157 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzms5\" (UniqueName: \"kubernetes.io/projected/6cbef595-0a78-4655-85ca-b329f51067af-kube-api-access-qzms5\") pod \"swift-proxy-556bc79449-9bdkb\" (UID: \"6cbef595-0a78-4655-85ca-b329f51067af\") " pod="openstack/swift-proxy-556bc79449-9bdkb" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.326178 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cbef595-0a78-4655-85ca-b329f51067af-internal-tls-certs\") pod \"swift-proxy-556bc79449-9bdkb\" (UID: \"6cbef595-0a78-4655-85ca-b329f51067af\") " pod="openstack/swift-proxy-556bc79449-9bdkb" Oct 09 
15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.326207 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdcjb\" (UniqueName: \"kubernetes.io/projected/a4255094-4bc0-4cc5-bf14-c663dd9e17e7-kube-api-access-wdcjb\") pod \"cinder-scheduler-0\" (UID: \"a4255094-4bc0-4cc5-bf14-c663dd9e17e7\") " pod="openstack/cinder-scheduler-0" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.326232 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7540714-0deb-4ba6-8709-846457e19966-dns-svc\") pod \"dnsmasq-dns-77b756999f-5ptkd\" (UID: \"f7540714-0deb-4ba6-8709-846457e19966\") " pod="openstack/dnsmasq-dns-77b756999f-5ptkd" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.326251 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7540714-0deb-4ba6-8709-846457e19966-ovsdbserver-nb\") pod \"dnsmasq-dns-77b756999f-5ptkd\" (UID: \"f7540714-0deb-4ba6-8709-846457e19966\") " pod="openstack/dnsmasq-dns-77b756999f-5ptkd" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.326301 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df5zw\" (UniqueName: \"kubernetes.io/projected/f7540714-0deb-4ba6-8709-846457e19966-kube-api-access-df5zw\") pod \"dnsmasq-dns-77b756999f-5ptkd\" (UID: \"f7540714-0deb-4ba6-8709-846457e19966\") " pod="openstack/dnsmasq-dns-77b756999f-5ptkd" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.326322 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4255094-4bc0-4cc5-bf14-c663dd9e17e7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a4255094-4bc0-4cc5-bf14-c663dd9e17e7\") " pod="openstack/cinder-scheduler-0" Oct 09 15:36:43 crc 
kubenswrapper[4719]: I1009 15:36:43.326339 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7540714-0deb-4ba6-8709-846457e19966-config\") pod \"dnsmasq-dns-77b756999f-5ptkd\" (UID: \"f7540714-0deb-4ba6-8709-846457e19966\") " pod="openstack/dnsmasq-dns-77b756999f-5ptkd" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.326426 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4255094-4bc0-4cc5-bf14-c663dd9e17e7-config-data\") pod \"cinder-scheduler-0\" (UID: \"a4255094-4bc0-4cc5-bf14-c663dd9e17e7\") " pod="openstack/cinder-scheduler-0" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.326445 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f7540714-0deb-4ba6-8709-846457e19966-dns-swift-storage-0\") pod \"dnsmasq-dns-77b756999f-5ptkd\" (UID: \"f7540714-0deb-4ba6-8709-846457e19966\") " pod="openstack/dnsmasq-dns-77b756999f-5ptkd" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.328284 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a4255094-4bc0-4cc5-bf14-c663dd9e17e7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a4255094-4bc0-4cc5-bf14-c663dd9e17e7\") " pod="openstack/cinder-scheduler-0" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.333142 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4255094-4bc0-4cc5-bf14-c663dd9e17e7-config-data\") pod \"cinder-scheduler-0\" (UID: \"a4255094-4bc0-4cc5-bf14-c663dd9e17e7\") " pod="openstack/cinder-scheduler-0" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.336218 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4255094-4bc0-4cc5-bf14-c663dd9e17e7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a4255094-4bc0-4cc5-bf14-c663dd9e17e7\") " pod="openstack/cinder-scheduler-0" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.336575 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4255094-4bc0-4cc5-bf14-c663dd9e17e7-scripts\") pod \"cinder-scheduler-0\" (UID: \"a4255094-4bc0-4cc5-bf14-c663dd9e17e7\") " pod="openstack/cinder-scheduler-0" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.339007 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4255094-4bc0-4cc5-bf14-c663dd9e17e7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a4255094-4bc0-4cc5-bf14-c663dd9e17e7\") " pod="openstack/cinder-scheduler-0" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.340950 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.380290 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdcjb\" (UniqueName: \"kubernetes.io/projected/a4255094-4bc0-4cc5-bf14-c663dd9e17e7-kube-api-access-wdcjb\") pod \"cinder-scheduler-0\" (UID: \"a4255094-4bc0-4cc5-bf14-c663dd9e17e7\") " pod="openstack/cinder-scheduler-0" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.382379 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.386779 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.431295 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cbef595-0a78-4655-85ca-b329f51067af-config-data\") pod \"swift-proxy-556bc79449-9bdkb\" (UID: \"6cbef595-0a78-4655-85ca-b329f51067af\") " pod="openstack/swift-proxy-556bc79449-9bdkb" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.431395 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6cbef595-0a78-4655-85ca-b329f51067af-etc-swift\") pod \"swift-proxy-556bc79449-9bdkb\" (UID: \"6cbef595-0a78-4655-85ca-b329f51067af\") " pod="openstack/swift-proxy-556bc79449-9bdkb" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.431428 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cbef595-0a78-4655-85ca-b329f51067af-public-tls-certs\") pod \"swift-proxy-556bc79449-9bdkb\" (UID: \"6cbef595-0a78-4655-85ca-b329f51067af\") " pod="openstack/swift-proxy-556bc79449-9bdkb" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.431464 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cbef595-0a78-4655-85ca-b329f51067af-combined-ca-bundle\") pod \"swift-proxy-556bc79449-9bdkb\" (UID: \"6cbef595-0a78-4655-85ca-b329f51067af\") " pod="openstack/swift-proxy-556bc79449-9bdkb" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.431499 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzms5\" (UniqueName: 
\"kubernetes.io/projected/6cbef595-0a78-4655-85ca-b329f51067af-kube-api-access-qzms5\") pod \"swift-proxy-556bc79449-9bdkb\" (UID: \"6cbef595-0a78-4655-85ca-b329f51067af\") " pod="openstack/swift-proxy-556bc79449-9bdkb" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.431522 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cbef595-0a78-4655-85ca-b329f51067af-internal-tls-certs\") pod \"swift-proxy-556bc79449-9bdkb\" (UID: \"6cbef595-0a78-4655-85ca-b329f51067af\") " pod="openstack/swift-proxy-556bc79449-9bdkb" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.431569 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7540714-0deb-4ba6-8709-846457e19966-dns-svc\") pod \"dnsmasq-dns-77b756999f-5ptkd\" (UID: \"f7540714-0deb-4ba6-8709-846457e19966\") " pod="openstack/dnsmasq-dns-77b756999f-5ptkd" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.431595 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7540714-0deb-4ba6-8709-846457e19966-ovsdbserver-nb\") pod \"dnsmasq-dns-77b756999f-5ptkd\" (UID: \"f7540714-0deb-4ba6-8709-846457e19966\") " pod="openstack/dnsmasq-dns-77b756999f-5ptkd" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.431633 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df5zw\" (UniqueName: \"kubernetes.io/projected/f7540714-0deb-4ba6-8709-846457e19966-kube-api-access-df5zw\") pod \"dnsmasq-dns-77b756999f-5ptkd\" (UID: \"f7540714-0deb-4ba6-8709-846457e19966\") " pod="openstack/dnsmasq-dns-77b756999f-5ptkd" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.431663 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f7540714-0deb-4ba6-8709-846457e19966-config\") pod \"dnsmasq-dns-77b756999f-5ptkd\" (UID: \"f7540714-0deb-4ba6-8709-846457e19966\") " pod="openstack/dnsmasq-dns-77b756999f-5ptkd" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.431690 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f7540714-0deb-4ba6-8709-846457e19966-dns-swift-storage-0\") pod \"dnsmasq-dns-77b756999f-5ptkd\" (UID: \"f7540714-0deb-4ba6-8709-846457e19966\") " pod="openstack/dnsmasq-dns-77b756999f-5ptkd" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.431730 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7540714-0deb-4ba6-8709-846457e19966-ovsdbserver-sb\") pod \"dnsmasq-dns-77b756999f-5ptkd\" (UID: \"f7540714-0deb-4ba6-8709-846457e19966\") " pod="openstack/dnsmasq-dns-77b756999f-5ptkd" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.431762 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6cbef595-0a78-4655-85ca-b329f51067af-log-httpd\") pod \"swift-proxy-556bc79449-9bdkb\" (UID: \"6cbef595-0a78-4655-85ca-b329f51067af\") " pod="openstack/swift-proxy-556bc79449-9bdkb" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.431815 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6cbef595-0a78-4655-85ca-b329f51067af-run-httpd\") pod \"swift-proxy-556bc79449-9bdkb\" (UID: \"6cbef595-0a78-4655-85ca-b329f51067af\") " pod="openstack/swift-proxy-556bc79449-9bdkb" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.434889 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7540714-0deb-4ba6-8709-846457e19966-ovsdbserver-nb\") 
pod \"dnsmasq-dns-77b756999f-5ptkd\" (UID: \"f7540714-0deb-4ba6-8709-846457e19966\") " pod="openstack/dnsmasq-dns-77b756999f-5ptkd" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.435256 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f7540714-0deb-4ba6-8709-846457e19966-dns-swift-storage-0\") pod \"dnsmasq-dns-77b756999f-5ptkd\" (UID: \"f7540714-0deb-4ba6-8709-846457e19966\") " pod="openstack/dnsmasq-dns-77b756999f-5ptkd" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.436310 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7540714-0deb-4ba6-8709-846457e19966-config\") pod \"dnsmasq-dns-77b756999f-5ptkd\" (UID: \"f7540714-0deb-4ba6-8709-846457e19966\") " pod="openstack/dnsmasq-dns-77b756999f-5ptkd" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.436720 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6cbef595-0a78-4655-85ca-b329f51067af-log-httpd\") pod \"swift-proxy-556bc79449-9bdkb\" (UID: \"6cbef595-0a78-4655-85ca-b329f51067af\") " pod="openstack/swift-proxy-556bc79449-9bdkb" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.442663 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7540714-0deb-4ba6-8709-846457e19966-ovsdbserver-sb\") pod \"dnsmasq-dns-77b756999f-5ptkd\" (UID: \"f7540714-0deb-4ba6-8709-846457e19966\") " pod="openstack/dnsmasq-dns-77b756999f-5ptkd" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.443006 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.449174 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/6cbef595-0a78-4655-85ca-b329f51067af-run-httpd\") pod \"swift-proxy-556bc79449-9bdkb\" (UID: \"6cbef595-0a78-4655-85ca-b329f51067af\") " pod="openstack/swift-proxy-556bc79449-9bdkb" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.449592 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cbef595-0a78-4655-85ca-b329f51067af-config-data\") pod \"swift-proxy-556bc79449-9bdkb\" (UID: \"6cbef595-0a78-4655-85ca-b329f51067af\") " pod="openstack/swift-proxy-556bc79449-9bdkb" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.449702 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cbef595-0a78-4655-85ca-b329f51067af-combined-ca-bundle\") pod \"swift-proxy-556bc79449-9bdkb\" (UID: \"6cbef595-0a78-4655-85ca-b329f51067af\") " pod="openstack/swift-proxy-556bc79449-9bdkb" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.449799 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7540714-0deb-4ba6-8709-846457e19966-dns-svc\") pod \"dnsmasq-dns-77b756999f-5ptkd\" (UID: \"f7540714-0deb-4ba6-8709-846457e19966\") " pod="openstack/dnsmasq-dns-77b756999f-5ptkd" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.450161 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.461243 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cbef595-0a78-4655-85ca-b329f51067af-public-tls-certs\") pod \"swift-proxy-556bc79449-9bdkb\" (UID: \"6cbef595-0a78-4655-85ca-b329f51067af\") " pod="openstack/swift-proxy-556bc79449-9bdkb" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.461463 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cbef595-0a78-4655-85ca-b329f51067af-internal-tls-certs\") pod \"swift-proxy-556bc79449-9bdkb\" (UID: \"6cbef595-0a78-4655-85ca-b329f51067af\") " pod="openstack/swift-proxy-556bc79449-9bdkb" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.465240 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6cbef595-0a78-4655-85ca-b329f51067af-etc-swift\") pod \"swift-proxy-556bc79449-9bdkb\" (UID: \"6cbef595-0a78-4655-85ca-b329f51067af\") " pod="openstack/swift-proxy-556bc79449-9bdkb" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.469113 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzms5\" (UniqueName: \"kubernetes.io/projected/6cbef595-0a78-4655-85ca-b329f51067af-kube-api-access-qzms5\") pod \"swift-proxy-556bc79449-9bdkb\" (UID: \"6cbef595-0a78-4655-85ca-b329f51067af\") " pod="openstack/swift-proxy-556bc79449-9bdkb" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.471973 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df5zw\" (UniqueName: \"kubernetes.io/projected/f7540714-0deb-4ba6-8709-846457e19966-kube-api-access-df5zw\") pod \"dnsmasq-dns-77b756999f-5ptkd\" (UID: \"f7540714-0deb-4ba6-8709-846457e19966\") " pod="openstack/dnsmasq-dns-77b756999f-5ptkd" Oct 09 
15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.534367 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/622c0120-b0e7-4bb7-a961-78b842f153eb-config-data-custom\") pod \"cinder-api-0\" (UID: \"622c0120-b0e7-4bb7-a961-78b842f153eb\") " pod="openstack/cinder-api-0" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.534550 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/622c0120-b0e7-4bb7-a961-78b842f153eb-etc-machine-id\") pod \"cinder-api-0\" (UID: \"622c0120-b0e7-4bb7-a961-78b842f153eb\") " pod="openstack/cinder-api-0" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.534627 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/622c0120-b0e7-4bb7-a961-78b842f153eb-config-data\") pod \"cinder-api-0\" (UID: \"622c0120-b0e7-4bb7-a961-78b842f153eb\") " pod="openstack/cinder-api-0" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.534973 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/622c0120-b0e7-4bb7-a961-78b842f153eb-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"622c0120-b0e7-4bb7-a961-78b842f153eb\") " pod="openstack/cinder-api-0" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.535033 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/622c0120-b0e7-4bb7-a961-78b842f153eb-scripts\") pod \"cinder-api-0\" (UID: \"622c0120-b0e7-4bb7-a961-78b842f153eb\") " pod="openstack/cinder-api-0" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.535113 4719 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/622c0120-b0e7-4bb7-a961-78b842f153eb-logs\") pod \"cinder-api-0\" (UID: \"622c0120-b0e7-4bb7-a961-78b842f153eb\") " pod="openstack/cinder-api-0" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.535169 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg4dh\" (UniqueName: \"kubernetes.io/projected/622c0120-b0e7-4bb7-a961-78b842f153eb-kube-api-access-sg4dh\") pod \"cinder-api-0\" (UID: \"622c0120-b0e7-4bb7-a961-78b842f153eb\") " pod="openstack/cinder-api-0" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.561630 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77b756999f-5ptkd" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.571670 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-556bc79449-9bdkb" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.584937 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7b4958cb64-65wft" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.637911 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/622c0120-b0e7-4bb7-a961-78b842f153eb-config-data-custom\") pod \"cinder-api-0\" (UID: \"622c0120-b0e7-4bb7-a961-78b842f153eb\") " pod="openstack/cinder-api-0" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.637997 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/622c0120-b0e7-4bb7-a961-78b842f153eb-etc-machine-id\") pod \"cinder-api-0\" (UID: \"622c0120-b0e7-4bb7-a961-78b842f153eb\") " pod="openstack/cinder-api-0" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.638135 4719 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/622c0120-b0e7-4bb7-a961-78b842f153eb-config-data\") pod \"cinder-api-0\" (UID: \"622c0120-b0e7-4bb7-a961-78b842f153eb\") " pod="openstack/cinder-api-0" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.638273 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/622c0120-b0e7-4bb7-a961-78b842f153eb-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"622c0120-b0e7-4bb7-a961-78b842f153eb\") " pod="openstack/cinder-api-0" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.638307 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/622c0120-b0e7-4bb7-a961-78b842f153eb-scripts\") pod \"cinder-api-0\" (UID: \"622c0120-b0e7-4bb7-a961-78b842f153eb\") " pod="openstack/cinder-api-0" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.638379 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/622c0120-b0e7-4bb7-a961-78b842f153eb-logs\") pod \"cinder-api-0\" (UID: \"622c0120-b0e7-4bb7-a961-78b842f153eb\") " pod="openstack/cinder-api-0" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.638414 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg4dh\" (UniqueName: \"kubernetes.io/projected/622c0120-b0e7-4bb7-a961-78b842f153eb-kube-api-access-sg4dh\") pod \"cinder-api-0\" (UID: \"622c0120-b0e7-4bb7-a961-78b842f153eb\") " pod="openstack/cinder-api-0" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.640719 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/622c0120-b0e7-4bb7-a961-78b842f153eb-etc-machine-id\") pod \"cinder-api-0\" (UID: 
\"622c0120-b0e7-4bb7-a961-78b842f153eb\") " pod="openstack/cinder-api-0" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.647784 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/622c0120-b0e7-4bb7-a961-78b842f153eb-logs\") pod \"cinder-api-0\" (UID: \"622c0120-b0e7-4bb7-a961-78b842f153eb\") " pod="openstack/cinder-api-0" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.662896 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/622c0120-b0e7-4bb7-a961-78b842f153eb-scripts\") pod \"cinder-api-0\" (UID: \"622c0120-b0e7-4bb7-a961-78b842f153eb\") " pod="openstack/cinder-api-0" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.666124 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/622c0120-b0e7-4bb7-a961-78b842f153eb-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"622c0120-b0e7-4bb7-a961-78b842f153eb\") " pod="openstack/cinder-api-0" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.666716 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/622c0120-b0e7-4bb7-a961-78b842f153eb-config-data-custom\") pod \"cinder-api-0\" (UID: \"622c0120-b0e7-4bb7-a961-78b842f153eb\") " pod="openstack/cinder-api-0" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.667689 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7b877f86f8-fptrv"] Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.667929 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7b877f86f8-fptrv" podUID="726781c9-066e-4914-bc4f-e2fc8fcef741" containerName="barbican-api-log" containerID="cri-o://7286c4c73599853a7da2ac66ebab13882f34c6393b4d517865183fbde12f3e53" gracePeriod=30 Oct 09 15:36:43 crc 
kubenswrapper[4719]: I1009 15:36:43.668708 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7b877f86f8-fptrv" podUID="726781c9-066e-4914-bc4f-e2fc8fcef741" containerName="barbican-api" containerID="cri-o://1addc745cbf2c4f77006cdc452693a1541a72e90039e4a0c1b9e1e0d96a2c110" gracePeriod=30 Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.673302 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/622c0120-b0e7-4bb7-a961-78b842f153eb-config-data\") pod \"cinder-api-0\" (UID: \"622c0120-b0e7-4bb7-a961-78b842f153eb\") " pod="openstack/cinder-api-0" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.673888 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg4dh\" (UniqueName: \"kubernetes.io/projected/622c0120-b0e7-4bb7-a961-78b842f153eb-kube-api-access-sg4dh\") pod \"cinder-api-0\" (UID: \"622c0120-b0e7-4bb7-a961-78b842f153eb\") " pod="openstack/cinder-api-0" Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.837594 4719 generic.go:334] "Generic (PLEG): container finished" podID="9bfb5b45-7c9f-46aa-99ad-4011a81bb196" containerID="73825614b680bbd5ce91c83fcca82f1194c982a9e5220e5af0c1a7c518e59c9d" exitCode=0 Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.837643 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9bfb5b45-7c9f-46aa-99ad-4011a81bb196","Type":"ContainerDied","Data":"73825614b680bbd5ce91c83fcca82f1194c982a9e5220e5af0c1a7c518e59c9d"} Oct 09 15:36:43 crc kubenswrapper[4719]: I1009 15:36:43.842804 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 09 15:36:44 crc kubenswrapper[4719]: I1009 15:36:44.850772 4719 generic.go:334] "Generic (PLEG): container finished" podID="726781c9-066e-4914-bc4f-e2fc8fcef741" containerID="7286c4c73599853a7da2ac66ebab13882f34c6393b4d517865183fbde12f3e53" exitCode=143 Oct 09 15:36:44 crc kubenswrapper[4719]: I1009 15:36:44.851051 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b877f86f8-fptrv" event={"ID":"726781c9-066e-4914-bc4f-e2fc8fcef741","Type":"ContainerDied","Data":"7286c4c73599853a7da2ac66ebab13882f34c6393b4d517865183fbde12f3e53"} Oct 09 15:36:45 crc kubenswrapper[4719]: I1009 15:36:45.308842 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7b877f86f8-fptrv" podUID="726781c9-066e-4914-bc4f-e2fc8fcef741" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.178:9311/healthcheck\": read tcp 10.217.0.2:33814->10.217.0.178:9311: read: connection reset by peer" Oct 09 15:36:45 crc kubenswrapper[4719]: I1009 15:36:45.309091 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7b877f86f8-fptrv" podUID="726781c9-066e-4914-bc4f-e2fc8fcef741" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.178:9311/healthcheck\": read tcp 10.217.0.2:33800->10.217.0.178:9311: read: connection reset by peer" Oct 09 15:36:45 crc kubenswrapper[4719]: I1009 15:36:45.862009 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 09 15:36:45 crc kubenswrapper[4719]: I1009 15:36:45.867670 4719 generic.go:334] "Generic (PLEG): container finished" podID="726781c9-066e-4914-bc4f-e2fc8fcef741" containerID="1addc745cbf2c4f77006cdc452693a1541a72e90039e4a0c1b9e1e0d96a2c110" exitCode=0 Oct 09 15:36:45 crc kubenswrapper[4719]: I1009 15:36:45.867801 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b877f86f8-fptrv" 
event={"ID":"726781c9-066e-4914-bc4f-e2fc8fcef741","Type":"ContainerDied","Data":"1addc745cbf2c4f77006cdc452693a1541a72e90039e4a0c1b9e1e0d96a2c110"} Oct 09 15:36:45 crc kubenswrapper[4719]: I1009 15:36:45.870407 4719 generic.go:334] "Generic (PLEG): container finished" podID="75999b62-ce1b-4a9b-8507-c8af12441083" containerID="1b0d8ce3fa12f7379def00ca2131e79585f6bdff6f8c5cc63816c53109df6822" exitCode=1 Oct 09 15:36:45 crc kubenswrapper[4719]: I1009 15:36:45.870454 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"75999b62-ce1b-4a9b-8507-c8af12441083","Type":"ContainerDied","Data":"1b0d8ce3fa12f7379def00ca2131e79585f6bdff6f8c5cc63816c53109df6822"} Oct 09 15:36:45 crc kubenswrapper[4719]: I1009 15:36:45.870491 4719 scope.go:117] "RemoveContainer" containerID="97c064a8c41ba55bab5a23f641b5b2c165c7994c2526df1d2692173bb8e72c4b" Oct 09 15:36:45 crc kubenswrapper[4719]: I1009 15:36:45.871270 4719 scope.go:117] "RemoveContainer" containerID="1b0d8ce3fa12f7379def00ca2131e79585f6bdff6f8c5cc63816c53109df6822" Oct 09 15:36:45 crc kubenswrapper[4719]: E1009 15:36:45.872247 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(75999b62-ce1b-4a9b-8507-c8af12441083)\"" pod="openstack/watcher-decision-engine-0" podUID="75999b62-ce1b-4a9b-8507-c8af12441083" Oct 09 15:36:46 crc kubenswrapper[4719]: I1009 15:36:46.563669 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-xbrpr"] Oct 09 15:36:46 crc kubenswrapper[4719]: I1009 15:36:46.565446 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-xbrpr" Oct 09 15:36:46 crc kubenswrapper[4719]: I1009 15:36:46.577943 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-xbrpr"] Oct 09 15:36:46 crc kubenswrapper[4719]: I1009 15:36:46.617190 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcmfj\" (UniqueName: \"kubernetes.io/projected/e78db87f-acd4-471d-82f8-e854df1b36ea-kube-api-access-wcmfj\") pod \"nova-api-db-create-xbrpr\" (UID: \"e78db87f-acd4-471d-82f8-e854df1b36ea\") " pod="openstack/nova-api-db-create-xbrpr" Oct 09 15:36:46 crc kubenswrapper[4719]: I1009 15:36:46.670201 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-wrbqp"] Oct 09 15:36:46 crc kubenswrapper[4719]: I1009 15:36:46.671493 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-wrbqp" Oct 09 15:36:46 crc kubenswrapper[4719]: I1009 15:36:46.692526 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-wrbqp"] Oct 09 15:36:46 crc kubenswrapper[4719]: I1009 15:36:46.719947 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcmfj\" (UniqueName: \"kubernetes.io/projected/e78db87f-acd4-471d-82f8-e854df1b36ea-kube-api-access-wcmfj\") pod \"nova-api-db-create-xbrpr\" (UID: \"e78db87f-acd4-471d-82f8-e854df1b36ea\") " pod="openstack/nova-api-db-create-xbrpr" Oct 09 15:36:46 crc kubenswrapper[4719]: I1009 15:36:46.720006 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6mdg\" (UniqueName: \"kubernetes.io/projected/c3b4bbf2-4c3a-41bf-bc92-8d267af7a236-kube-api-access-m6mdg\") pod \"nova-cell0-db-create-wrbqp\" (UID: \"c3b4bbf2-4c3a-41bf-bc92-8d267af7a236\") " pod="openstack/nova-cell0-db-create-wrbqp" Oct 09 15:36:46 crc kubenswrapper[4719]: 
I1009 15:36:46.740230 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcmfj\" (UniqueName: \"kubernetes.io/projected/e78db87f-acd4-471d-82f8-e854df1b36ea-kube-api-access-wcmfj\") pod \"nova-api-db-create-xbrpr\" (UID: \"e78db87f-acd4-471d-82f8-e854df1b36ea\") " pod="openstack/nova-api-db-create-xbrpr" Oct 09 15:36:46 crc kubenswrapper[4719]: I1009 15:36:46.792500 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 09 15:36:46 crc kubenswrapper[4719]: I1009 15:36:46.792552 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 09 15:36:46 crc kubenswrapper[4719]: I1009 15:36:46.822467 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6mdg\" (UniqueName: \"kubernetes.io/projected/c3b4bbf2-4c3a-41bf-bc92-8d267af7a236-kube-api-access-m6mdg\") pod \"nova-cell0-db-create-wrbqp\" (UID: \"c3b4bbf2-4c3a-41bf-bc92-8d267af7a236\") " pod="openstack/nova-cell0-db-create-wrbqp" Oct 09 15:36:46 crc kubenswrapper[4719]: I1009 15:36:46.852530 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6mdg\" (UniqueName: \"kubernetes.io/projected/c3b4bbf2-4c3a-41bf-bc92-8d267af7a236-kube-api-access-m6mdg\") pod \"nova-cell0-db-create-wrbqp\" (UID: \"c3b4bbf2-4c3a-41bf-bc92-8d267af7a236\") " pod="openstack/nova-cell0-db-create-wrbqp" Oct 09 15:36:46 crc kubenswrapper[4719]: I1009 15:36:46.878442 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-rlr7g"] Oct 09 15:36:46 crc kubenswrapper[4719]: I1009 15:36:46.879746 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-rlr7g" Oct 09 15:36:46 crc kubenswrapper[4719]: I1009 15:36:46.883405 4719 scope.go:117] "RemoveContainer" containerID="1b0d8ce3fa12f7379def00ca2131e79585f6bdff6f8c5cc63816c53109df6822" Oct 09 15:36:46 crc kubenswrapper[4719]: E1009 15:36:46.883646 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(75999b62-ce1b-4a9b-8507-c8af12441083)\"" pod="openstack/watcher-decision-engine-0" podUID="75999b62-ce1b-4a9b-8507-c8af12441083" Oct 09 15:36:46 crc kubenswrapper[4719]: I1009 15:36:46.889862 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-rlr7g"] Oct 09 15:36:46 crc kubenswrapper[4719]: I1009 15:36:46.901310 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xbrpr" Oct 09 15:36:46 crc kubenswrapper[4719]: I1009 15:36:46.924026 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cgmc\" (UniqueName: \"kubernetes.io/projected/2352adae-2de1-4980-8391-53cf9c4c14a2-kube-api-access-8cgmc\") pod \"nova-cell1-db-create-rlr7g\" (UID: \"2352adae-2de1-4980-8391-53cf9c4c14a2\") " pod="openstack/nova-cell1-db-create-rlr7g" Oct 09 15:36:46 crc kubenswrapper[4719]: I1009 15:36:46.993247 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-wrbqp" Oct 09 15:36:47 crc kubenswrapper[4719]: I1009 15:36:47.026075 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cgmc\" (UniqueName: \"kubernetes.io/projected/2352adae-2de1-4980-8391-53cf9c4c14a2-kube-api-access-8cgmc\") pod \"nova-cell1-db-create-rlr7g\" (UID: \"2352adae-2de1-4980-8391-53cf9c4c14a2\") " pod="openstack/nova-cell1-db-create-rlr7g" Oct 09 15:36:47 crc kubenswrapper[4719]: I1009 15:36:47.047639 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cgmc\" (UniqueName: \"kubernetes.io/projected/2352adae-2de1-4980-8391-53cf9c4c14a2-kube-api-access-8cgmc\") pod \"nova-cell1-db-create-rlr7g\" (UID: \"2352adae-2de1-4980-8391-53cf9c4c14a2\") " pod="openstack/nova-cell1-db-create-rlr7g" Oct 09 15:36:47 crc kubenswrapper[4719]: I1009 15:36:47.226182 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-rlr7g" Oct 09 15:36:48 crc kubenswrapper[4719]: I1009 15:36:48.737554 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7b877f86f8-fptrv" podUID="726781c9-066e-4914-bc4f-e2fc8fcef741" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.178:9311/healthcheck\": dial tcp 10.217.0.178:9311: connect: connection refused" Oct 09 15:36:48 crc kubenswrapper[4719]: I1009 15:36:48.739712 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7b877f86f8-fptrv" podUID="726781c9-066e-4914-bc4f-e2fc8fcef741" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.178:9311/healthcheck\": dial tcp 10.217.0.178:9311: connect: connection refused" Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.290902 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.418537 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-scripts\") pod \"9bfb5b45-7c9f-46aa-99ad-4011a81bb196\" (UID: \"9bfb5b45-7c9f-46aa-99ad-4011a81bb196\") " Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.418943 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-run-httpd\") pod \"9bfb5b45-7c9f-46aa-99ad-4011a81bb196\" (UID: \"9bfb5b45-7c9f-46aa-99ad-4011a81bb196\") " Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.419054 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-combined-ca-bundle\") pod \"9bfb5b45-7c9f-46aa-99ad-4011a81bb196\" (UID: \"9bfb5b45-7c9f-46aa-99ad-4011a81bb196\") " Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.419138 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbwsp\" (UniqueName: \"kubernetes.io/projected/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-kube-api-access-zbwsp\") pod \"9bfb5b45-7c9f-46aa-99ad-4011a81bb196\" (UID: \"9bfb5b45-7c9f-46aa-99ad-4011a81bb196\") " Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.419277 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-log-httpd\") pod \"9bfb5b45-7c9f-46aa-99ad-4011a81bb196\" (UID: \"9bfb5b45-7c9f-46aa-99ad-4011a81bb196\") " Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.419454 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-config-data\") pod \"9bfb5b45-7c9f-46aa-99ad-4011a81bb196\" (UID: \"9bfb5b45-7c9f-46aa-99ad-4011a81bb196\") " Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.419551 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-sg-core-conf-yaml\") pod \"9bfb5b45-7c9f-46aa-99ad-4011a81bb196\" (UID: \"9bfb5b45-7c9f-46aa-99ad-4011a81bb196\") " Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.419658 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9bfb5b45-7c9f-46aa-99ad-4011a81bb196" (UID: "9bfb5b45-7c9f-46aa-99ad-4011a81bb196"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.419925 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9bfb5b45-7c9f-46aa-99ad-4011a81bb196" (UID: "9bfb5b45-7c9f-46aa-99ad-4011a81bb196"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.420226 4719 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.420297 4719 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.424564 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-scripts" (OuterVolumeSpecName: "scripts") pod "9bfb5b45-7c9f-46aa-99ad-4011a81bb196" (UID: "9bfb5b45-7c9f-46aa-99ad-4011a81bb196"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.425329 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-kube-api-access-zbwsp" (OuterVolumeSpecName: "kube-api-access-zbwsp") pod "9bfb5b45-7c9f-46aa-99ad-4011a81bb196" (UID: "9bfb5b45-7c9f-46aa-99ad-4011a81bb196"). InnerVolumeSpecName "kube-api-access-zbwsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.453974 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9bfb5b45-7c9f-46aa-99ad-4011a81bb196" (UID: "9bfb5b45-7c9f-46aa-99ad-4011a81bb196"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.472311 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7b877f86f8-fptrv" Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.521879 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/726781c9-066e-4914-bc4f-e2fc8fcef741-config-data-custom\") pod \"726781c9-066e-4914-bc4f-e2fc8fcef741\" (UID: \"726781c9-066e-4914-bc4f-e2fc8fcef741\") " Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.522072 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/726781c9-066e-4914-bc4f-e2fc8fcef741-config-data\") pod \"726781c9-066e-4914-bc4f-e2fc8fcef741\" (UID: \"726781c9-066e-4914-bc4f-e2fc8fcef741\") " Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.522175 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/726781c9-066e-4914-bc4f-e2fc8fcef741-logs\") pod \"726781c9-066e-4914-bc4f-e2fc8fcef741\" (UID: \"726781c9-066e-4914-bc4f-e2fc8fcef741\") " Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.522208 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6w78\" (UniqueName: \"kubernetes.io/projected/726781c9-066e-4914-bc4f-e2fc8fcef741-kube-api-access-c6w78\") pod \"726781c9-066e-4914-bc4f-e2fc8fcef741\" (UID: \"726781c9-066e-4914-bc4f-e2fc8fcef741\") " Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.522258 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/726781c9-066e-4914-bc4f-e2fc8fcef741-combined-ca-bundle\") pod \"726781c9-066e-4914-bc4f-e2fc8fcef741\" (UID: 
\"726781c9-066e-4914-bc4f-e2fc8fcef741\") " Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.522625 4719 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.522635 4719 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.522644 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbwsp\" (UniqueName: \"kubernetes.io/projected/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-kube-api-access-zbwsp\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.548939 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/726781c9-066e-4914-bc4f-e2fc8fcef741-logs" (OuterVolumeSpecName: "logs") pod "726781c9-066e-4914-bc4f-e2fc8fcef741" (UID: "726781c9-066e-4914-bc4f-e2fc8fcef741"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.557370 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/726781c9-066e-4914-bc4f-e2fc8fcef741-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "726781c9-066e-4914-bc4f-e2fc8fcef741" (UID: "726781c9-066e-4914-bc4f-e2fc8fcef741"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.563424 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/726781c9-066e-4914-bc4f-e2fc8fcef741-kube-api-access-c6w78" (OuterVolumeSpecName: "kube-api-access-c6w78") pod "726781c9-066e-4914-bc4f-e2fc8fcef741" (UID: "726781c9-066e-4914-bc4f-e2fc8fcef741"). InnerVolumeSpecName "kube-api-access-c6w78". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.573532 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9bfb5b45-7c9f-46aa-99ad-4011a81bb196" (UID: "9bfb5b45-7c9f-46aa-99ad-4011a81bb196"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.585915 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/726781c9-066e-4914-bc4f-e2fc8fcef741-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "726781c9-066e-4914-bc4f-e2fc8fcef741" (UID: "726781c9-066e-4914-bc4f-e2fc8fcef741"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.592429 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-config-data" (OuterVolumeSpecName: "config-data") pod "9bfb5b45-7c9f-46aa-99ad-4011a81bb196" (UID: "9bfb5b45-7c9f-46aa-99ad-4011a81bb196"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.600163 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/726781c9-066e-4914-bc4f-e2fc8fcef741-config-data" (OuterVolumeSpecName: "config-data") pod "726781c9-066e-4914-bc4f-e2fc8fcef741" (UID: "726781c9-066e-4914-bc4f-e2fc8fcef741"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.624985 4719 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/726781c9-066e-4914-bc4f-e2fc8fcef741-logs\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.625019 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6w78\" (UniqueName: \"kubernetes.io/projected/726781c9-066e-4914-bc4f-e2fc8fcef741-kube-api-access-c6w78\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.625031 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/726781c9-066e-4914-bc4f-e2fc8fcef741-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.625039 4719 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/726781c9-066e-4914-bc4f-e2fc8fcef741-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.625050 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.625059 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/726781c9-066e-4914-bc4f-e2fc8fcef741-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.625067 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bfb5b45-7c9f-46aa-99ad-4011a81bb196-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.976499 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"893b05af-4bf3-4c76-940c-3ed1cceb7e18","Type":"ContainerStarted","Data":"9aec7474f2b940c6c4a5ed070a7cb487f04ab70b90f5416156324fbfc456fdf0"} Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.979943 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b877f86f8-fptrv" event={"ID":"726781c9-066e-4914-bc4f-e2fc8fcef741","Type":"ContainerDied","Data":"e858f51aa82145e7850992c94dcddfe6eaaafeddf302559505e31daebd77c4d4"} Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.979978 4719 scope.go:117] "RemoveContainer" containerID="1addc745cbf2c4f77006cdc452693a1541a72e90039e4a0c1b9e1e0d96a2c110" Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.980040 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7b877f86f8-fptrv" Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.985087 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 15:36:51 crc kubenswrapper[4719]: I1009 15:36:51.987455 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9bfb5b45-7c9f-46aa-99ad-4011a81bb196","Type":"ContainerDied","Data":"ce00b6775b0901280024ab84e319001b04b547e806bae0d76fc136d868fddacf"} Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.006846 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.641730499 podStartE2EDuration="20.006825937s" podCreationTimestamp="2025-10-09 15:36:32 +0000 UTC" firstStartedPulling="2025-10-09 15:36:33.528805137 +0000 UTC m=+1099.038516422" lastFinishedPulling="2025-10-09 15:36:50.893900575 +0000 UTC m=+1116.403611860" observedRunningTime="2025-10-09 15:36:52.003773379 +0000 UTC m=+1117.513484674" watchObservedRunningTime="2025-10-09 15:36:52.006825937 +0000 UTC m=+1117.516537222" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.050910 4719 scope.go:117] "RemoveContainer" containerID="7286c4c73599853a7da2ac66ebab13882f34c6393b4d517865183fbde12f3e53" Oct 09 15:36:52 crc kubenswrapper[4719]: W1009 15:36:52.064447 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode78db87f_acd4_471d_82f8_e854df1b36ea.slice/crio-f9cbccb1512dc622fb27b51c2417261c98bb0af86e707670c0981161b00e9b16 WatchSource:0}: Error finding container f9cbccb1512dc622fb27b51c2417261c98bb0af86e707670c0981161b00e9b16: Status 404 returned error can't find the container with id f9cbccb1512dc622fb27b51c2417261c98bb0af86e707670c0981161b00e9b16 Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.065830 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-xbrpr"] Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.101713 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/barbican-api-7b877f86f8-fptrv"] Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.120535 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7b877f86f8-fptrv"] Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.140004 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.155221 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.181034 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.193410 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-wrbqp"] Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.206537 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.218042 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 09 15:36:52 crc kubenswrapper[4719]: E1009 15:36:52.220498 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="726781c9-066e-4914-bc4f-e2fc8fcef741" containerName="barbican-api-log" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.220521 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="726781c9-066e-4914-bc4f-e2fc8fcef741" containerName="barbican-api-log" Oct 09 15:36:52 crc kubenswrapper[4719]: E1009 15:36:52.220538 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bfb5b45-7c9f-46aa-99ad-4011a81bb196" containerName="proxy-httpd" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.220546 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bfb5b45-7c9f-46aa-99ad-4011a81bb196" containerName="proxy-httpd" Oct 09 15:36:52 crc kubenswrapper[4719]: E1009 15:36:52.220564 4719 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="9bfb5b45-7c9f-46aa-99ad-4011a81bb196" containerName="ceilometer-notification-agent" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.220573 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bfb5b45-7c9f-46aa-99ad-4011a81bb196" containerName="ceilometer-notification-agent" Oct 09 15:36:52 crc kubenswrapper[4719]: E1009 15:36:52.220587 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bfb5b45-7c9f-46aa-99ad-4011a81bb196" containerName="sg-core" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.220594 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bfb5b45-7c9f-46aa-99ad-4011a81bb196" containerName="sg-core" Oct 09 15:36:52 crc kubenswrapper[4719]: E1009 15:36:52.220622 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bfb5b45-7c9f-46aa-99ad-4011a81bb196" containerName="ceilometer-central-agent" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.220630 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bfb5b45-7c9f-46aa-99ad-4011a81bb196" containerName="ceilometer-central-agent" Oct 09 15:36:52 crc kubenswrapper[4719]: E1009 15:36:52.220644 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="726781c9-066e-4914-bc4f-e2fc8fcef741" containerName="barbican-api" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.220651 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="726781c9-066e-4914-bc4f-e2fc8fcef741" containerName="barbican-api" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.220887 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bfb5b45-7c9f-46aa-99ad-4011a81bb196" containerName="ceilometer-central-agent" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.220907 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="726781c9-066e-4914-bc4f-e2fc8fcef741" containerName="barbican-api-log" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 
15:36:52.220921 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bfb5b45-7c9f-46aa-99ad-4011a81bb196" containerName="ceilometer-notification-agent" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.220943 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bfb5b45-7c9f-46aa-99ad-4011a81bb196" containerName="sg-core" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.220965 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="726781c9-066e-4914-bc4f-e2fc8fcef741" containerName="barbican-api" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.221249 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bfb5b45-7c9f-46aa-99ad-4011a81bb196" containerName="proxy-httpd" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.225183 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.227186 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.227605 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.231647 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77b756999f-5ptkd"] Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.248788 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-rlr7g"] Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.257287 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.310763 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-556bc79449-9bdkb"] Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.312514 4719 scope.go:117] "RemoveContainer" 
containerID="6badb6bbb59aa7562a1599a36734b995ec0a0f476f59d31df6c1876731ae2a89" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.358485 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/891bb60f-cf37-4c12-8907-4ff654886e06-config-data\") pod \"ceilometer-0\" (UID: \"891bb60f-cf37-4c12-8907-4ff654886e06\") " pod="openstack/ceilometer-0" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.359525 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/891bb60f-cf37-4c12-8907-4ff654886e06-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"891bb60f-cf37-4c12-8907-4ff654886e06\") " pod="openstack/ceilometer-0" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.359578 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/891bb60f-cf37-4c12-8907-4ff654886e06-log-httpd\") pod \"ceilometer-0\" (UID: \"891bb60f-cf37-4c12-8907-4ff654886e06\") " pod="openstack/ceilometer-0" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.359645 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/891bb60f-cf37-4c12-8907-4ff654886e06-scripts\") pod \"ceilometer-0\" (UID: \"891bb60f-cf37-4c12-8907-4ff654886e06\") " pod="openstack/ceilometer-0" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.359670 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/891bb60f-cf37-4c12-8907-4ff654886e06-run-httpd\") pod \"ceilometer-0\" (UID: \"891bb60f-cf37-4c12-8907-4ff654886e06\") " pod="openstack/ceilometer-0" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.359715 4719 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/891bb60f-cf37-4c12-8907-4ff654886e06-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"891bb60f-cf37-4c12-8907-4ff654886e06\") " pod="openstack/ceilometer-0" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.359749 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mht27\" (UniqueName: \"kubernetes.io/projected/891bb60f-cf37-4c12-8907-4ff654886e06-kube-api-access-mht27\") pod \"ceilometer-0\" (UID: \"891bb60f-cf37-4c12-8907-4ff654886e06\") " pod="openstack/ceilometer-0" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.461908 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/891bb60f-cf37-4c12-8907-4ff654886e06-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"891bb60f-cf37-4c12-8907-4ff654886e06\") " pod="openstack/ceilometer-0" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.462309 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mht27\" (UniqueName: \"kubernetes.io/projected/891bb60f-cf37-4c12-8907-4ff654886e06-kube-api-access-mht27\") pod \"ceilometer-0\" (UID: \"891bb60f-cf37-4c12-8907-4ff654886e06\") " pod="openstack/ceilometer-0" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.462379 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/891bb60f-cf37-4c12-8907-4ff654886e06-config-data\") pod \"ceilometer-0\" (UID: \"891bb60f-cf37-4c12-8907-4ff654886e06\") " pod="openstack/ceilometer-0" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.462452 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/891bb60f-cf37-4c12-8907-4ff654886e06-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"891bb60f-cf37-4c12-8907-4ff654886e06\") " pod="openstack/ceilometer-0" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.462512 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/891bb60f-cf37-4c12-8907-4ff654886e06-log-httpd\") pod \"ceilometer-0\" (UID: \"891bb60f-cf37-4c12-8907-4ff654886e06\") " pod="openstack/ceilometer-0" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.462593 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/891bb60f-cf37-4c12-8907-4ff654886e06-scripts\") pod \"ceilometer-0\" (UID: \"891bb60f-cf37-4c12-8907-4ff654886e06\") " pod="openstack/ceilometer-0" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.462628 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/891bb60f-cf37-4c12-8907-4ff654886e06-run-httpd\") pod \"ceilometer-0\" (UID: \"891bb60f-cf37-4c12-8907-4ff654886e06\") " pod="openstack/ceilometer-0" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.463010 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/891bb60f-cf37-4c12-8907-4ff654886e06-run-httpd\") pod \"ceilometer-0\" (UID: \"891bb60f-cf37-4c12-8907-4ff654886e06\") " pod="openstack/ceilometer-0" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.463869 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/891bb60f-cf37-4c12-8907-4ff654886e06-log-httpd\") pod \"ceilometer-0\" (UID: \"891bb60f-cf37-4c12-8907-4ff654886e06\") " pod="openstack/ceilometer-0" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.466376 4719 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/891bb60f-cf37-4c12-8907-4ff654886e06-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"891bb60f-cf37-4c12-8907-4ff654886e06\") " pod="openstack/ceilometer-0" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.466726 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/891bb60f-cf37-4c12-8907-4ff654886e06-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"891bb60f-cf37-4c12-8907-4ff654886e06\") " pod="openstack/ceilometer-0" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.472467 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/891bb60f-cf37-4c12-8907-4ff654886e06-config-data\") pod \"ceilometer-0\" (UID: \"891bb60f-cf37-4c12-8907-4ff654886e06\") " pod="openstack/ceilometer-0" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.480978 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/891bb60f-cf37-4c12-8907-4ff654886e06-scripts\") pod \"ceilometer-0\" (UID: \"891bb60f-cf37-4c12-8907-4ff654886e06\") " pod="openstack/ceilometer-0" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.490333 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mht27\" (UniqueName: \"kubernetes.io/projected/891bb60f-cf37-4c12-8907-4ff654886e06-kube-api-access-mht27\") pod \"ceilometer-0\" (UID: \"891bb60f-cf37-4c12-8907-4ff654886e06\") " pod="openstack/ceilometer-0" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.535891 4719 scope.go:117] "RemoveContainer" containerID="18c33dd5b6d35bf57afe74f800ff13811b3b8acb9407caced78a862213cd541b" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.569924 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.593724 4719 scope.go:117] "RemoveContainer" containerID="e69caa09e16808f6ca6d9bad1a134dbd5ce2e6e06249d20d1636d4f99f3dc051" Oct 09 15:36:52 crc kubenswrapper[4719]: I1009 15:36:52.643064 4719 scope.go:117] "RemoveContainer" containerID="73825614b680bbd5ce91c83fcca82f1194c982a9e5220e5af0c1a7c518e59c9d" Oct 09 15:36:53 crc kubenswrapper[4719]: I1009 15:36:53.007459 4719 generic.go:334] "Generic (PLEG): container finished" podID="2352adae-2de1-4980-8391-53cf9c4c14a2" containerID="f2da0bb2674d9b973871e6dfdeb35437e6c8bdc74878e04a986d14a0da562bb9" exitCode=0 Oct 09 15:36:53 crc kubenswrapper[4719]: I1009 15:36:53.007507 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-rlr7g" event={"ID":"2352adae-2de1-4980-8391-53cf9c4c14a2","Type":"ContainerDied","Data":"f2da0bb2674d9b973871e6dfdeb35437e6c8bdc74878e04a986d14a0da562bb9"} Oct 09 15:36:53 crc kubenswrapper[4719]: I1009 15:36:53.007895 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-rlr7g" event={"ID":"2352adae-2de1-4980-8391-53cf9c4c14a2","Type":"ContainerStarted","Data":"756b4dc63c8888e75361598a6d718da2f47f19a1a2c99297ee897b260a3ff3bd"} Oct 09 15:36:53 crc kubenswrapper[4719]: I1009 15:36:53.034581 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77b756999f-5ptkd" event={"ID":"f7540714-0deb-4ba6-8709-846457e19966","Type":"ContainerStarted","Data":"d8521268a7bed8a58a83e7f4b2e0bb8e6726efbeceff9348c0e2edea9734ea88"} Oct 09 15:36:53 crc kubenswrapper[4719]: I1009 15:36:53.036359 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a4255094-4bc0-4cc5-bf14-c663dd9e17e7","Type":"ContainerStarted","Data":"ff48cb35b869bde6dca57bc93d145117af2601cc9b4278ea01817dc74a03c68b"} Oct 09 15:36:53 crc kubenswrapper[4719]: I1009 15:36:53.040748 4719 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-556bc79449-9bdkb" event={"ID":"6cbef595-0a78-4655-85ca-b329f51067af","Type":"ContainerStarted","Data":"74b9422c5646d19013f62962c6e9389c28540a10f6cf55f53d4e3a2bf2bfb0a3"} Oct 09 15:36:53 crc kubenswrapper[4719]: I1009 15:36:53.040776 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-556bc79449-9bdkb" event={"ID":"6cbef595-0a78-4655-85ca-b329f51067af","Type":"ContainerStarted","Data":"6058abbb0511736eae1ce1e6554507666eac4fe616071ff4484fb981c2d35d6d"} Oct 09 15:36:53 crc kubenswrapper[4719]: I1009 15:36:53.054413 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"622c0120-b0e7-4bb7-a961-78b842f153eb","Type":"ContainerStarted","Data":"03eaccc70d052cd9b39d772e85c66475b5935e8de634a0aef2e341fdbe34d7e4"} Oct 09 15:36:53 crc kubenswrapper[4719]: I1009 15:36:53.058269 4719 generic.go:334] "Generic (PLEG): container finished" podID="c3b4bbf2-4c3a-41bf-bc92-8d267af7a236" containerID="d6e48031f32a43b042e0809dc21ad822e1e42e18482eb44ff54f17a57b726d49" exitCode=0 Oct 09 15:36:53 crc kubenswrapper[4719]: I1009 15:36:53.058446 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-wrbqp" event={"ID":"c3b4bbf2-4c3a-41bf-bc92-8d267af7a236","Type":"ContainerDied","Data":"d6e48031f32a43b042e0809dc21ad822e1e42e18482eb44ff54f17a57b726d49"} Oct 09 15:36:53 crc kubenswrapper[4719]: I1009 15:36:53.058506 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-wrbqp" event={"ID":"c3b4bbf2-4c3a-41bf-bc92-8d267af7a236","Type":"ContainerStarted","Data":"e719aabf1932a807a2b5b30b75fc4d9b20a574b570f6e210815db21dd6c559c2"} Oct 09 15:36:53 crc kubenswrapper[4719]: I1009 15:36:53.065451 4719 generic.go:334] "Generic (PLEG): container finished" podID="e78db87f-acd4-471d-82f8-e854df1b36ea" containerID="d2a7b4859cbaf068dc0aa79909bca5a2ecb32678b5d80d753c30064df2cd59cb" exitCode=0 Oct 09 15:36:53 crc 
kubenswrapper[4719]: I1009 15:36:53.065554 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xbrpr" event={"ID":"e78db87f-acd4-471d-82f8-e854df1b36ea","Type":"ContainerDied","Data":"d2a7b4859cbaf068dc0aa79909bca5a2ecb32678b5d80d753c30064df2cd59cb"} Oct 09 15:36:53 crc kubenswrapper[4719]: I1009 15:36:53.065596 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xbrpr" event={"ID":"e78db87f-acd4-471d-82f8-e854df1b36ea","Type":"ContainerStarted","Data":"f9cbccb1512dc622fb27b51c2417261c98bb0af86e707670c0981161b00e9b16"} Oct 09 15:36:53 crc kubenswrapper[4719]: W1009 15:36:53.183584 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod891bb60f_cf37_4c12_8907_4ff654886e06.slice/crio-622025d6a904faf315c1110529e3540e812e054d78f2276615fdfaf9968bd624 WatchSource:0}: Error finding container 622025d6a904faf315c1110529e3540e812e054d78f2276615fdfaf9968bd624: Status 404 returned error can't find the container with id 622025d6a904faf315c1110529e3540e812e054d78f2276615fdfaf9968bd624 Oct 09 15:36:53 crc kubenswrapper[4719]: I1009 15:36:53.183838 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="726781c9-066e-4914-bc4f-e2fc8fcef741" path="/var/lib/kubelet/pods/726781c9-066e-4914-bc4f-e2fc8fcef741/volumes" Oct 09 15:36:53 crc kubenswrapper[4719]: I1009 15:36:53.185087 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bfb5b45-7c9f-46aa-99ad-4011a81bb196" path="/var/lib/kubelet/pods/9bfb5b45-7c9f-46aa-99ad-4011a81bb196/volumes" Oct 09 15:36:53 crc kubenswrapper[4719]: I1009 15:36:53.186065 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 15:36:53 crc kubenswrapper[4719]: I1009 15:36:53.849203 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-58ddf56cd8-cl6cc" Oct 09 15:36:54 crc 
kubenswrapper[4719]: I1009 15:36:54.097573 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-556bc79449-9bdkb" event={"ID":"6cbef595-0a78-4655-85ca-b329f51067af","Type":"ContainerStarted","Data":"749d099d6ea6b4f0b0fe0f0626d9bcc8c1e3d2837a38f8b859da6d9c94a2b5a5"} Oct 09 15:36:54 crc kubenswrapper[4719]: I1009 15:36:54.097972 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-556bc79449-9bdkb" Oct 09 15:36:54 crc kubenswrapper[4719]: I1009 15:36:54.097999 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-556bc79449-9bdkb" Oct 09 15:36:54 crc kubenswrapper[4719]: I1009 15:36:54.114547 4719 generic.go:334] "Generic (PLEG): container finished" podID="f7540714-0deb-4ba6-8709-846457e19966" containerID="8acc01f937a88ec70793212478f7312dc7f746927a49e045013394d8fa0b2e6c" exitCode=0 Oct 09 15:36:54 crc kubenswrapper[4719]: I1009 15:36:54.114611 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77b756999f-5ptkd" event={"ID":"f7540714-0deb-4ba6-8709-846457e19966","Type":"ContainerDied","Data":"8acc01f937a88ec70793212478f7312dc7f746927a49e045013394d8fa0b2e6c"} Oct 09 15:36:54 crc kubenswrapper[4719]: I1009 15:36:54.127726 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-556bc79449-9bdkb" podStartSLOduration=11.12770618 podStartE2EDuration="11.12770618s" podCreationTimestamp="2025-10-09 15:36:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:36:54.121906904 +0000 UTC m=+1119.631618189" watchObservedRunningTime="2025-10-09 15:36:54.12770618 +0000 UTC m=+1119.637417465" Oct 09 15:36:54 crc kubenswrapper[4719]: I1009 15:36:54.128589 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"622c0120-b0e7-4bb7-a961-78b842f153eb","Type":"ContainerStarted","Data":"d429756686e39ac036782fcedc325f6eff7b7234a0f3c90370cf6a4b0caca2bb"} Oct 09 15:36:54 crc kubenswrapper[4719]: I1009 15:36:54.157924 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"891bb60f-cf37-4c12-8907-4ff654886e06","Type":"ContainerStarted","Data":"d5b6a17d0e627a0cb3e428fc34c217b8819f0837c1b622fc4459e5dc5f2680c9"} Oct 09 15:36:54 crc kubenswrapper[4719]: I1009 15:36:54.157966 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"891bb60f-cf37-4c12-8907-4ff654886e06","Type":"ContainerStarted","Data":"72fd53e4f308ec26610a590d41cd0191d547dc131090ec90c4dcb57428e273d4"} Oct 09 15:36:54 crc kubenswrapper[4719]: I1009 15:36:54.157975 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"891bb60f-cf37-4c12-8907-4ff654886e06","Type":"ContainerStarted","Data":"622025d6a904faf315c1110529e3540e812e054d78f2276615fdfaf9968bd624"} Oct 09 15:36:54 crc kubenswrapper[4719]: I1009 15:36:54.168664 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a4255094-4bc0-4cc5-bf14-c663dd9e17e7","Type":"ContainerStarted","Data":"f3a970ec68d524e6953aecb25820acfbab374f1d8635c01863349942ce51a813"} Oct 09 15:36:54 crc kubenswrapper[4719]: I1009 15:36:54.536543 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-wrbqp" Oct 09 15:36:54 crc kubenswrapper[4719]: I1009 15:36:54.608533 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6mdg\" (UniqueName: \"kubernetes.io/projected/c3b4bbf2-4c3a-41bf-bc92-8d267af7a236-kube-api-access-m6mdg\") pod \"c3b4bbf2-4c3a-41bf-bc92-8d267af7a236\" (UID: \"c3b4bbf2-4c3a-41bf-bc92-8d267af7a236\") " Oct 09 15:36:54 crc kubenswrapper[4719]: I1009 15:36:54.624023 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3b4bbf2-4c3a-41bf-bc92-8d267af7a236-kube-api-access-m6mdg" (OuterVolumeSpecName: "kube-api-access-m6mdg") pod "c3b4bbf2-4c3a-41bf-bc92-8d267af7a236" (UID: "c3b4bbf2-4c3a-41bf-bc92-8d267af7a236"). InnerVolumeSpecName "kube-api-access-m6mdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:36:54 crc kubenswrapper[4719]: I1009 15:36:54.710947 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6mdg\" (UniqueName: \"kubernetes.io/projected/c3b4bbf2-4c3a-41bf-bc92-8d267af7a236-kube-api-access-m6mdg\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.045402 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xbrpr" Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.054549 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-rlr7g" Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.127420 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcmfj\" (UniqueName: \"kubernetes.io/projected/e78db87f-acd4-471d-82f8-e854df1b36ea-kube-api-access-wcmfj\") pod \"e78db87f-acd4-471d-82f8-e854df1b36ea\" (UID: \"e78db87f-acd4-471d-82f8-e854df1b36ea\") " Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.127564 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cgmc\" (UniqueName: \"kubernetes.io/projected/2352adae-2de1-4980-8391-53cf9c4c14a2-kube-api-access-8cgmc\") pod \"2352adae-2de1-4980-8391-53cf9c4c14a2\" (UID: \"2352adae-2de1-4980-8391-53cf9c4c14a2\") " Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.136918 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e78db87f-acd4-471d-82f8-e854df1b36ea-kube-api-access-wcmfj" (OuterVolumeSpecName: "kube-api-access-wcmfj") pod "e78db87f-acd4-471d-82f8-e854df1b36ea" (UID: "e78db87f-acd4-471d-82f8-e854df1b36ea"). InnerVolumeSpecName "kube-api-access-wcmfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.157318 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2352adae-2de1-4980-8391-53cf9c4c14a2-kube-api-access-8cgmc" (OuterVolumeSpecName: "kube-api-access-8cgmc") pod "2352adae-2de1-4980-8391-53cf9c4c14a2" (UID: "2352adae-2de1-4980-8391-53cf9c4c14a2"). InnerVolumeSpecName "kube-api-access-8cgmc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.185936 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-rlr7g" event={"ID":"2352adae-2de1-4980-8391-53cf9c4c14a2","Type":"ContainerDied","Data":"756b4dc63c8888e75361598a6d718da2f47f19a1a2c99297ee897b260a3ff3bd"} Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.185979 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="756b4dc63c8888e75361598a6d718da2f47f19a1a2c99297ee897b260a3ff3bd" Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.186043 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-rlr7g" Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.195134 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77b756999f-5ptkd" event={"ID":"f7540714-0deb-4ba6-8709-846457e19966","Type":"ContainerStarted","Data":"583be44f9b958a7b642c9052de211535ca718085fd44ce13e329ad6434301c54"} Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.195188 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77b756999f-5ptkd" Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.205872 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"622c0120-b0e7-4bb7-a961-78b842f153eb","Type":"ContainerStarted","Data":"75af50c215da2170b0dc7b0a5e3cb81b13e56d442eb7c7a94571e5ea6a30d748"} Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.206010 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="622c0120-b0e7-4bb7-a961-78b842f153eb" containerName="cinder-api-log" containerID="cri-o://d429756686e39ac036782fcedc325f6eff7b7234a0f3c90370cf6a4b0caca2bb" gracePeriod=30 Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.206157 4719 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="622c0120-b0e7-4bb7-a961-78b842f153eb" containerName="cinder-api" containerID="cri-o://75af50c215da2170b0dc7b0a5e3cb81b13e56d442eb7c7a94571e5ea6a30d748" gracePeriod=30 Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.206275 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.216899 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"891bb60f-cf37-4c12-8907-4ff654886e06","Type":"ContainerStarted","Data":"e6f5f5da90afc5862246a3cb280c2cfb165cb727b1b3378acf1935e1e68a96c6"} Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.231995 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cgmc\" (UniqueName: \"kubernetes.io/projected/2352adae-2de1-4980-8391-53cf9c4c14a2-kube-api-access-8cgmc\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.232038 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcmfj\" (UniqueName: \"kubernetes.io/projected/e78db87f-acd4-471d-82f8-e854df1b36ea-kube-api-access-wcmfj\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.241726 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a4255094-4bc0-4cc5-bf14-c663dd9e17e7","Type":"ContainerStarted","Data":"94e8511f687e02172c7beb1f4b16f33db67db61bf7681ee68958a7379e05e09f"} Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.259564 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-wrbqp" Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.259625 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-wrbqp" event={"ID":"c3b4bbf2-4c3a-41bf-bc92-8d267af7a236","Type":"ContainerDied","Data":"e719aabf1932a807a2b5b30b75fc4d9b20a574b570f6e210815db21dd6c559c2"} Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.259662 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e719aabf1932a807a2b5b30b75fc4d9b20a574b570f6e210815db21dd6c559c2" Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.266573 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77b756999f-5ptkd" podStartSLOduration=12.266553652 podStartE2EDuration="12.266553652s" podCreationTimestamp="2025-10-09 15:36:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:36:55.263544586 +0000 UTC m=+1120.773255871" watchObservedRunningTime="2025-10-09 15:36:55.266553652 +0000 UTC m=+1120.776264937" Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.267412 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-xbrpr" Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.267526 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xbrpr" event={"ID":"e78db87f-acd4-471d-82f8-e854df1b36ea","Type":"ContainerDied","Data":"f9cbccb1512dc622fb27b51c2417261c98bb0af86e707670c0981161b00e9b16"} Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.267620 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9cbccb1512dc622fb27b51c2417261c98bb0af86e707670c0981161b00e9b16" Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.286748 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=11.801173405 podStartE2EDuration="12.286710918s" podCreationTimestamp="2025-10-09 15:36:43 +0000 UTC" firstStartedPulling="2025-10-09 15:36:52.108637819 +0000 UTC m=+1117.618349104" lastFinishedPulling="2025-10-09 15:36:52.594175332 +0000 UTC m=+1118.103886617" observedRunningTime="2025-10-09 15:36:55.278917468 +0000 UTC m=+1120.788628763" watchObservedRunningTime="2025-10-09 15:36:55.286710918 +0000 UTC m=+1120.796422203" Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.310133 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=12.310113446999999 podStartE2EDuration="12.310113447s" podCreationTimestamp="2025-10-09 15:36:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:36:55.302128991 +0000 UTC m=+1120.811840276" watchObservedRunningTime="2025-10-09 15:36:55.310113447 +0000 UTC m=+1120.819824732" Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.880288 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.946088 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/622c0120-b0e7-4bb7-a961-78b842f153eb-scripts\") pod \"622c0120-b0e7-4bb7-a961-78b842f153eb\" (UID: \"622c0120-b0e7-4bb7-a961-78b842f153eb\") " Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.946192 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/622c0120-b0e7-4bb7-a961-78b842f153eb-combined-ca-bundle\") pod \"622c0120-b0e7-4bb7-a961-78b842f153eb\" (UID: \"622c0120-b0e7-4bb7-a961-78b842f153eb\") " Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.946224 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/622c0120-b0e7-4bb7-a961-78b842f153eb-logs\") pod \"622c0120-b0e7-4bb7-a961-78b842f153eb\" (UID: \"622c0120-b0e7-4bb7-a961-78b842f153eb\") " Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.946266 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/622c0120-b0e7-4bb7-a961-78b842f153eb-config-data\") pod \"622c0120-b0e7-4bb7-a961-78b842f153eb\" (UID: \"622c0120-b0e7-4bb7-a961-78b842f153eb\") " Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.946370 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sg4dh\" (UniqueName: \"kubernetes.io/projected/622c0120-b0e7-4bb7-a961-78b842f153eb-kube-api-access-sg4dh\") pod \"622c0120-b0e7-4bb7-a961-78b842f153eb\" (UID: \"622c0120-b0e7-4bb7-a961-78b842f153eb\") " Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.946395 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/622c0120-b0e7-4bb7-a961-78b842f153eb-etc-machine-id\") pod \"622c0120-b0e7-4bb7-a961-78b842f153eb\" (UID: \"622c0120-b0e7-4bb7-a961-78b842f153eb\") " Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.946497 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/622c0120-b0e7-4bb7-a961-78b842f153eb-config-data-custom\") pod \"622c0120-b0e7-4bb7-a961-78b842f153eb\" (UID: \"622c0120-b0e7-4bb7-a961-78b842f153eb\") " Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.947316 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/622c0120-b0e7-4bb7-a961-78b842f153eb-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "622c0120-b0e7-4bb7-a961-78b842f153eb" (UID: "622c0120-b0e7-4bb7-a961-78b842f153eb"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.947400 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/622c0120-b0e7-4bb7-a961-78b842f153eb-logs" (OuterVolumeSpecName: "logs") pod "622c0120-b0e7-4bb7-a961-78b842f153eb" (UID: "622c0120-b0e7-4bb7-a961-78b842f153eb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.952672 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/622c0120-b0e7-4bb7-a961-78b842f153eb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "622c0120-b0e7-4bb7-a961-78b842f153eb" (UID: "622c0120-b0e7-4bb7-a961-78b842f153eb"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.959626 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/622c0120-b0e7-4bb7-a961-78b842f153eb-scripts" (OuterVolumeSpecName: "scripts") pod "622c0120-b0e7-4bb7-a961-78b842f153eb" (UID: "622c0120-b0e7-4bb7-a961-78b842f153eb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.973057 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/622c0120-b0e7-4bb7-a961-78b842f153eb-kube-api-access-sg4dh" (OuterVolumeSpecName: "kube-api-access-sg4dh") pod "622c0120-b0e7-4bb7-a961-78b842f153eb" (UID: "622c0120-b0e7-4bb7-a961-78b842f153eb"). InnerVolumeSpecName "kube-api-access-sg4dh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:36:55 crc kubenswrapper[4719]: I1009 15:36:55.994631 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/622c0120-b0e7-4bb7-a961-78b842f153eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "622c0120-b0e7-4bb7-a961-78b842f153eb" (UID: "622c0120-b0e7-4bb7-a961-78b842f153eb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.018518 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/622c0120-b0e7-4bb7-a961-78b842f153eb-config-data" (OuterVolumeSpecName: "config-data") pod "622c0120-b0e7-4bb7-a961-78b842f153eb" (UID: "622c0120-b0e7-4bb7-a961-78b842f153eb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.048846 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/622c0120-b0e7-4bb7-a961-78b842f153eb-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.048878 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sg4dh\" (UniqueName: \"kubernetes.io/projected/622c0120-b0e7-4bb7-a961-78b842f153eb-kube-api-access-sg4dh\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.048890 4719 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/622c0120-b0e7-4bb7-a961-78b842f153eb-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.048898 4719 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/622c0120-b0e7-4bb7-a961-78b842f153eb-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.048906 4719 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/622c0120-b0e7-4bb7-a961-78b842f153eb-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.048915 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/622c0120-b0e7-4bb7-a961-78b842f153eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.048924 4719 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/622c0120-b0e7-4bb7-a961-78b842f153eb-logs\") on node \"crc\" DevicePath \"\"" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.241891 4719 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-57c55b4b47-8npb9" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.284589 4719 generic.go:334] "Generic (PLEG): container finished" podID="622c0120-b0e7-4bb7-a961-78b842f153eb" containerID="75af50c215da2170b0dc7b0a5e3cb81b13e56d442eb7c7a94571e5ea6a30d748" exitCode=0 Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.284626 4719 generic.go:334] "Generic (PLEG): container finished" podID="622c0120-b0e7-4bb7-a961-78b842f153eb" containerID="d429756686e39ac036782fcedc325f6eff7b7234a0f3c90370cf6a4b0caca2bb" exitCode=143 Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.285244 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.285448 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"622c0120-b0e7-4bb7-a961-78b842f153eb","Type":"ContainerDied","Data":"75af50c215da2170b0dc7b0a5e3cb81b13e56d442eb7c7a94571e5ea6a30d748"} Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.285500 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"622c0120-b0e7-4bb7-a961-78b842f153eb","Type":"ContainerDied","Data":"d429756686e39ac036782fcedc325f6eff7b7234a0f3c90370cf6a4b0caca2bb"} Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.285514 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"622c0120-b0e7-4bb7-a961-78b842f153eb","Type":"ContainerDied","Data":"03eaccc70d052cd9b39d772e85c66475b5935e8de634a0aef2e341fdbe34d7e4"} Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.285531 4719 scope.go:117] "RemoveContainer" containerID="75af50c215da2170b0dc7b0a5e3cb81b13e56d442eb7c7a94571e5ea6a30d748" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.354819 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/neutron-58ddf56cd8-cl6cc"] Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.355193 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-58ddf56cd8-cl6cc" podUID="2f1b276e-3d3d-42c4-a107-53af7102e33e" containerName="neutron-api" containerID="cri-o://8a9a8bd7731aff4e8aa55610e08f7f7d6f7bbfdbd48a34009512be7d9e828436" gracePeriod=30 Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.355751 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-58ddf56cd8-cl6cc" podUID="2f1b276e-3d3d-42c4-a107-53af7102e33e" containerName="neutron-httpd" containerID="cri-o://02b5608bc8584640713c4eedfaf3a3fb38e93cb2d85aed061067b0dff02f0aed" gracePeriod=30 Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.356289 4719 scope.go:117] "RemoveContainer" containerID="d429756686e39ac036782fcedc325f6eff7b7234a0f3c90370cf6a4b0caca2bb" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.379151 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.395167 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.409126 4719 scope.go:117] "RemoveContainer" containerID="75af50c215da2170b0dc7b0a5e3cb81b13e56d442eb7c7a94571e5ea6a30d748" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.411134 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 09 15:36:56 crc kubenswrapper[4719]: E1009 15:36:56.411600 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e78db87f-acd4-471d-82f8-e854df1b36ea" containerName="mariadb-database-create" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.411616 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="e78db87f-acd4-471d-82f8-e854df1b36ea" containerName="mariadb-database-create" Oct 09 15:36:56 crc 
kubenswrapper[4719]: E1009 15:36:56.411630 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="622c0120-b0e7-4bb7-a961-78b842f153eb" containerName="cinder-api-log" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.411638 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="622c0120-b0e7-4bb7-a961-78b842f153eb" containerName="cinder-api-log" Oct 09 15:36:56 crc kubenswrapper[4719]: E1009 15:36:56.411665 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2352adae-2de1-4980-8391-53cf9c4c14a2" containerName="mariadb-database-create" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.411671 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="2352adae-2de1-4980-8391-53cf9c4c14a2" containerName="mariadb-database-create" Oct 09 15:36:56 crc kubenswrapper[4719]: E1009 15:36:56.411682 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3b4bbf2-4c3a-41bf-bc92-8d267af7a236" containerName="mariadb-database-create" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.411688 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3b4bbf2-4c3a-41bf-bc92-8d267af7a236" containerName="mariadb-database-create" Oct 09 15:36:56 crc kubenswrapper[4719]: E1009 15:36:56.411697 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="622c0120-b0e7-4bb7-a961-78b842f153eb" containerName="cinder-api" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.411702 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="622c0120-b0e7-4bb7-a961-78b842f153eb" containerName="cinder-api" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.411907 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="e78db87f-acd4-471d-82f8-e854df1b36ea" containerName="mariadb-database-create" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.411919 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="622c0120-b0e7-4bb7-a961-78b842f153eb" 
containerName="cinder-api-log" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.411925 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3b4bbf2-4c3a-41bf-bc92-8d267af7a236" containerName="mariadb-database-create" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.411947 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="622c0120-b0e7-4bb7-a961-78b842f153eb" containerName="cinder-api" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.411956 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="2352adae-2de1-4980-8391-53cf9c4c14a2" containerName="mariadb-database-create" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.412980 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 09 15:36:56 crc kubenswrapper[4719]: E1009 15:36:56.413669 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75af50c215da2170b0dc7b0a5e3cb81b13e56d442eb7c7a94571e5ea6a30d748\": container with ID starting with 75af50c215da2170b0dc7b0a5e3cb81b13e56d442eb7c7a94571e5ea6a30d748 not found: ID does not exist" containerID="75af50c215da2170b0dc7b0a5e3cb81b13e56d442eb7c7a94571e5ea6a30d748" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.413699 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75af50c215da2170b0dc7b0a5e3cb81b13e56d442eb7c7a94571e5ea6a30d748"} err="failed to get container status \"75af50c215da2170b0dc7b0a5e3cb81b13e56d442eb7c7a94571e5ea6a30d748\": rpc error: code = NotFound desc = could not find container \"75af50c215da2170b0dc7b0a5e3cb81b13e56d442eb7c7a94571e5ea6a30d748\": container with ID starting with 75af50c215da2170b0dc7b0a5e3cb81b13e56d442eb7c7a94571e5ea6a30d748 not found: ID does not exist" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.413721 4719 scope.go:117] "RemoveContainer" 
containerID="d429756686e39ac036782fcedc325f6eff7b7234a0f3c90370cf6a4b0caca2bb" Oct 09 15:36:56 crc kubenswrapper[4719]: E1009 15:36:56.418036 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d429756686e39ac036782fcedc325f6eff7b7234a0f3c90370cf6a4b0caca2bb\": container with ID starting with d429756686e39ac036782fcedc325f6eff7b7234a0f3c90370cf6a4b0caca2bb not found: ID does not exist" containerID="d429756686e39ac036782fcedc325f6eff7b7234a0f3c90370cf6a4b0caca2bb" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.418070 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d429756686e39ac036782fcedc325f6eff7b7234a0f3c90370cf6a4b0caca2bb"} err="failed to get container status \"d429756686e39ac036782fcedc325f6eff7b7234a0f3c90370cf6a4b0caca2bb\": rpc error: code = NotFound desc = could not find container \"d429756686e39ac036782fcedc325f6eff7b7234a0f3c90370cf6a4b0caca2bb\": container with ID starting with d429756686e39ac036782fcedc325f6eff7b7234a0f3c90370cf6a4b0caca2bb not found: ID does not exist" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.418096 4719 scope.go:117] "RemoveContainer" containerID="75af50c215da2170b0dc7b0a5e3cb81b13e56d442eb7c7a94571e5ea6a30d748" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.418466 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.418674 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.419963 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75af50c215da2170b0dc7b0a5e3cb81b13e56d442eb7c7a94571e5ea6a30d748"} err="failed to get container status \"75af50c215da2170b0dc7b0a5e3cb81b13e56d442eb7c7a94571e5ea6a30d748\": rpc error: 
code = NotFound desc = could not find container \"75af50c215da2170b0dc7b0a5e3cb81b13e56d442eb7c7a94571e5ea6a30d748\": container with ID starting with 75af50c215da2170b0dc7b0a5e3cb81b13e56d442eb7c7a94571e5ea6a30d748 not found: ID does not exist" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.419986 4719 scope.go:117] "RemoveContainer" containerID="d429756686e39ac036782fcedc325f6eff7b7234a0f3c90370cf6a4b0caca2bb" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.423961 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d429756686e39ac036782fcedc325f6eff7b7234a0f3c90370cf6a4b0caca2bb"} err="failed to get container status \"d429756686e39ac036782fcedc325f6eff7b7234a0f3c90370cf6a4b0caca2bb\": rpc error: code = NotFound desc = could not find container \"d429756686e39ac036782fcedc325f6eff7b7234a0f3c90370cf6a4b0caca2bb\": container with ID starting with d429756686e39ac036782fcedc325f6eff7b7234a0f3c90370cf6a4b0caca2bb not found: ID does not exist" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.438323 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.441066 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.456776 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c7kx\" (UniqueName: \"kubernetes.io/projected/f5b332c9-c154-4ef0-8921-4e329b4b504a-kube-api-access-7c7kx\") pod \"cinder-api-0\" (UID: \"f5b332c9-c154-4ef0-8921-4e329b4b504a\") " pod="openstack/cinder-api-0" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.456824 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5b332c9-c154-4ef0-8921-4e329b4b504a-logs\") pod 
\"cinder-api-0\" (UID: \"f5b332c9-c154-4ef0-8921-4e329b4b504a\") " pod="openstack/cinder-api-0" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.456867 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f5b332c9-c154-4ef0-8921-4e329b4b504a-config-data-custom\") pod \"cinder-api-0\" (UID: \"f5b332c9-c154-4ef0-8921-4e329b4b504a\") " pod="openstack/cinder-api-0" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.456916 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5b332c9-c154-4ef0-8921-4e329b4b504a-config-data\") pod \"cinder-api-0\" (UID: \"f5b332c9-c154-4ef0-8921-4e329b4b504a\") " pod="openstack/cinder-api-0" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.456972 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5b332c9-c154-4ef0-8921-4e329b4b504a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f5b332c9-c154-4ef0-8921-4e329b4b504a\") " pod="openstack/cinder-api-0" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.456993 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5b332c9-c154-4ef0-8921-4e329b4b504a-scripts\") pod \"cinder-api-0\" (UID: \"f5b332c9-c154-4ef0-8921-4e329b4b504a\") " pod="openstack/cinder-api-0" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.457013 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f5b332c9-c154-4ef0-8921-4e329b4b504a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f5b332c9-c154-4ef0-8921-4e329b4b504a\") " pod="openstack/cinder-api-0" Oct 09 15:36:56 crc kubenswrapper[4719]: 
I1009 15:36:56.457031 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5b332c9-c154-4ef0-8921-4e329b4b504a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f5b332c9-c154-4ef0-8921-4e329b4b504a\") " pod="openstack/cinder-api-0" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.457050 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5b332c9-c154-4ef0-8921-4e329b4b504a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f5b332c9-c154-4ef0-8921-4e329b4b504a\") " pod="openstack/cinder-api-0" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.558734 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5b332c9-c154-4ef0-8921-4e329b4b504a-config-data\") pod \"cinder-api-0\" (UID: \"f5b332c9-c154-4ef0-8921-4e329b4b504a\") " pod="openstack/cinder-api-0" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.559525 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5b332c9-c154-4ef0-8921-4e329b4b504a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f5b332c9-c154-4ef0-8921-4e329b4b504a\") " pod="openstack/cinder-api-0" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.559558 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5b332c9-c154-4ef0-8921-4e329b4b504a-scripts\") pod \"cinder-api-0\" (UID: \"f5b332c9-c154-4ef0-8921-4e329b4b504a\") " pod="openstack/cinder-api-0" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.559596 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/f5b332c9-c154-4ef0-8921-4e329b4b504a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f5b332c9-c154-4ef0-8921-4e329b4b504a\") " pod="openstack/cinder-api-0" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.559618 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5b332c9-c154-4ef0-8921-4e329b4b504a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f5b332c9-c154-4ef0-8921-4e329b4b504a\") " pod="openstack/cinder-api-0" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.559639 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5b332c9-c154-4ef0-8921-4e329b4b504a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f5b332c9-c154-4ef0-8921-4e329b4b504a\") " pod="openstack/cinder-api-0" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.559721 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c7kx\" (UniqueName: \"kubernetes.io/projected/f5b332c9-c154-4ef0-8921-4e329b4b504a-kube-api-access-7c7kx\") pod \"cinder-api-0\" (UID: \"f5b332c9-c154-4ef0-8921-4e329b4b504a\") " pod="openstack/cinder-api-0" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.559744 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5b332c9-c154-4ef0-8921-4e329b4b504a-logs\") pod \"cinder-api-0\" (UID: \"f5b332c9-c154-4ef0-8921-4e329b4b504a\") " pod="openstack/cinder-api-0" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.559786 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f5b332c9-c154-4ef0-8921-4e329b4b504a-config-data-custom\") pod \"cinder-api-0\" (UID: \"f5b332c9-c154-4ef0-8921-4e329b4b504a\") " pod="openstack/cinder-api-0" Oct 09 15:36:56 crc 
kubenswrapper[4719]: I1009 15:36:56.564213 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f5b332c9-c154-4ef0-8921-4e329b4b504a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f5b332c9-c154-4ef0-8921-4e329b4b504a\") " pod="openstack/cinder-api-0" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.564637 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5b332c9-c154-4ef0-8921-4e329b4b504a-logs\") pod \"cinder-api-0\" (UID: \"f5b332c9-c154-4ef0-8921-4e329b4b504a\") " pod="openstack/cinder-api-0" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.567485 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f5b332c9-c154-4ef0-8921-4e329b4b504a-config-data-custom\") pod \"cinder-api-0\" (UID: \"f5b332c9-c154-4ef0-8921-4e329b4b504a\") " pod="openstack/cinder-api-0" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.571162 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5b332c9-c154-4ef0-8921-4e329b4b504a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f5b332c9-c154-4ef0-8921-4e329b4b504a\") " pod="openstack/cinder-api-0" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.572101 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5b332c9-c154-4ef0-8921-4e329b4b504a-config-data\") pod \"cinder-api-0\" (UID: \"f5b332c9-c154-4ef0-8921-4e329b4b504a\") " pod="openstack/cinder-api-0" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.572131 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5b332c9-c154-4ef0-8921-4e329b4b504a-internal-tls-certs\") pod \"cinder-api-0\" (UID: 
\"f5b332c9-c154-4ef0-8921-4e329b4b504a\") " pod="openstack/cinder-api-0" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.572661 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5b332c9-c154-4ef0-8921-4e329b4b504a-scripts\") pod \"cinder-api-0\" (UID: \"f5b332c9-c154-4ef0-8921-4e329b4b504a\") " pod="openstack/cinder-api-0" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.574142 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5b332c9-c154-4ef0-8921-4e329b4b504a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f5b332c9-c154-4ef0-8921-4e329b4b504a\") " pod="openstack/cinder-api-0" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.595338 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c7kx\" (UniqueName: \"kubernetes.io/projected/f5b332c9-c154-4ef0-8921-4e329b4b504a-kube-api-access-7c7kx\") pod \"cinder-api-0\" (UID: \"f5b332c9-c154-4ef0-8921-4e329b4b504a\") " pod="openstack/cinder-api-0" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.751800 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.792376 4719 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.792465 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.793317 4719 scope.go:117] "RemoveContainer" containerID="1b0d8ce3fa12f7379def00ca2131e79585f6bdff6f8c5cc63816c53109df6822" Oct 09 15:36:56 crc kubenswrapper[4719]: E1009 15:36:56.793646 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(75999b62-ce1b-4a9b-8507-c8af12441083)\"" pod="openstack/watcher-decision-engine-0" podUID="75999b62-ce1b-4a9b-8507-c8af12441083" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.818711 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-8326-account-create-lvblm"] Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.821748 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-8326-account-create-lvblm" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.824522 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.869801 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9c2x\" (UniqueName: \"kubernetes.io/projected/52a46418-61ee-44a3-b1e3-128f043ad33d-kube-api-access-n9c2x\") pod \"nova-api-8326-account-create-lvblm\" (UID: \"52a46418-61ee-44a3-b1e3-128f043ad33d\") " pod="openstack/nova-api-8326-account-create-lvblm" Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.873129 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-8326-account-create-lvblm"] Oct 09 15:36:56 crc kubenswrapper[4719]: I1009 15:36:56.971736 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9c2x\" (UniqueName: \"kubernetes.io/projected/52a46418-61ee-44a3-b1e3-128f043ad33d-kube-api-access-n9c2x\") pod \"nova-api-8326-account-create-lvblm\" (UID: \"52a46418-61ee-44a3-b1e3-128f043ad33d\") " pod="openstack/nova-api-8326-account-create-lvblm" Oct 09 15:36:57 crc kubenswrapper[4719]: I1009 15:36:57.002108 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9c2x\" (UniqueName: \"kubernetes.io/projected/52a46418-61ee-44a3-b1e3-128f043ad33d-kube-api-access-n9c2x\") pod \"nova-api-8326-account-create-lvblm\" (UID: \"52a46418-61ee-44a3-b1e3-128f043ad33d\") " pod="openstack/nova-api-8326-account-create-lvblm" Oct 09 15:36:57 crc kubenswrapper[4719]: I1009 15:36:57.027133 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-8488-account-create-g9bpn"] Oct 09 15:36:57 crc kubenswrapper[4719]: I1009 15:36:57.028438 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-8488-account-create-g9bpn" Oct 09 15:36:57 crc kubenswrapper[4719]: I1009 15:36:57.030094 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 09 15:36:57 crc kubenswrapper[4719]: I1009 15:36:57.037171 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8488-account-create-g9bpn"] Oct 09 15:36:57 crc kubenswrapper[4719]: I1009 15:36:57.074134 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggg6d\" (UniqueName: \"kubernetes.io/projected/492c63ff-0d48-43ff-964c-dcfa64728450-kube-api-access-ggg6d\") pod \"nova-cell0-8488-account-create-g9bpn\" (UID: \"492c63ff-0d48-43ff-964c-dcfa64728450\") " pod="openstack/nova-cell0-8488-account-create-g9bpn" Oct 09 15:36:57 crc kubenswrapper[4719]: I1009 15:36:57.177313 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggg6d\" (UniqueName: \"kubernetes.io/projected/492c63ff-0d48-43ff-964c-dcfa64728450-kube-api-access-ggg6d\") pod \"nova-cell0-8488-account-create-g9bpn\" (UID: \"492c63ff-0d48-43ff-964c-dcfa64728450\") " pod="openstack/nova-cell0-8488-account-create-g9bpn" Oct 09 15:36:57 crc kubenswrapper[4719]: I1009 15:36:57.178465 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="622c0120-b0e7-4bb7-a961-78b842f153eb" path="/var/lib/kubelet/pods/622c0120-b0e7-4bb7-a961-78b842f153eb/volumes" Oct 09 15:36:57 crc kubenswrapper[4719]: I1009 15:36:57.195723 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-8326-account-create-lvblm" Oct 09 15:36:57 crc kubenswrapper[4719]: I1009 15:36:57.204633 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggg6d\" (UniqueName: \"kubernetes.io/projected/492c63ff-0d48-43ff-964c-dcfa64728450-kube-api-access-ggg6d\") pod \"nova-cell0-8488-account-create-g9bpn\" (UID: \"492c63ff-0d48-43ff-964c-dcfa64728450\") " pod="openstack/nova-cell0-8488-account-create-g9bpn" Oct 09 15:36:57 crc kubenswrapper[4719]: I1009 15:36:57.238840 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-b7f5-account-create-gknm7"] Oct 09 15:36:57 crc kubenswrapper[4719]: I1009 15:36:57.240221 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b7f5-account-create-gknm7" Oct 09 15:36:57 crc kubenswrapper[4719]: I1009 15:36:57.243387 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 09 15:36:57 crc kubenswrapper[4719]: I1009 15:36:57.261039 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-b7f5-account-create-gknm7"] Oct 09 15:36:57 crc kubenswrapper[4719]: I1009 15:36:57.280097 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzrm8\" (UniqueName: \"kubernetes.io/projected/f57fc9e0-e7f0-43da-aa68-e507aef3750a-kube-api-access-vzrm8\") pod \"nova-cell1-b7f5-account-create-gknm7\" (UID: \"f57fc9e0-e7f0-43da-aa68-e507aef3750a\") " pod="openstack/nova-cell1-b7f5-account-create-gknm7" Oct 09 15:36:57 crc kubenswrapper[4719]: I1009 15:36:57.300649 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 09 15:36:57 crc kubenswrapper[4719]: I1009 15:36:57.356844 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"891bb60f-cf37-4c12-8907-4ff654886e06","Type":"ContainerStarted","Data":"e57d1a52928c238a2adebb6b4548f25b8224b9fd8a83e5909154fd3923b4b22d"} Oct 09 15:36:57 crc kubenswrapper[4719]: I1009 15:36:57.359128 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 09 15:36:57 crc kubenswrapper[4719]: I1009 15:36:57.362491 4719 generic.go:334] "Generic (PLEG): container finished" podID="2f1b276e-3d3d-42c4-a107-53af7102e33e" containerID="02b5608bc8584640713c4eedfaf3a3fb38e93cb2d85aed061067b0dff02f0aed" exitCode=0 Oct 09 15:36:57 crc kubenswrapper[4719]: I1009 15:36:57.362548 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-58ddf56cd8-cl6cc" event={"ID":"2f1b276e-3d3d-42c4-a107-53af7102e33e","Type":"ContainerDied","Data":"02b5608bc8584640713c4eedfaf3a3fb38e93cb2d85aed061067b0dff02f0aed"} Oct 09 15:36:57 crc kubenswrapper[4719]: I1009 15:36:57.375004 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8488-account-create-g9bpn" Oct 09 15:36:57 crc kubenswrapper[4719]: I1009 15:36:57.381461 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzrm8\" (UniqueName: \"kubernetes.io/projected/f57fc9e0-e7f0-43da-aa68-e507aef3750a-kube-api-access-vzrm8\") pod \"nova-cell1-b7f5-account-create-gknm7\" (UID: \"f57fc9e0-e7f0-43da-aa68-e507aef3750a\") " pod="openstack/nova-cell1-b7f5-account-create-gknm7" Oct 09 15:36:57 crc kubenswrapper[4719]: I1009 15:36:57.391810 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.395990073 podStartE2EDuration="5.391720881s" podCreationTimestamp="2025-10-09 15:36:52 +0000 UTC" firstStartedPulling="2025-10-09 15:36:53.185760714 +0000 UTC m=+1118.695471999" lastFinishedPulling="2025-10-09 15:36:56.181491522 +0000 UTC m=+1121.691202807" observedRunningTime="2025-10-09 15:36:57.375065538 +0000 UTC 
m=+1122.884776843" watchObservedRunningTime="2025-10-09 15:36:57.391720881 +0000 UTC m=+1122.901432166" Oct 09 15:36:57 crc kubenswrapper[4719]: I1009 15:36:57.400415 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzrm8\" (UniqueName: \"kubernetes.io/projected/f57fc9e0-e7f0-43da-aa68-e507aef3750a-kube-api-access-vzrm8\") pod \"nova-cell1-b7f5-account-create-gknm7\" (UID: \"f57fc9e0-e7f0-43da-aa68-e507aef3750a\") " pod="openstack/nova-cell1-b7f5-account-create-gknm7" Oct 09 15:36:57 crc kubenswrapper[4719]: I1009 15:36:57.573668 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b7f5-account-create-gknm7" Oct 09 15:36:57 crc kubenswrapper[4719]: I1009 15:36:57.751471 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-8326-account-create-lvblm"] Oct 09 15:36:57 crc kubenswrapper[4719]: I1009 15:36:57.891456 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8488-account-create-g9bpn"] Oct 09 15:36:57 crc kubenswrapper[4719]: W1009 15:36:57.894105 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod492c63ff_0d48_43ff_964c_dcfa64728450.slice/crio-dda93e173f6a270f1d2521495e689c9f5a8ed1e676464eb68189fa678761eabf WatchSource:0}: Error finding container dda93e173f6a270f1d2521495e689c9f5a8ed1e676464eb68189fa678761eabf: Status 404 returned error can't find the container with id dda93e173f6a270f1d2521495e689c9f5a8ed1e676464eb68189fa678761eabf Oct 09 15:36:58 crc kubenswrapper[4719]: I1009 15:36:58.114729 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-b7f5-account-create-gknm7"] Oct 09 15:36:58 crc kubenswrapper[4719]: I1009 15:36:58.389573 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8488-account-create-g9bpn" 
event={"ID":"492c63ff-0d48-43ff-964c-dcfa64728450","Type":"ContainerStarted","Data":"59c34ae01bc6bb0027e3d13a01e3f5fe42d9237382758c7f1e8cfb2ef7371a2e"} Oct 09 15:36:58 crc kubenswrapper[4719]: I1009 15:36:58.389626 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8488-account-create-g9bpn" event={"ID":"492c63ff-0d48-43ff-964c-dcfa64728450","Type":"ContainerStarted","Data":"dda93e173f6a270f1d2521495e689c9f5a8ed1e676464eb68189fa678761eabf"} Oct 09 15:36:58 crc kubenswrapper[4719]: I1009 15:36:58.392812 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8326-account-create-lvblm" event={"ID":"52a46418-61ee-44a3-b1e3-128f043ad33d","Type":"ContainerStarted","Data":"77244bc6605fd24aa78812800ddb48aa5ee1251fc43de8a1b1bca2ba99147999"} Oct 09 15:36:58 crc kubenswrapper[4719]: I1009 15:36:58.392856 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8326-account-create-lvblm" event={"ID":"52a46418-61ee-44a3-b1e3-128f043ad33d","Type":"ContainerStarted","Data":"c084ba75fd1d0650fec5372f6dd51c2c42aa842e67a9bf2ac580d0abcb396b65"} Oct 09 15:36:58 crc kubenswrapper[4719]: I1009 15:36:58.411187 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-8488-account-create-g9bpn" podStartSLOduration=2.411170049 podStartE2EDuration="2.411170049s" podCreationTimestamp="2025-10-09 15:36:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:36:58.40307167 +0000 UTC m=+1123.912782975" watchObservedRunningTime="2025-10-09 15:36:58.411170049 +0000 UTC m=+1123.920881334" Oct 09 15:36:58 crc kubenswrapper[4719]: I1009 15:36:58.422478 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-8326-account-create-lvblm" podStartSLOduration=2.422463101 podStartE2EDuration="2.422463101s" podCreationTimestamp="2025-10-09 15:36:56 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:36:58.42087661 +0000 UTC m=+1123.930587905" watchObservedRunningTime="2025-10-09 15:36:58.422463101 +0000 UTC m=+1123.932174386" Oct 09 15:36:58 crc kubenswrapper[4719]: I1009 15:36:58.424714 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f5b332c9-c154-4ef0-8921-4e329b4b504a","Type":"ContainerStarted","Data":"a9aa4293ae6fe24fffbf70e1557364c00574e2b06d84c9873bfd99ce876a6fa8"} Oct 09 15:36:58 crc kubenswrapper[4719]: I1009 15:36:58.425043 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f5b332c9-c154-4ef0-8921-4e329b4b504a","Type":"ContainerStarted","Data":"cc61841b49042422c038248b415a1072c05d79aada511e445f57bc61c6b1d6e7"} Oct 09 15:36:58 crc kubenswrapper[4719]: I1009 15:36:58.427477 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b7f5-account-create-gknm7" event={"ID":"f57fc9e0-e7f0-43da-aa68-e507aef3750a","Type":"ContainerStarted","Data":"5b70d2a696b5c82ed8c48ffcb88f156a29bccca74d339ddfed9953c5d869de8a"} Oct 09 15:36:58 crc kubenswrapper[4719]: I1009 15:36:58.450086 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-b7f5-account-create-gknm7" podStartSLOduration=1.449958932 podStartE2EDuration="1.449958932s" podCreationTimestamp="2025-10-09 15:36:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:36:58.441235642 +0000 UTC m=+1123.950946937" watchObservedRunningTime="2025-10-09 15:36:58.449958932 +0000 UTC m=+1123.959670217" Oct 09 15:36:58 crc kubenswrapper[4719]: I1009 15:36:58.453573 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 09 15:36:58 crc kubenswrapper[4719]: I1009 
15:36:58.591313 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-556bc79449-9bdkb" Oct 09 15:36:58 crc kubenswrapper[4719]: I1009 15:36:58.612201 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-556bc79449-9bdkb" Oct 09 15:36:58 crc kubenswrapper[4719]: I1009 15:36:58.690057 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 09 15:36:59 crc kubenswrapper[4719]: I1009 15:36:59.439561 4719 generic.go:334] "Generic (PLEG): container finished" podID="52a46418-61ee-44a3-b1e3-128f043ad33d" containerID="77244bc6605fd24aa78812800ddb48aa5ee1251fc43de8a1b1bca2ba99147999" exitCode=0 Oct 09 15:36:59 crc kubenswrapper[4719]: I1009 15:36:59.439626 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8326-account-create-lvblm" event={"ID":"52a46418-61ee-44a3-b1e3-128f043ad33d","Type":"ContainerDied","Data":"77244bc6605fd24aa78812800ddb48aa5ee1251fc43de8a1b1bca2ba99147999"} Oct 09 15:36:59 crc kubenswrapper[4719]: I1009 15:36:59.442256 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f5b332c9-c154-4ef0-8921-4e329b4b504a","Type":"ContainerStarted","Data":"ad1a9665225335b1399e22ccf0db9c3947d90159615a0d132bfc4d9d296264f2"} Oct 09 15:36:59 crc kubenswrapper[4719]: I1009 15:36:59.442401 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 09 15:36:59 crc kubenswrapper[4719]: I1009 15:36:59.444717 4719 generic.go:334] "Generic (PLEG): container finished" podID="f57fc9e0-e7f0-43da-aa68-e507aef3750a" containerID="2d4b60d2896ce3b31512fd54dee1348bfbea3cfa46ee0906f6ed3c117e22642b" exitCode=0 Oct 09 15:36:59 crc kubenswrapper[4719]: I1009 15:36:59.444748 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b7f5-account-create-gknm7" 
event={"ID":"f57fc9e0-e7f0-43da-aa68-e507aef3750a","Type":"ContainerDied","Data":"2d4b60d2896ce3b31512fd54dee1348bfbea3cfa46ee0906f6ed3c117e22642b"} Oct 09 15:36:59 crc kubenswrapper[4719]: I1009 15:36:59.447036 4719 generic.go:334] "Generic (PLEG): container finished" podID="492c63ff-0d48-43ff-964c-dcfa64728450" containerID="59c34ae01bc6bb0027e3d13a01e3f5fe42d9237382758c7f1e8cfb2ef7371a2e" exitCode=0 Oct 09 15:36:59 crc kubenswrapper[4719]: I1009 15:36:59.447094 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8488-account-create-g9bpn" event={"ID":"492c63ff-0d48-43ff-964c-dcfa64728450","Type":"ContainerDied","Data":"59c34ae01bc6bb0027e3d13a01e3f5fe42d9237382758c7f1e8cfb2ef7371a2e"} Oct 09 15:36:59 crc kubenswrapper[4719]: I1009 15:36:59.479805 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.479780762 podStartE2EDuration="3.479780762s" podCreationTimestamp="2025-10-09 15:36:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:36:59.472653043 +0000 UTC m=+1124.982364328" watchObservedRunningTime="2025-10-09 15:36:59.479780762 +0000 UTC m=+1124.989492057" Oct 09 15:36:59 crc kubenswrapper[4719]: I1009 15:36:59.526694 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 15:37:00 crc kubenswrapper[4719]: I1009 15:37:00.456866 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="a4255094-4bc0-4cc5-bf14-c663dd9e17e7" containerName="cinder-scheduler" containerID="cri-o://f3a970ec68d524e6953aecb25820acfbab374f1d8635c01863349942ce51a813" gracePeriod=30 Oct 09 15:37:00 crc kubenswrapper[4719]: I1009 15:37:00.456907 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" 
podUID="a4255094-4bc0-4cc5-bf14-c663dd9e17e7" containerName="probe" containerID="cri-o://94e8511f687e02172c7beb1f4b16f33db67db61bf7681ee68958a7379e05e09f" gracePeriod=30 Oct 09 15:37:01 crc kubenswrapper[4719]: I1009 15:37:01.071832 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b7f5-account-create-gknm7" Oct 09 15:37:01 crc kubenswrapper[4719]: I1009 15:37:01.175249 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzrm8\" (UniqueName: \"kubernetes.io/projected/f57fc9e0-e7f0-43da-aa68-e507aef3750a-kube-api-access-vzrm8\") pod \"f57fc9e0-e7f0-43da-aa68-e507aef3750a\" (UID: \"f57fc9e0-e7f0-43da-aa68-e507aef3750a\") " Oct 09 15:37:01 crc kubenswrapper[4719]: I1009 15:37:01.193071 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f57fc9e0-e7f0-43da-aa68-e507aef3750a-kube-api-access-vzrm8" (OuterVolumeSpecName: "kube-api-access-vzrm8") pod "f57fc9e0-e7f0-43da-aa68-e507aef3750a" (UID: "f57fc9e0-e7f0-43da-aa68-e507aef3750a"). InnerVolumeSpecName "kube-api-access-vzrm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:37:01 crc kubenswrapper[4719]: I1009 15:37:01.277669 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzrm8\" (UniqueName: \"kubernetes.io/projected/f57fc9e0-e7f0-43da-aa68-e507aef3750a-kube-api-access-vzrm8\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:01 crc kubenswrapper[4719]: I1009 15:37:01.301270 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8488-account-create-g9bpn" Oct 09 15:37:01 crc kubenswrapper[4719]: I1009 15:37:01.307632 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-8326-account-create-lvblm" Oct 09 15:37:01 crc kubenswrapper[4719]: I1009 15:37:01.379795 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggg6d\" (UniqueName: \"kubernetes.io/projected/492c63ff-0d48-43ff-964c-dcfa64728450-kube-api-access-ggg6d\") pod \"492c63ff-0d48-43ff-964c-dcfa64728450\" (UID: \"492c63ff-0d48-43ff-964c-dcfa64728450\") " Oct 09 15:37:01 crc kubenswrapper[4719]: I1009 15:37:01.379900 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9c2x\" (UniqueName: \"kubernetes.io/projected/52a46418-61ee-44a3-b1e3-128f043ad33d-kube-api-access-n9c2x\") pod \"52a46418-61ee-44a3-b1e3-128f043ad33d\" (UID: \"52a46418-61ee-44a3-b1e3-128f043ad33d\") " Oct 09 15:37:01 crc kubenswrapper[4719]: I1009 15:37:01.394129 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52a46418-61ee-44a3-b1e3-128f043ad33d-kube-api-access-n9c2x" (OuterVolumeSpecName: "kube-api-access-n9c2x") pod "52a46418-61ee-44a3-b1e3-128f043ad33d" (UID: "52a46418-61ee-44a3-b1e3-128f043ad33d"). InnerVolumeSpecName "kube-api-access-n9c2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:37:01 crc kubenswrapper[4719]: I1009 15:37:01.395823 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/492c63ff-0d48-43ff-964c-dcfa64728450-kube-api-access-ggg6d" (OuterVolumeSpecName: "kube-api-access-ggg6d") pod "492c63ff-0d48-43ff-964c-dcfa64728450" (UID: "492c63ff-0d48-43ff-964c-dcfa64728450"). InnerVolumeSpecName "kube-api-access-ggg6d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:37:01 crc kubenswrapper[4719]: I1009 15:37:01.485318 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b7f5-account-create-gknm7" event={"ID":"f57fc9e0-e7f0-43da-aa68-e507aef3750a","Type":"ContainerDied","Data":"5b70d2a696b5c82ed8c48ffcb88f156a29bccca74d339ddfed9953c5d869de8a"} Oct 09 15:37:01 crc kubenswrapper[4719]: I1009 15:37:01.485364 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b70d2a696b5c82ed8c48ffcb88f156a29bccca74d339ddfed9953c5d869de8a" Oct 09 15:37:01 crc kubenswrapper[4719]: I1009 15:37:01.485410 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b7f5-account-create-gknm7" Oct 09 15:37:01 crc kubenswrapper[4719]: I1009 15:37:01.485939 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9c2x\" (UniqueName: \"kubernetes.io/projected/52a46418-61ee-44a3-b1e3-128f043ad33d-kube-api-access-n9c2x\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:01 crc kubenswrapper[4719]: I1009 15:37:01.485953 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggg6d\" (UniqueName: \"kubernetes.io/projected/492c63ff-0d48-43ff-964c-dcfa64728450-kube-api-access-ggg6d\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:01 crc kubenswrapper[4719]: I1009 15:37:01.501814 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8488-account-create-g9bpn" event={"ID":"492c63ff-0d48-43ff-964c-dcfa64728450","Type":"ContainerDied","Data":"dda93e173f6a270f1d2521495e689c9f5a8ed1e676464eb68189fa678761eabf"} Oct 09 15:37:01 crc kubenswrapper[4719]: I1009 15:37:01.501851 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dda93e173f6a270f1d2521495e689c9f5a8ed1e676464eb68189fa678761eabf" Oct 09 15:37:01 crc kubenswrapper[4719]: I1009 15:37:01.501920 4719 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/nova-cell0-8488-account-create-g9bpn" Oct 09 15:37:01 crc kubenswrapper[4719]: I1009 15:37:01.504277 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8326-account-create-lvblm" event={"ID":"52a46418-61ee-44a3-b1e3-128f043ad33d","Type":"ContainerDied","Data":"c084ba75fd1d0650fec5372f6dd51c2c42aa842e67a9bf2ac580d0abcb396b65"} Oct 09 15:37:01 crc kubenswrapper[4719]: I1009 15:37:01.504317 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c084ba75fd1d0650fec5372f6dd51c2c42aa842e67a9bf2ac580d0abcb396b65" Oct 09 15:37:01 crc kubenswrapper[4719]: I1009 15:37:01.504394 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8326-account-create-lvblm" Oct 09 15:37:01 crc kubenswrapper[4719]: I1009 15:37:01.509234 4719 generic.go:334] "Generic (PLEG): container finished" podID="2f1b276e-3d3d-42c4-a107-53af7102e33e" containerID="8a9a8bd7731aff4e8aa55610e08f7f7d6f7bbfdbd48a34009512be7d9e828436" exitCode=0 Oct 09 15:37:01 crc kubenswrapper[4719]: I1009 15:37:01.509270 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-58ddf56cd8-cl6cc" event={"ID":"2f1b276e-3d3d-42c4-a107-53af7102e33e","Type":"ContainerDied","Data":"8a9a8bd7731aff4e8aa55610e08f7f7d6f7bbfdbd48a34009512be7d9e828436"} Oct 09 15:37:01 crc kubenswrapper[4719]: I1009 15:37:01.516447 4719 generic.go:334] "Generic (PLEG): container finished" podID="a4255094-4bc0-4cc5-bf14-c663dd9e17e7" containerID="94e8511f687e02172c7beb1f4b16f33db67db61bf7681ee68958a7379e05e09f" exitCode=0 Oct 09 15:37:01 crc kubenswrapper[4719]: I1009 15:37:01.516501 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a4255094-4bc0-4cc5-bf14-c663dd9e17e7","Type":"ContainerDied","Data":"94e8511f687e02172c7beb1f4b16f33db67db61bf7681ee68958a7379e05e09f"} Oct 09 15:37:01 crc kubenswrapper[4719]: E1009 
15:37:01.650269 4719 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4255094_4bc0_4cc5_bf14_c663dd9e17e7.slice/crio-conmon-f3a970ec68d524e6953aecb25820acfbab374f1d8635c01863349942ce51a813.scope\": RecentStats: unable to find data in memory cache]" Oct 09 15:37:01 crc kubenswrapper[4719]: I1009 15:37:01.929872 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-58ddf56cd8-cl6cc" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.000591 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2f1b276e-3d3d-42c4-a107-53af7102e33e-config\") pod \"2f1b276e-3d3d-42c4-a107-53af7102e33e\" (UID: \"2f1b276e-3d3d-42c4-a107-53af7102e33e\") " Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.000971 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f1b276e-3d3d-42c4-a107-53af7102e33e-ovndb-tls-certs\") pod \"2f1b276e-3d3d-42c4-a107-53af7102e33e\" (UID: \"2f1b276e-3d3d-42c4-a107-53af7102e33e\") " Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.001021 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2f1b276e-3d3d-42c4-a107-53af7102e33e-httpd-config\") pod \"2f1b276e-3d3d-42c4-a107-53af7102e33e\" (UID: \"2f1b276e-3d3d-42c4-a107-53af7102e33e\") " Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.001047 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdkrb\" (UniqueName: \"kubernetes.io/projected/2f1b276e-3d3d-42c4-a107-53af7102e33e-kube-api-access-mdkrb\") pod \"2f1b276e-3d3d-42c4-a107-53af7102e33e\" (UID: \"2f1b276e-3d3d-42c4-a107-53af7102e33e\") " Oct 09 15:37:02 crc 
kubenswrapper[4719]: I1009 15:37:02.001148 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f1b276e-3d3d-42c4-a107-53af7102e33e-combined-ca-bundle\") pod \"2f1b276e-3d3d-42c4-a107-53af7102e33e\" (UID: \"2f1b276e-3d3d-42c4-a107-53af7102e33e\") " Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.008851 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f1b276e-3d3d-42c4-a107-53af7102e33e-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "2f1b276e-3d3d-42c4-a107-53af7102e33e" (UID: "2f1b276e-3d3d-42c4-a107-53af7102e33e"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.015134 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f1b276e-3d3d-42c4-a107-53af7102e33e-kube-api-access-mdkrb" (OuterVolumeSpecName: "kube-api-access-mdkrb") pod "2f1b276e-3d3d-42c4-a107-53af7102e33e" (UID: "2f1b276e-3d3d-42c4-a107-53af7102e33e"). InnerVolumeSpecName "kube-api-access-mdkrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.070548 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f1b276e-3d3d-42c4-a107-53af7102e33e-config" (OuterVolumeSpecName: "config") pod "2f1b276e-3d3d-42c4-a107-53af7102e33e" (UID: "2f1b276e-3d3d-42c4-a107-53af7102e33e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.081524 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f1b276e-3d3d-42c4-a107-53af7102e33e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f1b276e-3d3d-42c4-a107-53af7102e33e" (UID: "2f1b276e-3d3d-42c4-a107-53af7102e33e"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.094731 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f1b276e-3d3d-42c4-a107-53af7102e33e-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "2f1b276e-3d3d-42c4-a107-53af7102e33e" (UID: "2f1b276e-3d3d-42c4-a107-53af7102e33e"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.104454 4719 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f1b276e-3d3d-42c4-a107-53af7102e33e-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.104492 4719 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2f1b276e-3d3d-42c4-a107-53af7102e33e-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.104506 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdkrb\" (UniqueName: \"kubernetes.io/projected/2f1b276e-3d3d-42c4-a107-53af7102e33e-kube-api-access-mdkrb\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.104520 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f1b276e-3d3d-42c4-a107-53af7102e33e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.104532 4719 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2f1b276e-3d3d-42c4-a107-53af7102e33e-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.187097 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.307305 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4255094-4bc0-4cc5-bf14-c663dd9e17e7-config-data-custom\") pod \"a4255094-4bc0-4cc5-bf14-c663dd9e17e7\" (UID: \"a4255094-4bc0-4cc5-bf14-c663dd9e17e7\") " Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.307464 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a4255094-4bc0-4cc5-bf14-c663dd9e17e7-etc-machine-id\") pod \"a4255094-4bc0-4cc5-bf14-c663dd9e17e7\" (UID: \"a4255094-4bc0-4cc5-bf14-c663dd9e17e7\") " Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.307494 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4255094-4bc0-4cc5-bf14-c663dd9e17e7-combined-ca-bundle\") pod \"a4255094-4bc0-4cc5-bf14-c663dd9e17e7\" (UID: \"a4255094-4bc0-4cc5-bf14-c663dd9e17e7\") " Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.307590 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4255094-4bc0-4cc5-bf14-c663dd9e17e7-config-data\") pod \"a4255094-4bc0-4cc5-bf14-c663dd9e17e7\" (UID: \"a4255094-4bc0-4cc5-bf14-c663dd9e17e7\") " Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.307618 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4255094-4bc0-4cc5-bf14-c663dd9e17e7-scripts\") pod \"a4255094-4bc0-4cc5-bf14-c663dd9e17e7\" (UID: \"a4255094-4bc0-4cc5-bf14-c663dd9e17e7\") " Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.307620 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/a4255094-4bc0-4cc5-bf14-c663dd9e17e7-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a4255094-4bc0-4cc5-bf14-c663dd9e17e7" (UID: "a4255094-4bc0-4cc5-bf14-c663dd9e17e7"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.307739 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdcjb\" (UniqueName: \"kubernetes.io/projected/a4255094-4bc0-4cc5-bf14-c663dd9e17e7-kube-api-access-wdcjb\") pod \"a4255094-4bc0-4cc5-bf14-c663dd9e17e7\" (UID: \"a4255094-4bc0-4cc5-bf14-c663dd9e17e7\") " Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.308231 4719 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a4255094-4bc0-4cc5-bf14-c663dd9e17e7-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.319545 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4255094-4bc0-4cc5-bf14-c663dd9e17e7-kube-api-access-wdcjb" (OuterVolumeSpecName: "kube-api-access-wdcjb") pod "a4255094-4bc0-4cc5-bf14-c663dd9e17e7" (UID: "a4255094-4bc0-4cc5-bf14-c663dd9e17e7"). InnerVolumeSpecName "kube-api-access-wdcjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.322601 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4255094-4bc0-4cc5-bf14-c663dd9e17e7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a4255094-4bc0-4cc5-bf14-c663dd9e17e7" (UID: "a4255094-4bc0-4cc5-bf14-c663dd9e17e7"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.327585 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4255094-4bc0-4cc5-bf14-c663dd9e17e7-scripts" (OuterVolumeSpecName: "scripts") pod "a4255094-4bc0-4cc5-bf14-c663dd9e17e7" (UID: "a4255094-4bc0-4cc5-bf14-c663dd9e17e7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.412738 4719 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4255094-4bc0-4cc5-bf14-c663dd9e17e7-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.412771 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdcjb\" (UniqueName: \"kubernetes.io/projected/a4255094-4bc0-4cc5-bf14-c663dd9e17e7-kube-api-access-wdcjb\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.412782 4719 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4255094-4bc0-4cc5-bf14-c663dd9e17e7-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.418510 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4255094-4bc0-4cc5-bf14-c663dd9e17e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4255094-4bc0-4cc5-bf14-c663dd9e17e7" (UID: "a4255094-4bc0-4cc5-bf14-c663dd9e17e7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.449554 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-r57qh"] Oct 09 15:37:02 crc kubenswrapper[4719]: E1009 15:37:02.449947 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4255094-4bc0-4cc5-bf14-c663dd9e17e7" containerName="cinder-scheduler" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.449964 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4255094-4bc0-4cc5-bf14-c663dd9e17e7" containerName="cinder-scheduler" Oct 09 15:37:02 crc kubenswrapper[4719]: E1009 15:37:02.449979 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4255094-4bc0-4cc5-bf14-c663dd9e17e7" containerName="probe" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.449986 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4255094-4bc0-4cc5-bf14-c663dd9e17e7" containerName="probe" Oct 09 15:37:02 crc kubenswrapper[4719]: E1009 15:37:02.449998 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f1b276e-3d3d-42c4-a107-53af7102e33e" containerName="neutron-httpd" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.450004 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f1b276e-3d3d-42c4-a107-53af7102e33e" containerName="neutron-httpd" Oct 09 15:37:02 crc kubenswrapper[4719]: E1009 15:37:02.450016 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="492c63ff-0d48-43ff-964c-dcfa64728450" containerName="mariadb-account-create" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.450021 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="492c63ff-0d48-43ff-964c-dcfa64728450" containerName="mariadb-account-create" Oct 09 15:37:02 crc kubenswrapper[4719]: E1009 15:37:02.450042 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f1b276e-3d3d-42c4-a107-53af7102e33e" 
containerName="neutron-api" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.450048 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f1b276e-3d3d-42c4-a107-53af7102e33e" containerName="neutron-api" Oct 09 15:37:02 crc kubenswrapper[4719]: E1009 15:37:02.450058 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f57fc9e0-e7f0-43da-aa68-e507aef3750a" containerName="mariadb-account-create" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.450063 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="f57fc9e0-e7f0-43da-aa68-e507aef3750a" containerName="mariadb-account-create" Oct 09 15:37:02 crc kubenswrapper[4719]: E1009 15:37:02.450075 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52a46418-61ee-44a3-b1e3-128f043ad33d" containerName="mariadb-account-create" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.450081 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="52a46418-61ee-44a3-b1e3-128f043ad33d" containerName="mariadb-account-create" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.450239 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4255094-4bc0-4cc5-bf14-c663dd9e17e7" containerName="probe" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.450252 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="52a46418-61ee-44a3-b1e3-128f043ad33d" containerName="mariadb-account-create" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.450266 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4255094-4bc0-4cc5-bf14-c663dd9e17e7" containerName="cinder-scheduler" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.450273 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f1b276e-3d3d-42c4-a107-53af7102e33e" containerName="neutron-httpd" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.450286 4719 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f57fc9e0-e7f0-43da-aa68-e507aef3750a" containerName="mariadb-account-create" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.450293 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="492c63ff-0d48-43ff-964c-dcfa64728450" containerName="mariadb-account-create" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.450305 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f1b276e-3d3d-42c4-a107-53af7102e33e" containerName="neutron-api" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.450926 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-r57qh" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.456764 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-nvqwf" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.457005 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.457288 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.496445 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-r57qh"] Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.515914 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de61779c-4ad9-40bd-908e-27b82b5c82cb-scripts\") pod \"nova-cell0-conductor-db-sync-r57qh\" (UID: \"de61779c-4ad9-40bd-908e-27b82b5c82cb\") " pod="openstack/nova-cell0-conductor-db-sync-r57qh" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.515965 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/de61779c-4ad9-40bd-908e-27b82b5c82cb-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-r57qh\" (UID: \"de61779c-4ad9-40bd-908e-27b82b5c82cb\") " pod="openstack/nova-cell0-conductor-db-sync-r57qh" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.515983 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de61779c-4ad9-40bd-908e-27b82b5c82cb-config-data\") pod \"nova-cell0-conductor-db-sync-r57qh\" (UID: \"de61779c-4ad9-40bd-908e-27b82b5c82cb\") " pod="openstack/nova-cell0-conductor-db-sync-r57qh" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.516063 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lclj4\" (UniqueName: \"kubernetes.io/projected/de61779c-4ad9-40bd-908e-27b82b5c82cb-kube-api-access-lclj4\") pod \"nova-cell0-conductor-db-sync-r57qh\" (UID: \"de61779c-4ad9-40bd-908e-27b82b5c82cb\") " pod="openstack/nova-cell0-conductor-db-sync-r57qh" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.516117 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4255094-4bc0-4cc5-bf14-c663dd9e17e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.534579 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4255094-4bc0-4cc5-bf14-c663dd9e17e7-config-data" (OuterVolumeSpecName: "config-data") pod "a4255094-4bc0-4cc5-bf14-c663dd9e17e7" (UID: "a4255094-4bc0-4cc5-bf14-c663dd9e17e7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.536833 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-58ddf56cd8-cl6cc" event={"ID":"2f1b276e-3d3d-42c4-a107-53af7102e33e","Type":"ContainerDied","Data":"ead8180d17db68bc8a17000c3b60a41f357807c20491a92afe7b81d03ddc215f"} Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.536881 4719 scope.go:117] "RemoveContainer" containerID="02b5608bc8584640713c4eedfaf3a3fb38e93cb2d85aed061067b0dff02f0aed" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.537007 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-58ddf56cd8-cl6cc" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.546580 4719 generic.go:334] "Generic (PLEG): container finished" podID="a4255094-4bc0-4cc5-bf14-c663dd9e17e7" containerID="f3a970ec68d524e6953aecb25820acfbab374f1d8635c01863349942ce51a813" exitCode=0 Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.546626 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a4255094-4bc0-4cc5-bf14-c663dd9e17e7","Type":"ContainerDied","Data":"f3a970ec68d524e6953aecb25820acfbab374f1d8635c01863349942ce51a813"} Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.546652 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a4255094-4bc0-4cc5-bf14-c663dd9e17e7","Type":"ContainerDied","Data":"ff48cb35b869bde6dca57bc93d145117af2601cc9b4278ea01817dc74a03c68b"} Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.548561 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.566089 4719 scope.go:117] "RemoveContainer" containerID="8a9a8bd7731aff4e8aa55610e08f7f7d6f7bbfdbd48a34009512be7d9e828436" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.583404 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-58ddf56cd8-cl6cc"] Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.588729 4719 scope.go:117] "RemoveContainer" containerID="94e8511f687e02172c7beb1f4b16f33db67db61bf7681ee68958a7379e05e09f" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.603778 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-58ddf56cd8-cl6cc"] Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.612097 4719 scope.go:117] "RemoveContainer" containerID="f3a970ec68d524e6953aecb25820acfbab374f1d8635c01863349942ce51a813" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.617640 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lclj4\" (UniqueName: \"kubernetes.io/projected/de61779c-4ad9-40bd-908e-27b82b5c82cb-kube-api-access-lclj4\") pod \"nova-cell0-conductor-db-sync-r57qh\" (UID: \"de61779c-4ad9-40bd-908e-27b82b5c82cb\") " pod="openstack/nova-cell0-conductor-db-sync-r57qh" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.617897 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de61779c-4ad9-40bd-908e-27b82b5c82cb-scripts\") pod \"nova-cell0-conductor-db-sync-r57qh\" (UID: \"de61779c-4ad9-40bd-908e-27b82b5c82cb\") " pod="openstack/nova-cell0-conductor-db-sync-r57qh" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.617951 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de61779c-4ad9-40bd-908e-27b82b5c82cb-combined-ca-bundle\") pod 
\"nova-cell0-conductor-db-sync-r57qh\" (UID: \"de61779c-4ad9-40bd-908e-27b82b5c82cb\") " pod="openstack/nova-cell0-conductor-db-sync-r57qh" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.617974 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de61779c-4ad9-40bd-908e-27b82b5c82cb-config-data\") pod \"nova-cell0-conductor-db-sync-r57qh\" (UID: \"de61779c-4ad9-40bd-908e-27b82b5c82cb\") " pod="openstack/nova-cell0-conductor-db-sync-r57qh" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.618098 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4255094-4bc0-4cc5-bf14-c663dd9e17e7-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.621615 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de61779c-4ad9-40bd-908e-27b82b5c82cb-scripts\") pod \"nova-cell0-conductor-db-sync-r57qh\" (UID: \"de61779c-4ad9-40bd-908e-27b82b5c82cb\") " pod="openstack/nova-cell0-conductor-db-sync-r57qh" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.622116 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de61779c-4ad9-40bd-908e-27b82b5c82cb-config-data\") pod \"nova-cell0-conductor-db-sync-r57qh\" (UID: \"de61779c-4ad9-40bd-908e-27b82b5c82cb\") " pod="openstack/nova-cell0-conductor-db-sync-r57qh" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.622733 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.623886 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de61779c-4ad9-40bd-908e-27b82b5c82cb-combined-ca-bundle\") pod 
\"nova-cell0-conductor-db-sync-r57qh\" (UID: \"de61779c-4ad9-40bd-908e-27b82b5c82cb\") " pod="openstack/nova-cell0-conductor-db-sync-r57qh" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.632377 4719 scope.go:117] "RemoveContainer" containerID="94e8511f687e02172c7beb1f4b16f33db67db61bf7681ee68958a7379e05e09f" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.633681 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 15:37:02 crc kubenswrapper[4719]: E1009 15:37:02.634107 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94e8511f687e02172c7beb1f4b16f33db67db61bf7681ee68958a7379e05e09f\": container with ID starting with 94e8511f687e02172c7beb1f4b16f33db67db61bf7681ee68958a7379e05e09f not found: ID does not exist" containerID="94e8511f687e02172c7beb1f4b16f33db67db61bf7681ee68958a7379e05e09f" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.634136 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94e8511f687e02172c7beb1f4b16f33db67db61bf7681ee68958a7379e05e09f"} err="failed to get container status \"94e8511f687e02172c7beb1f4b16f33db67db61bf7681ee68958a7379e05e09f\": rpc error: code = NotFound desc = could not find container \"94e8511f687e02172c7beb1f4b16f33db67db61bf7681ee68958a7379e05e09f\": container with ID starting with 94e8511f687e02172c7beb1f4b16f33db67db61bf7681ee68958a7379e05e09f not found: ID does not exist" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.634161 4719 scope.go:117] "RemoveContainer" containerID="f3a970ec68d524e6953aecb25820acfbab374f1d8635c01863349942ce51a813" Oct 09 15:37:02 crc kubenswrapper[4719]: E1009 15:37:02.634399 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3a970ec68d524e6953aecb25820acfbab374f1d8635c01863349942ce51a813\": container with ID starting 
with f3a970ec68d524e6953aecb25820acfbab374f1d8635c01863349942ce51a813 not found: ID does not exist" containerID="f3a970ec68d524e6953aecb25820acfbab374f1d8635c01863349942ce51a813" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.634420 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3a970ec68d524e6953aecb25820acfbab374f1d8635c01863349942ce51a813"} err="failed to get container status \"f3a970ec68d524e6953aecb25820acfbab374f1d8635c01863349942ce51a813\": rpc error: code = NotFound desc = could not find container \"f3a970ec68d524e6953aecb25820acfbab374f1d8635c01863349942ce51a813\": container with ID starting with f3a970ec68d524e6953aecb25820acfbab374f1d8635c01863349942ce51a813 not found: ID does not exist" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.645234 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.647491 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.651427 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.657266 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lclj4\" (UniqueName: \"kubernetes.io/projected/de61779c-4ad9-40bd-908e-27b82b5c82cb-kube-api-access-lclj4\") pod \"nova-cell0-conductor-db-sync-r57qh\" (UID: \"de61779c-4ad9-40bd-908e-27b82b5c82cb\") " pod="openstack/nova-cell0-conductor-db-sync-r57qh" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.658197 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.719661 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46e46828-5596-4987-8998-c52dbaf93086-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"46e46828-5596-4987-8998-c52dbaf93086\") " pod="openstack/cinder-scheduler-0" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.719764 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46e46828-5596-4987-8998-c52dbaf93086-scripts\") pod \"cinder-scheduler-0\" (UID: \"46e46828-5596-4987-8998-c52dbaf93086\") " pod="openstack/cinder-scheduler-0" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.719792 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/46e46828-5596-4987-8998-c52dbaf93086-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"46e46828-5596-4987-8998-c52dbaf93086\") " pod="openstack/cinder-scheduler-0" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 
15:37:02.719844 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8455r\" (UniqueName: \"kubernetes.io/projected/46e46828-5596-4987-8998-c52dbaf93086-kube-api-access-8455r\") pod \"cinder-scheduler-0\" (UID: \"46e46828-5596-4987-8998-c52dbaf93086\") " pod="openstack/cinder-scheduler-0" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.719876 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46e46828-5596-4987-8998-c52dbaf93086-config-data\") pod \"cinder-scheduler-0\" (UID: \"46e46828-5596-4987-8998-c52dbaf93086\") " pod="openstack/cinder-scheduler-0" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.719896 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46e46828-5596-4987-8998-c52dbaf93086-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"46e46828-5596-4987-8998-c52dbaf93086\") " pod="openstack/cinder-scheduler-0" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.799666 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-r57qh" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.822064 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46e46828-5596-4987-8998-c52dbaf93086-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"46e46828-5596-4987-8998-c52dbaf93086\") " pod="openstack/cinder-scheduler-0" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.822473 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46e46828-5596-4987-8998-c52dbaf93086-scripts\") pod \"cinder-scheduler-0\" (UID: \"46e46828-5596-4987-8998-c52dbaf93086\") " pod="openstack/cinder-scheduler-0" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.822643 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/46e46828-5596-4987-8998-c52dbaf93086-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"46e46828-5596-4987-8998-c52dbaf93086\") " pod="openstack/cinder-scheduler-0" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.822782 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/46e46828-5596-4987-8998-c52dbaf93086-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"46e46828-5596-4987-8998-c52dbaf93086\") " pod="openstack/cinder-scheduler-0" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.822922 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8455r\" (UniqueName: \"kubernetes.io/projected/46e46828-5596-4987-8998-c52dbaf93086-kube-api-access-8455r\") pod \"cinder-scheduler-0\" (UID: \"46e46828-5596-4987-8998-c52dbaf93086\") " pod="openstack/cinder-scheduler-0" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.823028 4719 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46e46828-5596-4987-8998-c52dbaf93086-config-data\") pod \"cinder-scheduler-0\" (UID: \"46e46828-5596-4987-8998-c52dbaf93086\") " pod="openstack/cinder-scheduler-0" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.823131 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46e46828-5596-4987-8998-c52dbaf93086-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"46e46828-5596-4987-8998-c52dbaf93086\") " pod="openstack/cinder-scheduler-0" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.828725 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46e46828-5596-4987-8998-c52dbaf93086-scripts\") pod \"cinder-scheduler-0\" (UID: \"46e46828-5596-4987-8998-c52dbaf93086\") " pod="openstack/cinder-scheduler-0" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.829276 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46e46828-5596-4987-8998-c52dbaf93086-config-data\") pod \"cinder-scheduler-0\" (UID: \"46e46828-5596-4987-8998-c52dbaf93086\") " pod="openstack/cinder-scheduler-0" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.829821 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46e46828-5596-4987-8998-c52dbaf93086-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"46e46828-5596-4987-8998-c52dbaf93086\") " pod="openstack/cinder-scheduler-0" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.839052 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46e46828-5596-4987-8998-c52dbaf93086-combined-ca-bundle\") pod 
\"cinder-scheduler-0\" (UID: \"46e46828-5596-4987-8998-c52dbaf93086\") " pod="openstack/cinder-scheduler-0" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.857162 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8455r\" (UniqueName: \"kubernetes.io/projected/46e46828-5596-4987-8998-c52dbaf93086-kube-api-access-8455r\") pod \"cinder-scheduler-0\" (UID: \"46e46828-5596-4987-8998-c52dbaf93086\") " pod="openstack/cinder-scheduler-0" Oct 09 15:37:02 crc kubenswrapper[4719]: I1009 15:37:02.977195 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 09 15:37:03 crc kubenswrapper[4719]: I1009 15:37:03.189725 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f1b276e-3d3d-42c4-a107-53af7102e33e" path="/var/lib/kubelet/pods/2f1b276e-3d3d-42c4-a107-53af7102e33e/volumes" Oct 09 15:37:03 crc kubenswrapper[4719]: I1009 15:37:03.190496 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4255094-4bc0-4cc5-bf14-c663dd9e17e7" path="/var/lib/kubelet/pods/a4255094-4bc0-4cc5-bf14-c663dd9e17e7/volumes" Oct 09 15:37:03 crc kubenswrapper[4719]: I1009 15:37:03.216182 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 15:37:03 crc kubenswrapper[4719]: I1009 15:37:03.216501 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="891bb60f-cf37-4c12-8907-4ff654886e06" containerName="ceilometer-central-agent" containerID="cri-o://72fd53e4f308ec26610a590d41cd0191d547dc131090ec90c4dcb57428e273d4" gracePeriod=30 Oct 09 15:37:03 crc kubenswrapper[4719]: I1009 15:37:03.216656 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="891bb60f-cf37-4c12-8907-4ff654886e06" containerName="proxy-httpd" containerID="cri-o://e57d1a52928c238a2adebb6b4548f25b8224b9fd8a83e5909154fd3923b4b22d" 
gracePeriod=30 Oct 09 15:37:03 crc kubenswrapper[4719]: I1009 15:37:03.216697 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="891bb60f-cf37-4c12-8907-4ff654886e06" containerName="sg-core" containerID="cri-o://e6f5f5da90afc5862246a3cb280c2cfb165cb727b1b3378acf1935e1e68a96c6" gracePeriod=30 Oct 09 15:37:03 crc kubenswrapper[4719]: I1009 15:37:03.216740 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="891bb60f-cf37-4c12-8907-4ff654886e06" containerName="ceilometer-notification-agent" containerID="cri-o://d5b6a17d0e627a0cb3e428fc34c217b8819f0837c1b622fc4459e5dc5f2680c9" gracePeriod=30 Oct 09 15:37:03 crc kubenswrapper[4719]: I1009 15:37:03.461264 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-r57qh"] Oct 09 15:37:03 crc kubenswrapper[4719]: I1009 15:37:03.563520 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77b756999f-5ptkd" Oct 09 15:37:03 crc kubenswrapper[4719]: I1009 15:37:03.575910 4719 generic.go:334] "Generic (PLEG): container finished" podID="891bb60f-cf37-4c12-8907-4ff654886e06" containerID="e57d1a52928c238a2adebb6b4548f25b8224b9fd8a83e5909154fd3923b4b22d" exitCode=0 Oct 09 15:37:03 crc kubenswrapper[4719]: I1009 15:37:03.575947 4719 generic.go:334] "Generic (PLEG): container finished" podID="891bb60f-cf37-4c12-8907-4ff654886e06" containerID="e6f5f5da90afc5862246a3cb280c2cfb165cb727b1b3378acf1935e1e68a96c6" exitCode=2 Oct 09 15:37:03 crc kubenswrapper[4719]: I1009 15:37:03.576042 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"891bb60f-cf37-4c12-8907-4ff654886e06","Type":"ContainerDied","Data":"e57d1a52928c238a2adebb6b4548f25b8224b9fd8a83e5909154fd3923b4b22d"} Oct 09 15:37:03 crc kubenswrapper[4719]: I1009 15:37:03.576070 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"891bb60f-cf37-4c12-8907-4ff654886e06","Type":"ContainerDied","Data":"e6f5f5da90afc5862246a3cb280c2cfb165cb727b1b3378acf1935e1e68a96c6"} Oct 09 15:37:03 crc kubenswrapper[4719]: I1009 15:37:03.596304 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-r57qh" event={"ID":"de61779c-4ad9-40bd-908e-27b82b5c82cb","Type":"ContainerStarted","Data":"ad9ad03ab0eb9b1d92611361bd2756fe71fdcdcecd78fb6ecf3453568b420a69"} Oct 09 15:37:03 crc kubenswrapper[4719]: I1009 15:37:03.609503 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 15:37:03 crc kubenswrapper[4719]: W1009 15:37:03.618497 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46e46828_5596_4987_8998_c52dbaf93086.slice/crio-d3947d92fe24a0a1567b225c6cde94b7e44d9e3e2524a80d54f9857203867cf5 WatchSource:0}: Error finding container d3947d92fe24a0a1567b225c6cde94b7e44d9e3e2524a80d54f9857203867cf5: Status 404 returned error can't find the container with id d3947d92fe24a0a1567b225c6cde94b7e44d9e3e2524a80d54f9857203867cf5 Oct 09 15:37:03 crc kubenswrapper[4719]: I1009 15:37:03.654640 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d78b7c8c7-8l5xz"] Oct 09 15:37:03 crc kubenswrapper[4719]: I1009 15:37:03.654919 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d78b7c8c7-8l5xz" podUID="9a01d050-b1bc-4b48-a783-64a36c24ad6e" containerName="dnsmasq-dns" containerID="cri-o://3651b71e9e850278ec8a4d9fadb14df578a3764ebad9838f6b067a451a51bfb8" gracePeriod=10 Oct 09 15:37:03 crc kubenswrapper[4719]: I1009 15:37:03.806170 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7d78b7c8c7-8l5xz" podUID="9a01d050-b1bc-4b48-a783-64a36c24ad6e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 
10.217.0.179:5353: connect: connection refused" Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.620591 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d78b7c8c7-8l5xz" Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.627597 4719 generic.go:334] "Generic (PLEG): container finished" podID="891bb60f-cf37-4c12-8907-4ff654886e06" containerID="d5b6a17d0e627a0cb3e428fc34c217b8819f0837c1b622fc4459e5dc5f2680c9" exitCode=0 Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.627629 4719 generic.go:334] "Generic (PLEG): container finished" podID="891bb60f-cf37-4c12-8907-4ff654886e06" containerID="72fd53e4f308ec26610a590d41cd0191d547dc131090ec90c4dcb57428e273d4" exitCode=0 Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.627716 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"891bb60f-cf37-4c12-8907-4ff654886e06","Type":"ContainerDied","Data":"d5b6a17d0e627a0cb3e428fc34c217b8819f0837c1b622fc4459e5dc5f2680c9"} Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.627751 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"891bb60f-cf37-4c12-8907-4ff654886e06","Type":"ContainerDied","Data":"72fd53e4f308ec26610a590d41cd0191d547dc131090ec90c4dcb57428e273d4"} Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.636891 4719 generic.go:334] "Generic (PLEG): container finished" podID="9a01d050-b1bc-4b48-a783-64a36c24ad6e" containerID="3651b71e9e850278ec8a4d9fadb14df578a3764ebad9838f6b067a451a51bfb8" exitCode=0 Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.636997 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d78b7c8c7-8l5xz" event={"ID":"9a01d050-b1bc-4b48-a783-64a36c24ad6e","Type":"ContainerDied","Data":"3651b71e9e850278ec8a4d9fadb14df578a3764ebad9838f6b067a451a51bfb8"} Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.637083 4719 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d78b7c8c7-8l5xz" event={"ID":"9a01d050-b1bc-4b48-a783-64a36c24ad6e","Type":"ContainerDied","Data":"d493e4d2a8a78615e781ce934302464da9fed656147684829148d9876058506c"} Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.637128 4719 scope.go:117] "RemoveContainer" containerID="3651b71e9e850278ec8a4d9fadb14df578a3764ebad9838f6b067a451a51bfb8" Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.637196 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d78b7c8c7-8l5xz" Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.652184 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"46e46828-5596-4987-8998-c52dbaf93086","Type":"ContainerStarted","Data":"72703de225b201b3e60716b3ce447db573454f32bdeca4610a4940c201ab444a"} Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.652223 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"46e46828-5596-4987-8998-c52dbaf93086","Type":"ContainerStarted","Data":"d3947d92fe24a0a1567b225c6cde94b7e44d9e3e2524a80d54f9857203867cf5"} Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.678903 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0"
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.726215 4719 scope.go:117] "RemoveContainer" containerID="c835ad72851233f80e8a0b0eb5ed2019a1226f90d7dd9de3d34edb26f6e4d377"
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.779155 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/891bb60f-cf37-4c12-8907-4ff654886e06-combined-ca-bundle\") pod \"891bb60f-cf37-4c12-8907-4ff654886e06\" (UID: \"891bb60f-cf37-4c12-8907-4ff654886e06\") "
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.779207 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/891bb60f-cf37-4c12-8907-4ff654886e06-log-httpd\") pod \"891bb60f-cf37-4c12-8907-4ff654886e06\" (UID: \"891bb60f-cf37-4c12-8907-4ff654886e06\") "
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.779240 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/891bb60f-cf37-4c12-8907-4ff654886e06-config-data\") pod \"891bb60f-cf37-4c12-8907-4ff654886e06\" (UID: \"891bb60f-cf37-4c12-8907-4ff654886e06\") "
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.779380 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a01d050-b1bc-4b48-a783-64a36c24ad6e-dns-swift-storage-0\") pod \"9a01d050-b1bc-4b48-a783-64a36c24ad6e\" (UID: \"9a01d050-b1bc-4b48-a783-64a36c24ad6e\") "
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.779734 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/891bb60f-cf37-4c12-8907-4ff654886e06-run-httpd\") pod \"891bb60f-cf37-4c12-8907-4ff654886e06\" (UID: \"891bb60f-cf37-4c12-8907-4ff654886e06\") "
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.779790 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5bnd\" (UniqueName: \"kubernetes.io/projected/9a01d050-b1bc-4b48-a783-64a36c24ad6e-kube-api-access-z5bnd\") pod \"9a01d050-b1bc-4b48-a783-64a36c24ad6e\" (UID: \"9a01d050-b1bc-4b48-a783-64a36c24ad6e\") "
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.779852 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/891bb60f-cf37-4c12-8907-4ff654886e06-scripts\") pod \"891bb60f-cf37-4c12-8907-4ff654886e06\" (UID: \"891bb60f-cf37-4c12-8907-4ff654886e06\") "
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.779915 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/891bb60f-cf37-4c12-8907-4ff654886e06-sg-core-conf-yaml\") pod \"891bb60f-cf37-4c12-8907-4ff654886e06\" (UID: \"891bb60f-cf37-4c12-8907-4ff654886e06\") "
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.779938 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a01d050-b1bc-4b48-a783-64a36c24ad6e-config\") pod \"9a01d050-b1bc-4b48-a783-64a36c24ad6e\" (UID: \"9a01d050-b1bc-4b48-a783-64a36c24ad6e\") "
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.779993 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mht27\" (UniqueName: \"kubernetes.io/projected/891bb60f-cf37-4c12-8907-4ff654886e06-kube-api-access-mht27\") pod \"891bb60f-cf37-4c12-8907-4ff654886e06\" (UID: \"891bb60f-cf37-4c12-8907-4ff654886e06\") "
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.780007 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a01d050-b1bc-4b48-a783-64a36c24ad6e-dns-svc\") pod \"9a01d050-b1bc-4b48-a783-64a36c24ad6e\" (UID: \"9a01d050-b1bc-4b48-a783-64a36c24ad6e\") "
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.780046 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a01d050-b1bc-4b48-a783-64a36c24ad6e-ovsdbserver-nb\") pod \"9a01d050-b1bc-4b48-a783-64a36c24ad6e\" (UID: \"9a01d050-b1bc-4b48-a783-64a36c24ad6e\") "
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.780063 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a01d050-b1bc-4b48-a783-64a36c24ad6e-ovsdbserver-sb\") pod \"9a01d050-b1bc-4b48-a783-64a36c24ad6e\" (UID: \"9a01d050-b1bc-4b48-a783-64a36c24ad6e\") "
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.786721 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/891bb60f-cf37-4c12-8907-4ff654886e06-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "891bb60f-cf37-4c12-8907-4ff654886e06" (UID: "891bb60f-cf37-4c12-8907-4ff654886e06"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.787054 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/891bb60f-cf37-4c12-8907-4ff654886e06-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "891bb60f-cf37-4c12-8907-4ff654886e06" (UID: "891bb60f-cf37-4c12-8907-4ff654886e06"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.792933 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a01d050-b1bc-4b48-a783-64a36c24ad6e-kube-api-access-z5bnd" (OuterVolumeSpecName: "kube-api-access-z5bnd") pod "9a01d050-b1bc-4b48-a783-64a36c24ad6e" (UID: "9a01d050-b1bc-4b48-a783-64a36c24ad6e"). InnerVolumeSpecName "kube-api-access-z5bnd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.806798 4719 scope.go:117] "RemoveContainer" containerID="3651b71e9e850278ec8a4d9fadb14df578a3764ebad9838f6b067a451a51bfb8"
Oct 09 15:37:04 crc kubenswrapper[4719]: E1009 15:37:04.807229 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3651b71e9e850278ec8a4d9fadb14df578a3764ebad9838f6b067a451a51bfb8\": container with ID starting with 3651b71e9e850278ec8a4d9fadb14df578a3764ebad9838f6b067a451a51bfb8 not found: ID does not exist" containerID="3651b71e9e850278ec8a4d9fadb14df578a3764ebad9838f6b067a451a51bfb8"
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.807264 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3651b71e9e850278ec8a4d9fadb14df578a3764ebad9838f6b067a451a51bfb8"} err="failed to get container status \"3651b71e9e850278ec8a4d9fadb14df578a3764ebad9838f6b067a451a51bfb8\": rpc error: code = NotFound desc = could not find container \"3651b71e9e850278ec8a4d9fadb14df578a3764ebad9838f6b067a451a51bfb8\": container with ID starting with 3651b71e9e850278ec8a4d9fadb14df578a3764ebad9838f6b067a451a51bfb8 not found: ID does not exist"
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.807290 4719 scope.go:117] "RemoveContainer" containerID="c835ad72851233f80e8a0b0eb5ed2019a1226f90d7dd9de3d34edb26f6e4d377"
Oct 09 15:37:04 crc kubenswrapper[4719]: E1009 15:37:04.807627 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c835ad72851233f80e8a0b0eb5ed2019a1226f90d7dd9de3d34edb26f6e4d377\": container with ID starting with c835ad72851233f80e8a0b0eb5ed2019a1226f90d7dd9de3d34edb26f6e4d377 not found: ID does not exist" containerID="c835ad72851233f80e8a0b0eb5ed2019a1226f90d7dd9de3d34edb26f6e4d377"
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.807658 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c835ad72851233f80e8a0b0eb5ed2019a1226f90d7dd9de3d34edb26f6e4d377"} err="failed to get container status \"c835ad72851233f80e8a0b0eb5ed2019a1226f90d7dd9de3d34edb26f6e4d377\": rpc error: code = NotFound desc = could not find container \"c835ad72851233f80e8a0b0eb5ed2019a1226f90d7dd9de3d34edb26f6e4d377\": container with ID starting with c835ad72851233f80e8a0b0eb5ed2019a1226f90d7dd9de3d34edb26f6e4d377 not found: ID does not exist"
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.811287 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/891bb60f-cf37-4c12-8907-4ff654886e06-kube-api-access-mht27" (OuterVolumeSpecName: "kube-api-access-mht27") pod "891bb60f-cf37-4c12-8907-4ff654886e06" (UID: "891bb60f-cf37-4c12-8907-4ff654886e06"). InnerVolumeSpecName "kube-api-access-mht27". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.811427 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/891bb60f-cf37-4c12-8907-4ff654886e06-scripts" (OuterVolumeSpecName: "scripts") pod "891bb60f-cf37-4c12-8907-4ff654886e06" (UID: "891bb60f-cf37-4c12-8907-4ff654886e06"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.858870 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/891bb60f-cf37-4c12-8907-4ff654886e06-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "891bb60f-cf37-4c12-8907-4ff654886e06" (UID: "891bb60f-cf37-4c12-8907-4ff654886e06"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.882447 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a01d050-b1bc-4b48-a783-64a36c24ad6e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9a01d050-b1bc-4b48-a783-64a36c24ad6e" (UID: "9a01d050-b1bc-4b48-a783-64a36c24ad6e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.883425 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a01d050-b1bc-4b48-a783-64a36c24ad6e-dns-swift-storage-0\") pod \"9a01d050-b1bc-4b48-a783-64a36c24ad6e\" (UID: \"9a01d050-b1bc-4b48-a783-64a36c24ad6e\") "
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.883838 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5bnd\" (UniqueName: \"kubernetes.io/projected/9a01d050-b1bc-4b48-a783-64a36c24ad6e-kube-api-access-z5bnd\") on node \"crc\" DevicePath \"\""
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.883850 4719 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/891bb60f-cf37-4c12-8907-4ff654886e06-scripts\") on node \"crc\" DevicePath \"\""
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.883859 4719 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/891bb60f-cf37-4c12-8907-4ff654886e06-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.883867 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mht27\" (UniqueName: \"kubernetes.io/projected/891bb60f-cf37-4c12-8907-4ff654886e06-kube-api-access-mht27\") on node \"crc\" DevicePath \"\""
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.883875 4719 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/891bb60f-cf37-4c12-8907-4ff654886e06-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.883883 4719 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/891bb60f-cf37-4c12-8907-4ff654886e06-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 09 15:37:04 crc kubenswrapper[4719]: W1009 15:37:04.883955 4719 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/9a01d050-b1bc-4b48-a783-64a36c24ad6e/volumes/kubernetes.io~configmap/dns-swift-storage-0
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.883966 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a01d050-b1bc-4b48-a783-64a36c24ad6e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9a01d050-b1bc-4b48-a783-64a36c24ad6e" (UID: "9a01d050-b1bc-4b48-a783-64a36c24ad6e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.890007 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a01d050-b1bc-4b48-a783-64a36c24ad6e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9a01d050-b1bc-4b48-a783-64a36c24ad6e" (UID: "9a01d050-b1bc-4b48-a783-64a36c24ad6e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.895221 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a01d050-b1bc-4b48-a783-64a36c24ad6e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9a01d050-b1bc-4b48-a783-64a36c24ad6e" (UID: "9a01d050-b1bc-4b48-a783-64a36c24ad6e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.898868 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a01d050-b1bc-4b48-a783-64a36c24ad6e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9a01d050-b1bc-4b48-a783-64a36c24ad6e" (UID: "9a01d050-b1bc-4b48-a783-64a36c24ad6e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.907740 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a01d050-b1bc-4b48-a783-64a36c24ad6e-config" (OuterVolumeSpecName: "config") pod "9a01d050-b1bc-4b48-a783-64a36c24ad6e" (UID: "9a01d050-b1bc-4b48-a783-64a36c24ad6e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.942790 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/891bb60f-cf37-4c12-8907-4ff654886e06-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "891bb60f-cf37-4c12-8907-4ff654886e06" (UID: "891bb60f-cf37-4c12-8907-4ff654886e06"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.963970 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/891bb60f-cf37-4c12-8907-4ff654886e06-config-data" (OuterVolumeSpecName: "config-data") pod "891bb60f-cf37-4c12-8907-4ff654886e06" (UID: "891bb60f-cf37-4c12-8907-4ff654886e06"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.985477 4719 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a01d050-b1bc-4b48-a783-64a36c24ad6e-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.985516 4719 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a01d050-b1bc-4b48-a783-64a36c24ad6e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.985531 4719 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a01d050-b1bc-4b48-a783-64a36c24ad6e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.985542 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/891bb60f-cf37-4c12-8907-4ff654886e06-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.985555 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/891bb60f-cf37-4c12-8907-4ff654886e06-config-data\") on node \"crc\" DevicePath \"\""
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.985567 4719 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a01d050-b1bc-4b48-a783-64a36c24ad6e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 09 15:37:04 crc kubenswrapper[4719]: I1009 15:37:04.985578 4719 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a01d050-b1bc-4b48-a783-64a36c24ad6e-config\") on node \"crc\" DevicePath \"\""
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.147558 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d78b7c8c7-8l5xz"]
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.156585 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d78b7c8c7-8l5xz"]
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.177004 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a01d050-b1bc-4b48-a783-64a36c24ad6e" path="/var/lib/kubelet/pods/9a01d050-b1bc-4b48-a783-64a36c24ad6e/volumes"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.665425 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"891bb60f-cf37-4c12-8907-4ff654886e06","Type":"ContainerDied","Data":"622025d6a904faf315c1110529e3540e812e054d78f2276615fdfaf9968bd624"}
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.665446 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.665482 4719 scope.go:117] "RemoveContainer" containerID="e57d1a52928c238a2adebb6b4548f25b8224b9fd8a83e5909154fd3923b4b22d"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.674842 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"46e46828-5596-4987-8998-c52dbaf93086","Type":"ContainerStarted","Data":"b33a9aefbf7d9f476b8c62b327469e6b55533ef24a55d7fa4adabdc92418aa5e"}
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.692878 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.700404 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.712243 4719 scope.go:117] "RemoveContainer" containerID="e6f5f5da90afc5862246a3cb280c2cfb165cb727b1b3378acf1935e1e68a96c6"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.724978 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 09 15:37:05 crc kubenswrapper[4719]: E1009 15:37:05.725585 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a01d050-b1bc-4b48-a783-64a36c24ad6e" containerName="init"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.725605 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a01d050-b1bc-4b48-a783-64a36c24ad6e" containerName="init"
Oct 09 15:37:05 crc kubenswrapper[4719]: E1009 15:37:05.725616 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="891bb60f-cf37-4c12-8907-4ff654886e06" containerName="proxy-httpd"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.725623 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="891bb60f-cf37-4c12-8907-4ff654886e06" containerName="proxy-httpd"
Oct 09 15:37:05 crc kubenswrapper[4719]: E1009 15:37:05.725652 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="891bb60f-cf37-4c12-8907-4ff654886e06" containerName="sg-core"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.725660 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="891bb60f-cf37-4c12-8907-4ff654886e06" containerName="sg-core"
Oct 09 15:37:05 crc kubenswrapper[4719]: E1009 15:37:05.725675 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="891bb60f-cf37-4c12-8907-4ff654886e06" containerName="ceilometer-central-agent"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.725682 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="891bb60f-cf37-4c12-8907-4ff654886e06" containerName="ceilometer-central-agent"
Oct 09 15:37:05 crc kubenswrapper[4719]: E1009 15:37:05.725692 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="891bb60f-cf37-4c12-8907-4ff654886e06" containerName="ceilometer-notification-agent"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.725697 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="891bb60f-cf37-4c12-8907-4ff654886e06" containerName="ceilometer-notification-agent"
Oct 09 15:37:05 crc kubenswrapper[4719]: E1009 15:37:05.725717 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a01d050-b1bc-4b48-a783-64a36c24ad6e" containerName="dnsmasq-dns"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.725722 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a01d050-b1bc-4b48-a783-64a36c24ad6e" containerName="dnsmasq-dns"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.725893 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="891bb60f-cf37-4c12-8907-4ff654886e06" containerName="ceilometer-notification-agent"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.725908 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="891bb60f-cf37-4c12-8907-4ff654886e06" containerName="ceilometer-central-agent"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.725924 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="891bb60f-cf37-4c12-8907-4ff654886e06" containerName="proxy-httpd"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.725949 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="891bb60f-cf37-4c12-8907-4ff654886e06" containerName="sg-core"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.725961 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a01d050-b1bc-4b48-a783-64a36c24ad6e" containerName="dnsmasq-dns"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.726786 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.726736541 podStartE2EDuration="3.726736541s" podCreationTimestamp="2025-10-09 15:37:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:37:05.70737538 +0000 UTC m=+1131.217086685" watchObservedRunningTime="2025-10-09 15:37:05.726736541 +0000 UTC m=+1131.236447856"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.728040 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.730257 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.734042 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.752415 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.772016 4719 scope.go:117] "RemoveContainer" containerID="d5b6a17d0e627a0cb3e428fc34c217b8819f0837c1b622fc4459e5dc5f2680c9"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.807715 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/884e1376-b069-4cc2-8ff9-3a93832be9c0-run-httpd\") pod \"ceilometer-0\" (UID: \"884e1376-b069-4cc2-8ff9-3a93832be9c0\") " pod="openstack/ceilometer-0"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.807798 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/884e1376-b069-4cc2-8ff9-3a93832be9c0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"884e1376-b069-4cc2-8ff9-3a93832be9c0\") " pod="openstack/ceilometer-0"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.807925 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/884e1376-b069-4cc2-8ff9-3a93832be9c0-scripts\") pod \"ceilometer-0\" (UID: \"884e1376-b069-4cc2-8ff9-3a93832be9c0\") " pod="openstack/ceilometer-0"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.808046 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/884e1376-b069-4cc2-8ff9-3a93832be9c0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"884e1376-b069-4cc2-8ff9-3a93832be9c0\") " pod="openstack/ceilometer-0"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.808115 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/884e1376-b069-4cc2-8ff9-3a93832be9c0-config-data\") pod \"ceilometer-0\" (UID: \"884e1376-b069-4cc2-8ff9-3a93832be9c0\") " pod="openstack/ceilometer-0"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.808185 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5sc9\" (UniqueName: \"kubernetes.io/projected/884e1376-b069-4cc2-8ff9-3a93832be9c0-kube-api-access-p5sc9\") pod \"ceilometer-0\" (UID: \"884e1376-b069-4cc2-8ff9-3a93832be9c0\") " pod="openstack/ceilometer-0"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.808377 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/884e1376-b069-4cc2-8ff9-3a93832be9c0-log-httpd\") pod \"ceilometer-0\" (UID: \"884e1376-b069-4cc2-8ff9-3a93832be9c0\") " pod="openstack/ceilometer-0"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.809982 4719 scope.go:117] "RemoveContainer" containerID="72fd53e4f308ec26610a590d41cd0191d547dc131090ec90c4dcb57428e273d4"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.909770 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5sc9\" (UniqueName: \"kubernetes.io/projected/884e1376-b069-4cc2-8ff9-3a93832be9c0-kube-api-access-p5sc9\") pod \"ceilometer-0\" (UID: \"884e1376-b069-4cc2-8ff9-3a93832be9c0\") " pod="openstack/ceilometer-0"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.909841 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/884e1376-b069-4cc2-8ff9-3a93832be9c0-log-httpd\") pod \"ceilometer-0\" (UID: \"884e1376-b069-4cc2-8ff9-3a93832be9c0\") " pod="openstack/ceilometer-0"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.909880 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/884e1376-b069-4cc2-8ff9-3a93832be9c0-run-httpd\") pod \"ceilometer-0\" (UID: \"884e1376-b069-4cc2-8ff9-3a93832be9c0\") " pod="openstack/ceilometer-0"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.909909 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/884e1376-b069-4cc2-8ff9-3a93832be9c0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"884e1376-b069-4cc2-8ff9-3a93832be9c0\") " pod="openstack/ceilometer-0"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.909983 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/884e1376-b069-4cc2-8ff9-3a93832be9c0-scripts\") pod \"ceilometer-0\" (UID: \"884e1376-b069-4cc2-8ff9-3a93832be9c0\") " pod="openstack/ceilometer-0"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.910006 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/884e1376-b069-4cc2-8ff9-3a93832be9c0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"884e1376-b069-4cc2-8ff9-3a93832be9c0\") " pod="openstack/ceilometer-0"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.910029 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/884e1376-b069-4cc2-8ff9-3a93832be9c0-config-data\") pod \"ceilometer-0\" (UID: \"884e1376-b069-4cc2-8ff9-3a93832be9c0\") " pod="openstack/ceilometer-0"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.911142 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/884e1376-b069-4cc2-8ff9-3a93832be9c0-run-httpd\") pod \"ceilometer-0\" (UID: \"884e1376-b069-4cc2-8ff9-3a93832be9c0\") " pod="openstack/ceilometer-0"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.912405 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/884e1376-b069-4cc2-8ff9-3a93832be9c0-log-httpd\") pod \"ceilometer-0\" (UID: \"884e1376-b069-4cc2-8ff9-3a93832be9c0\") " pod="openstack/ceilometer-0"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.915384 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/884e1376-b069-4cc2-8ff9-3a93832be9c0-scripts\") pod \"ceilometer-0\" (UID: \"884e1376-b069-4cc2-8ff9-3a93832be9c0\") " pod="openstack/ceilometer-0"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.915383 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/884e1376-b069-4cc2-8ff9-3a93832be9c0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"884e1376-b069-4cc2-8ff9-3a93832be9c0\") " pod="openstack/ceilometer-0"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.916892 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/884e1376-b069-4cc2-8ff9-3a93832be9c0-config-data\") pod \"ceilometer-0\" (UID: \"884e1376-b069-4cc2-8ff9-3a93832be9c0\") " pod="openstack/ceilometer-0"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.920852 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/884e1376-b069-4cc2-8ff9-3a93832be9c0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"884e1376-b069-4cc2-8ff9-3a93832be9c0\") " pod="openstack/ceilometer-0"
Oct 09 15:37:05 crc kubenswrapper[4719]: I1009 15:37:05.938433 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5sc9\" (UniqueName: \"kubernetes.io/projected/884e1376-b069-4cc2-8ff9-3a93832be9c0-kube-api-access-p5sc9\") pod \"ceilometer-0\" (UID: \"884e1376-b069-4cc2-8ff9-3a93832be9c0\") " pod="openstack/ceilometer-0"
Oct 09 15:37:06 crc kubenswrapper[4719]: I1009 15:37:06.064465 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 09 15:37:06 crc kubenswrapper[4719]: I1009 15:37:06.591206 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 09 15:37:06 crc kubenswrapper[4719]: I1009 15:37:06.690184 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"884e1376-b069-4cc2-8ff9-3a93832be9c0","Type":"ContainerStarted","Data":"bfa1b789b2e2f84227dbc59bc494a0f4a8a028ae9e96ef4c14dd830a02f318e4"}
Oct 09 15:37:07 crc kubenswrapper[4719]: I1009 15:37:07.173431 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="891bb60f-cf37-4c12-8907-4ff654886e06" path="/var/lib/kubelet/pods/891bb60f-cf37-4c12-8907-4ff654886e06/volumes"
Oct 09 15:37:07 crc kubenswrapper[4719]: I1009 15:37:07.701675 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"884e1376-b069-4cc2-8ff9-3a93832be9c0","Type":"ContainerStarted","Data":"f599e284133cf7503eb12f1ae700039756cd0f9f3e32c57a7059aee913297b19"}
Oct 09 15:37:07 crc kubenswrapper[4719]: I1009 15:37:07.701717 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"884e1376-b069-4cc2-8ff9-3a93832be9c0","Type":"ContainerStarted","Data":"7f50f73431a793c6229841375f222cf05c1df3977c86b4fb6b18eccb71944cf1"}
Oct 09 15:37:07 crc kubenswrapper[4719]: I1009 15:37:07.978626 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Oct 09 15:37:08 crc kubenswrapper[4719]: I1009 15:37:08.368779 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 09 15:37:08 crc kubenswrapper[4719]: I1009 15:37:08.722667 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"884e1376-b069-4cc2-8ff9-3a93832be9c0","Type":"ContainerStarted","Data":"c8abf1e233fc56d5170657034946a962fc9fa9e943c9da8f4f14fc879d44496d"}
Oct 09 15:37:08 crc kubenswrapper[4719]: I1009 15:37:08.970304 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 09 15:37:08 crc kubenswrapper[4719]: I1009 15:37:08.970698 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0b11b82a-b620-435a-9274-fb3419c35a72" containerName="glance-log" containerID="cri-o://128a964196cdbb9e408fa42332db38c9f560920577b0c0d847b4db5d8f54737f" gracePeriod=30
Oct 09 15:37:08 crc kubenswrapper[4719]: I1009 15:37:08.970813 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0b11b82a-b620-435a-9274-fb3419c35a72" containerName="glance-httpd" containerID="cri-o://6411b42af443ab4bf4d25c6db21d5e174e6dfbf1dcf8c7e84532a0d83d159fec" gracePeriod=30
Oct 09 15:37:09 crc kubenswrapper[4719]: I1009 15:37:09.080004 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Oct 09 15:37:09 crc kubenswrapper[4719]: I1009 15:37:09.742655 4719 generic.go:334] "Generic (PLEG): container finished" podID="0b11b82a-b620-435a-9274-fb3419c35a72" containerID="128a964196cdbb9e408fa42332db38c9f560920577b0c0d847b4db5d8f54737f" exitCode=143
Oct 09 15:37:09 crc kubenswrapper[4719]: I1009 15:37:09.742814 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0b11b82a-b620-435a-9274-fb3419c35a72","Type":"ContainerDied","Data":"128a964196cdbb9e408fa42332db38c9f560920577b0c0d847b4db5d8f54737f"}
Oct 09 15:37:10 crc kubenswrapper[4719]: I1009 15:37:10.506847 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 09 15:37:10 crc kubenswrapper[4719]: I1009 15:37:10.507128 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="0023a9b5-248d-48cc-8560-b109ff59fc04" containerName="glance-log" containerID="cri-o://e41e62ba79ff1042f03aff39173579f2a3716ffd22a7956025ae06a45239d4aa" gracePeriod=30
Oct 09 15:37:10 crc kubenswrapper[4719]: I1009 15:37:10.507655 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="0023a9b5-248d-48cc-8560-b109ff59fc04" containerName="glance-httpd" containerID="cri-o://2bb2cd4b4e048596f8edfc6c6071d249253ba2061983dd8ac534910c2da4e411" gracePeriod=30
Oct 09 15:37:10 crc kubenswrapper[4719]: I1009 15:37:10.760366 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0023a9b5-248d-48cc-8560-b109ff59fc04","Type":"ContainerDied","Data":"e41e62ba79ff1042f03aff39173579f2a3716ffd22a7956025ae06a45239d4aa"}
Oct 09 15:37:10 crc kubenswrapper[4719]: I1009 15:37:10.765654 4719 generic.go:334] "Generic (PLEG): container finished" podID="0023a9b5-248d-48cc-8560-b109ff59fc04" containerID="e41e62ba79ff1042f03aff39173579f2a3716ffd22a7956025ae06a45239d4aa" exitCode=143
Oct 09 15:37:11 crc kubenswrapper[4719]: I1009 15:37:11.161568 4719 scope.go:117] "RemoveContainer" containerID="1b0d8ce3fa12f7379def00ca2131e79585f6bdff6f8c5cc63816c53109df6822"
Oct 09 15:37:11 crc kubenswrapper[4719]: I1009 15:37:11.782335 4719 generic.go:334] "Generic (PLEG): container finished" podID="0b11b82a-b620-435a-9274-fb3419c35a72" containerID="6411b42af443ab4bf4d25c6db21d5e174e6dfbf1dcf8c7e84532a0d83d159fec" exitCode=0
Oct 09 15:37:11 crc kubenswrapper[4719]: I1009 15:37:11.783171 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0b11b82a-b620-435a-9274-fb3419c35a72","Type":"ContainerDied","Data":"6411b42af443ab4bf4d25c6db21d5e174e6dfbf1dcf8c7e84532a0d83d159fec"}
Oct 09 15:37:12 crc kubenswrapper[4719]: I1009 15:37:12.802294 4719 generic.go:334] "Generic (PLEG): container finished" podID="0023a9b5-248d-48cc-8560-b109ff59fc04" containerID="2bb2cd4b4e048596f8edfc6c6071d249253ba2061983dd8ac534910c2da4e411" exitCode=0
Oct 09 15:37:12 crc kubenswrapper[4719]: I1009 15:37:12.802385 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0023a9b5-248d-48cc-8560-b109ff59fc04","Type":"ContainerDied","Data":"2bb2cd4b4e048596f8edfc6c6071d249253ba2061983dd8ac534910c2da4e411"}
Oct 09 15:37:13 crc kubenswrapper[4719]: I1009 15:37:13.256853 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.256452 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.336286 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b11b82a-b620-435a-9274-fb3419c35a72-scripts\") pod \"0b11b82a-b620-435a-9274-fb3419c35a72\" (UID: \"0b11b82a-b620-435a-9274-fb3419c35a72\") "
Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.336343 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0b11b82a-b620-435a-9274-fb3419c35a72-httpd-run\") pod \"0b11b82a-b620-435a-9274-fb3419c35a72\" (UID: \"0b11b82a-b620-435a-9274-fb3419c35a72\") "
Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.336381 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"0b11b82a-b620-435a-9274-fb3419c35a72\" (UID: \"0b11b82a-b620-435a-9274-fb3419c35a72\") "
Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.336440 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z568v\" (UniqueName: \"kubernetes.io/projected/0b11b82a-b620-435a-9274-fb3419c35a72-kube-api-access-z568v\") pod \"0b11b82a-b620-435a-9274-fb3419c35a72\" (UID: \"0b11b82a-b620-435a-9274-fb3419c35a72\") "
Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.336468 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b11b82a-b620-435a-9274-fb3419c35a72-logs\") pod \"0b11b82a-b620-435a-9274-fb3419c35a72\" (UID: \"0b11b82a-b620-435a-9274-fb3419c35a72\") "
Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.336491 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName:
\"kubernetes.io/secret/0b11b82a-b620-435a-9274-fb3419c35a72-combined-ca-bundle\") pod \"0b11b82a-b620-435a-9274-fb3419c35a72\" (UID: \"0b11b82a-b620-435a-9274-fb3419c35a72\") " Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.336514 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b11b82a-b620-435a-9274-fb3419c35a72-config-data\") pod \"0b11b82a-b620-435a-9274-fb3419c35a72\" (UID: \"0b11b82a-b620-435a-9274-fb3419c35a72\") " Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.336655 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b11b82a-b620-435a-9274-fb3419c35a72-public-tls-certs\") pod \"0b11b82a-b620-435a-9274-fb3419c35a72\" (UID: \"0b11b82a-b620-435a-9274-fb3419c35a72\") " Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.339111 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b11b82a-b620-435a-9274-fb3419c35a72-logs" (OuterVolumeSpecName: "logs") pod "0b11b82a-b620-435a-9274-fb3419c35a72" (UID: "0b11b82a-b620-435a-9274-fb3419c35a72"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.341593 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b11b82a-b620-435a-9274-fb3419c35a72-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0b11b82a-b620-435a-9274-fb3419c35a72" (UID: "0b11b82a-b620-435a-9274-fb3419c35a72"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.345612 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b11b82a-b620-435a-9274-fb3419c35a72-scripts" (OuterVolumeSpecName: "scripts") pod "0b11b82a-b620-435a-9274-fb3419c35a72" (UID: "0b11b82a-b620-435a-9274-fb3419c35a72"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.348308 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "0b11b82a-b620-435a-9274-fb3419c35a72" (UID: "0b11b82a-b620-435a-9274-fb3419c35a72"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.396803 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b11b82a-b620-435a-9274-fb3419c35a72-kube-api-access-z568v" (OuterVolumeSpecName: "kube-api-access-z568v") pod "0b11b82a-b620-435a-9274-fb3419c35a72" (UID: "0b11b82a-b620-435a-9274-fb3419c35a72"). InnerVolumeSpecName "kube-api-access-z568v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.439628 4719 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b11b82a-b620-435a-9274-fb3419c35a72-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.439657 4719 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0b11b82a-b620-435a-9274-fb3419c35a72-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.439679 4719 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.439689 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z568v\" (UniqueName: \"kubernetes.io/projected/0b11b82a-b620-435a-9274-fb3419c35a72-kube-api-access-z568v\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.439698 4719 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b11b82a-b620-435a-9274-fb3419c35a72-logs\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.513457 4719 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.524087 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.541949 4719 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.560770 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b11b82a-b620-435a-9274-fb3419c35a72-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b11b82a-b620-435a-9274-fb3419c35a72" (UID: "0b11b82a-b620-435a-9274-fb3419c35a72"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.623925 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b11b82a-b620-435a-9274-fb3419c35a72-config-data" (OuterVolumeSpecName: "config-data") pod "0b11b82a-b620-435a-9274-fb3419c35a72" (UID: "0b11b82a-b620-435a-9274-fb3419c35a72"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.644929 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0023a9b5-248d-48cc-8560-b109ff59fc04-logs\") pod \"0023a9b5-248d-48cc-8560-b109ff59fc04\" (UID: \"0023a9b5-248d-48cc-8560-b109ff59fc04\") " Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.645062 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0023a9b5-248d-48cc-8560-b109ff59fc04-httpd-run\") pod \"0023a9b5-248d-48cc-8560-b109ff59fc04\" (UID: \"0023a9b5-248d-48cc-8560-b109ff59fc04\") " Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.645114 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0023a9b5-248d-48cc-8560-b109ff59fc04-config-data\") pod \"0023a9b5-248d-48cc-8560-b109ff59fc04\" (UID: \"0023a9b5-248d-48cc-8560-b109ff59fc04\") " Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.645133 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0023a9b5-248d-48cc-8560-b109ff59fc04-scripts\") pod \"0023a9b5-248d-48cc-8560-b109ff59fc04\" (UID: \"0023a9b5-248d-48cc-8560-b109ff59fc04\") " Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.645216 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"0023a9b5-248d-48cc-8560-b109ff59fc04\" (UID: \"0023a9b5-248d-48cc-8560-b109ff59fc04\") " Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.645280 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkjqh\" (UniqueName: 
\"kubernetes.io/projected/0023a9b5-248d-48cc-8560-b109ff59fc04-kube-api-access-bkjqh\") pod \"0023a9b5-248d-48cc-8560-b109ff59fc04\" (UID: \"0023a9b5-248d-48cc-8560-b109ff59fc04\") " Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.645299 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0023a9b5-248d-48cc-8560-b109ff59fc04-combined-ca-bundle\") pod \"0023a9b5-248d-48cc-8560-b109ff59fc04\" (UID: \"0023a9b5-248d-48cc-8560-b109ff59fc04\") " Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.645364 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0023a9b5-248d-48cc-8560-b109ff59fc04-internal-tls-certs\") pod \"0023a9b5-248d-48cc-8560-b109ff59fc04\" (UID: \"0023a9b5-248d-48cc-8560-b109ff59fc04\") " Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.645558 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0023a9b5-248d-48cc-8560-b109ff59fc04-logs" (OuterVolumeSpecName: "logs") pod "0023a9b5-248d-48cc-8560-b109ff59fc04" (UID: "0023a9b5-248d-48cc-8560-b109ff59fc04"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.645574 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0023a9b5-248d-48cc-8560-b109ff59fc04-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0023a9b5-248d-48cc-8560-b109ff59fc04" (UID: "0023a9b5-248d-48cc-8560-b109ff59fc04"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.646118 4719 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0023a9b5-248d-48cc-8560-b109ff59fc04-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.646133 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b11b82a-b620-435a-9274-fb3419c35a72-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.646145 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b11b82a-b620-435a-9274-fb3419c35a72-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.646153 4719 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0023a9b5-248d-48cc-8560-b109ff59fc04-logs\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.647447 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b11b82a-b620-435a-9274-fb3419c35a72-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0b11b82a-b620-435a-9274-fb3419c35a72" (UID: "0b11b82a-b620-435a-9274-fb3419c35a72"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.651807 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "0023a9b5-248d-48cc-8560-b109ff59fc04" (UID: "0023a9b5-248d-48cc-8560-b109ff59fc04"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.651939 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0023a9b5-248d-48cc-8560-b109ff59fc04-kube-api-access-bkjqh" (OuterVolumeSpecName: "kube-api-access-bkjqh") pod "0023a9b5-248d-48cc-8560-b109ff59fc04" (UID: "0023a9b5-248d-48cc-8560-b109ff59fc04"). InnerVolumeSpecName "kube-api-access-bkjqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.656697 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0023a9b5-248d-48cc-8560-b109ff59fc04-scripts" (OuterVolumeSpecName: "scripts") pod "0023a9b5-248d-48cc-8560-b109ff59fc04" (UID: "0023a9b5-248d-48cc-8560-b109ff59fc04"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.703406 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0023a9b5-248d-48cc-8560-b109ff59fc04-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0023a9b5-248d-48cc-8560-b109ff59fc04" (UID: "0023a9b5-248d-48cc-8560-b109ff59fc04"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.721271 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0023a9b5-248d-48cc-8560-b109ff59fc04-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0023a9b5-248d-48cc-8560-b109ff59fc04" (UID: "0023a9b5-248d-48cc-8560-b109ff59fc04"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.725048 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0023a9b5-248d-48cc-8560-b109ff59fc04-config-data" (OuterVolumeSpecName: "config-data") pod "0023a9b5-248d-48cc-8560-b109ff59fc04" (UID: "0023a9b5-248d-48cc-8560-b109ff59fc04"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.747880 4719 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.747920 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkjqh\" (UniqueName: \"kubernetes.io/projected/0023a9b5-248d-48cc-8560-b109ff59fc04-kube-api-access-bkjqh\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.747932 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0023a9b5-248d-48cc-8560-b109ff59fc04-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.747940 4719 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0023a9b5-248d-48cc-8560-b109ff59fc04-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.747949 4719 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b11b82a-b620-435a-9274-fb3419c35a72-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.747957 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0023a9b5-248d-48cc-8560-b109ff59fc04-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.747965 4719 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0023a9b5-248d-48cc-8560-b109ff59fc04-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.769764 4719 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.849138 4719 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.849931 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0023a9b5-248d-48cc-8560-b109ff59fc04","Type":"ContainerDied","Data":"1c101a108339ad464f44da0bd31fd5587908ecb0cf8f4001dfc1e0070adee85d"} Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.849973 4719 scope.go:117] "RemoveContainer" containerID="2bb2cd4b4e048596f8edfc6c6071d249253ba2061983dd8ac534910c2da4e411" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.850010 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.852456 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0b11b82a-b620-435a-9274-fb3419c35a72","Type":"ContainerDied","Data":"488295aac0e63fc321f1a177d8890d16f0c359fef26097a80b69ff56bc60554b"} Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.852533 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.854727 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"884e1376-b069-4cc2-8ff9-3a93832be9c0","Type":"ContainerStarted","Data":"fba13d50c006f23823b02e15fe2b03ff6a5cd31c69b6fb711e792673f3c2240e"} Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.854854 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="884e1376-b069-4cc2-8ff9-3a93832be9c0" containerName="ceilometer-central-agent" containerID="cri-o://7f50f73431a793c6229841375f222cf05c1df3977c86b4fb6b18eccb71944cf1" gracePeriod=30 Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.855070 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.855122 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="884e1376-b069-4cc2-8ff9-3a93832be9c0" containerName="proxy-httpd" containerID="cri-o://fba13d50c006f23823b02e15fe2b03ff6a5cd31c69b6fb711e792673f3c2240e" gracePeriod=30 Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.855176 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="884e1376-b069-4cc2-8ff9-3a93832be9c0" containerName="sg-core" containerID="cri-o://c8abf1e233fc56d5170657034946a962fc9fa9e943c9da8f4f14fc879d44496d" gracePeriod=30 Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.855209 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="884e1376-b069-4cc2-8ff9-3a93832be9c0" containerName="ceilometer-notification-agent" containerID="cri-o://f599e284133cf7503eb12f1ae700039756cd0f9f3e32c57a7059aee913297b19" gracePeriod=30 Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.860128 4719 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-r57qh" event={"ID":"de61779c-4ad9-40bd-908e-27b82b5c82cb","Type":"ContainerStarted","Data":"690296cf308dd5fe94519d81b80bdbbd8feb6d565660c122d20de7a8f1fba837"} Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.871712 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"75999b62-ce1b-4a9b-8507-c8af12441083","Type":"ContainerStarted","Data":"4c60b2a50df19d61f978d5628ec6dedb3f7671d62836d8027b8f74250a9e3b07"} Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.878228 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.362943705 podStartE2EDuration="11.878208995s" podCreationTimestamp="2025-10-09 15:37:05 +0000 UTC" firstStartedPulling="2025-10-09 15:37:06.606020407 +0000 UTC m=+1132.115731692" lastFinishedPulling="2025-10-09 15:37:16.121285697 +0000 UTC m=+1141.630996982" observedRunningTime="2025-10-09 15:37:16.876603233 +0000 UTC m=+1142.386314528" watchObservedRunningTime="2025-10-09 15:37:16.878208995 +0000 UTC m=+1142.387920290" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.885639 4719 scope.go:117] "RemoveContainer" containerID="e41e62ba79ff1042f03aff39173579f2a3716ffd22a7956025ae06a45239d4aa" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.914098 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-r57qh" podStartSLOduration=2.199689592 podStartE2EDuration="14.914077933s" podCreationTimestamp="2025-10-09 15:37:02 +0000 UTC" firstStartedPulling="2025-10-09 15:37:03.476393802 +0000 UTC m=+1128.986105087" lastFinishedPulling="2025-10-09 15:37:16.190782143 +0000 UTC m=+1141.700493428" observedRunningTime="2025-10-09 15:37:16.889586079 +0000 UTC m=+1142.399297374" watchObservedRunningTime="2025-10-09 15:37:16.914077933 +0000 UTC m=+1142.423789218" Oct 09 
15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.917399 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.918123 4719 scope.go:117] "RemoveContainer" containerID="6411b42af443ab4bf4d25c6db21d5e174e6dfbf1dcf8c7e84532a0d83d159fec" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.943089 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.949871 4719 scope.go:117] "RemoveContainer" containerID="128a964196cdbb9e408fa42332db38c9f560920577b0c0d847b4db5d8f54737f" Oct 09 15:37:16 crc kubenswrapper[4719]: I1009 15:37:16.977299 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.030294 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.056417 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 15:37:17 crc kubenswrapper[4719]: E1009 15:37:17.056923 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b11b82a-b620-435a-9274-fb3419c35a72" containerName="glance-log" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.056942 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b11b82a-b620-435a-9274-fb3419c35a72" containerName="glance-log" Oct 09 15:37:17 crc kubenswrapper[4719]: E1009 15:37:17.056958 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0023a9b5-248d-48cc-8560-b109ff59fc04" containerName="glance-log" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.056965 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="0023a9b5-248d-48cc-8560-b109ff59fc04" containerName="glance-log" Oct 09 15:37:17 crc kubenswrapper[4719]: E1009 
15:37:17.056989 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0023a9b5-248d-48cc-8560-b109ff59fc04" containerName="glance-httpd" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.056995 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="0023a9b5-248d-48cc-8560-b109ff59fc04" containerName="glance-httpd" Oct 09 15:37:17 crc kubenswrapper[4719]: E1009 15:37:17.057003 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b11b82a-b620-435a-9274-fb3419c35a72" containerName="glance-httpd" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.057029 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b11b82a-b620-435a-9274-fb3419c35a72" containerName="glance-httpd" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.057252 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b11b82a-b620-435a-9274-fb3419c35a72" containerName="glance-log" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.057270 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="0023a9b5-248d-48cc-8560-b109ff59fc04" containerName="glance-log" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.057286 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b11b82a-b620-435a-9274-fb3419c35a72" containerName="glance-httpd" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.057295 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="0023a9b5-248d-48cc-8560-b109ff59fc04" containerName="glance-httpd" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.058315 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.060289 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.060647 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-zsmzp" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.061206 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.064147 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.098159 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.114128 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.115760 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.122209 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.122797 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.125127 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.159388 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/de5609da-6273-4076-9f02-b6c4614ebd07-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"de5609da-6273-4076-9f02-b6c4614ebd07\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.162504 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd66v\" (UniqueName: \"kubernetes.io/projected/de5609da-6273-4076-9f02-b6c4614ebd07-kube-api-access-kd66v\") pod \"glance-default-internal-api-0\" (UID: \"de5609da-6273-4076-9f02-b6c4614ebd07\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.162571 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de5609da-6273-4076-9f02-b6c4614ebd07-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"de5609da-6273-4076-9f02-b6c4614ebd07\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.162794 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/de5609da-6273-4076-9f02-b6c4614ebd07-logs\") pod \"glance-default-internal-api-0\" (UID: \"de5609da-6273-4076-9f02-b6c4614ebd07\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.162853 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de5609da-6273-4076-9f02-b6c4614ebd07-scripts\") pod \"glance-default-internal-api-0\" (UID: \"de5609da-6273-4076-9f02-b6c4614ebd07\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.162872 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de5609da-6273-4076-9f02-b6c4614ebd07-config-data\") pod \"glance-default-internal-api-0\" (UID: \"de5609da-6273-4076-9f02-b6c4614ebd07\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.162921 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"de5609da-6273-4076-9f02-b6c4614ebd07\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.163060 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de5609da-6273-4076-9f02-b6c4614ebd07-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"de5609da-6273-4076-9f02-b6c4614ebd07\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.178721 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0023a9b5-248d-48cc-8560-b109ff59fc04" 
path="/var/lib/kubelet/pods/0023a9b5-248d-48cc-8560-b109ff59fc04/volumes" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.179943 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b11b82a-b620-435a-9274-fb3419c35a72" path="/var/lib/kubelet/pods/0b11b82a-b620-435a-9274-fb3419c35a72/volumes" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.264440 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3bab132-2f43-4321-99c6-6164f0f93e86-logs\") pod \"glance-default-external-api-0\" (UID: \"b3bab132-2f43-4321-99c6-6164f0f93e86\") " pod="openstack/glance-default-external-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.264525 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de5609da-6273-4076-9f02-b6c4614ebd07-logs\") pod \"glance-default-internal-api-0\" (UID: \"de5609da-6273-4076-9f02-b6c4614ebd07\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.264562 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3bab132-2f43-4321-99c6-6164f0f93e86-config-data\") pod \"glance-default-external-api-0\" (UID: \"b3bab132-2f43-4321-99c6-6164f0f93e86\") " pod="openstack/glance-default-external-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.264581 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de5609da-6273-4076-9f02-b6c4614ebd07-scripts\") pod \"glance-default-internal-api-0\" (UID: \"de5609da-6273-4076-9f02-b6c4614ebd07\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.264598 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de5609da-6273-4076-9f02-b6c4614ebd07-config-data\") pod \"glance-default-internal-api-0\" (UID: \"de5609da-6273-4076-9f02-b6c4614ebd07\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.264615 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b3bab132-2f43-4321-99c6-6164f0f93e86-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b3bab132-2f43-4321-99c6-6164f0f93e86\") " pod="openstack/glance-default-external-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.264636 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"de5609da-6273-4076-9f02-b6c4614ebd07\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.264667 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3bab132-2f43-4321-99c6-6164f0f93e86-scripts\") pod \"glance-default-external-api-0\" (UID: \"b3bab132-2f43-4321-99c6-6164f0f93e86\") " pod="openstack/glance-default-external-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.264685 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3bab132-2f43-4321-99c6-6164f0f93e86-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b3bab132-2f43-4321-99c6-6164f0f93e86\") " pod="openstack/glance-default-external-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.264713 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"b3bab132-2f43-4321-99c6-6164f0f93e86\") " pod="openstack/glance-default-external-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.264729 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3bab132-2f43-4321-99c6-6164f0f93e86-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b3bab132-2f43-4321-99c6-6164f0f93e86\") " pod="openstack/glance-default-external-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.264770 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de5609da-6273-4076-9f02-b6c4614ebd07-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"de5609da-6273-4076-9f02-b6c4614ebd07\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.264789 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6np82\" (UniqueName: \"kubernetes.io/projected/b3bab132-2f43-4321-99c6-6164f0f93e86-kube-api-access-6np82\") pod \"glance-default-external-api-0\" (UID: \"b3bab132-2f43-4321-99c6-6164f0f93e86\") " pod="openstack/glance-default-external-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.264966 4719 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"de5609da-6273-4076-9f02-b6c4614ebd07\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.265089 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/de5609da-6273-4076-9f02-b6c4614ebd07-logs\") pod \"glance-default-internal-api-0\" (UID: \"de5609da-6273-4076-9f02-b6c4614ebd07\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.265196 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/de5609da-6273-4076-9f02-b6c4614ebd07-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"de5609da-6273-4076-9f02-b6c4614ebd07\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.265313 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd66v\" (UniqueName: \"kubernetes.io/projected/de5609da-6273-4076-9f02-b6c4614ebd07-kube-api-access-kd66v\") pod \"glance-default-internal-api-0\" (UID: \"de5609da-6273-4076-9f02-b6c4614ebd07\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.265457 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de5609da-6273-4076-9f02-b6c4614ebd07-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"de5609da-6273-4076-9f02-b6c4614ebd07\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.265587 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/de5609da-6273-4076-9f02-b6c4614ebd07-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"de5609da-6273-4076-9f02-b6c4614ebd07\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.270777 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/de5609da-6273-4076-9f02-b6c4614ebd07-scripts\") pod \"glance-default-internal-api-0\" (UID: \"de5609da-6273-4076-9f02-b6c4614ebd07\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.270880 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de5609da-6273-4076-9f02-b6c4614ebd07-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"de5609da-6273-4076-9f02-b6c4614ebd07\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.272078 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de5609da-6273-4076-9f02-b6c4614ebd07-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"de5609da-6273-4076-9f02-b6c4614ebd07\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.274023 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de5609da-6273-4076-9f02-b6c4614ebd07-config-data\") pod \"glance-default-internal-api-0\" (UID: \"de5609da-6273-4076-9f02-b6c4614ebd07\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.287198 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd66v\" (UniqueName: \"kubernetes.io/projected/de5609da-6273-4076-9f02-b6c4614ebd07-kube-api-access-kd66v\") pod \"glance-default-internal-api-0\" (UID: \"de5609da-6273-4076-9f02-b6c4614ebd07\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.304299 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"de5609da-6273-4076-9f02-b6c4614ebd07\") " pod="openstack/glance-default-internal-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.367582 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3bab132-2f43-4321-99c6-6164f0f93e86-logs\") pod \"glance-default-external-api-0\" (UID: \"b3bab132-2f43-4321-99c6-6164f0f93e86\") " pod="openstack/glance-default-external-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.367653 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3bab132-2f43-4321-99c6-6164f0f93e86-config-data\") pod \"glance-default-external-api-0\" (UID: \"b3bab132-2f43-4321-99c6-6164f0f93e86\") " pod="openstack/glance-default-external-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.367683 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b3bab132-2f43-4321-99c6-6164f0f93e86-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b3bab132-2f43-4321-99c6-6164f0f93e86\") " pod="openstack/glance-default-external-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.367708 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3bab132-2f43-4321-99c6-6164f0f93e86-scripts\") pod \"glance-default-external-api-0\" (UID: \"b3bab132-2f43-4321-99c6-6164f0f93e86\") " pod="openstack/glance-default-external-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.367731 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3bab132-2f43-4321-99c6-6164f0f93e86-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b3bab132-2f43-4321-99c6-6164f0f93e86\") " 
pod="openstack/glance-default-external-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.367772 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"b3bab132-2f43-4321-99c6-6164f0f93e86\") " pod="openstack/glance-default-external-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.367794 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3bab132-2f43-4321-99c6-6164f0f93e86-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b3bab132-2f43-4321-99c6-6164f0f93e86\") " pod="openstack/glance-default-external-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.367854 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6np82\" (UniqueName: \"kubernetes.io/projected/b3bab132-2f43-4321-99c6-6164f0f93e86-kube-api-access-6np82\") pod \"glance-default-external-api-0\" (UID: \"b3bab132-2f43-4321-99c6-6164f0f93e86\") " pod="openstack/glance-default-external-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.368542 4719 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"b3bab132-2f43-4321-99c6-6164f0f93e86\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.369000 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3bab132-2f43-4321-99c6-6164f0f93e86-logs\") pod \"glance-default-external-api-0\" (UID: \"b3bab132-2f43-4321-99c6-6164f0f93e86\") " pod="openstack/glance-default-external-api-0" Oct 09 15:37:17 crc 
kubenswrapper[4719]: I1009 15:37:17.369103 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b3bab132-2f43-4321-99c6-6164f0f93e86-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b3bab132-2f43-4321-99c6-6164f0f93e86\") " pod="openstack/glance-default-external-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.371918 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3bab132-2f43-4321-99c6-6164f0f93e86-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b3bab132-2f43-4321-99c6-6164f0f93e86\") " pod="openstack/glance-default-external-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.374120 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3bab132-2f43-4321-99c6-6164f0f93e86-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b3bab132-2f43-4321-99c6-6164f0f93e86\") " pod="openstack/glance-default-external-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.374302 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3bab132-2f43-4321-99c6-6164f0f93e86-config-data\") pod \"glance-default-external-api-0\" (UID: \"b3bab132-2f43-4321-99c6-6164f0f93e86\") " pod="openstack/glance-default-external-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.376416 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3bab132-2f43-4321-99c6-6164f0f93e86-scripts\") pod \"glance-default-external-api-0\" (UID: \"b3bab132-2f43-4321-99c6-6164f0f93e86\") " pod="openstack/glance-default-external-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.387891 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-6np82\" (UniqueName: \"kubernetes.io/projected/b3bab132-2f43-4321-99c6-6164f0f93e86-kube-api-access-6np82\") pod \"glance-default-external-api-0\" (UID: \"b3bab132-2f43-4321-99c6-6164f0f93e86\") " pod="openstack/glance-default-external-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.392744 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.397129 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"b3bab132-2f43-4321-99c6-6164f0f93e86\") " pod="openstack/glance-default-external-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.439436 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.885472 4719 generic.go:334] "Generic (PLEG): container finished" podID="884e1376-b069-4cc2-8ff9-3a93832be9c0" containerID="fba13d50c006f23823b02e15fe2b03ff6a5cd31c69b6fb711e792673f3c2240e" exitCode=0 Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.885755 4719 generic.go:334] "Generic (PLEG): container finished" podID="884e1376-b069-4cc2-8ff9-3a93832be9c0" containerID="c8abf1e233fc56d5170657034946a962fc9fa9e943c9da8f4f14fc879d44496d" exitCode=2 Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.885769 4719 generic.go:334] "Generic (PLEG): container finished" podID="884e1376-b069-4cc2-8ff9-3a93832be9c0" containerID="7f50f73431a793c6229841375f222cf05c1df3977c86b4fb6b18eccb71944cf1" exitCode=0 Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.885638 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"884e1376-b069-4cc2-8ff9-3a93832be9c0","Type":"ContainerDied","Data":"fba13d50c006f23823b02e15fe2b03ff6a5cd31c69b6fb711e792673f3c2240e"} Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.885851 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"884e1376-b069-4cc2-8ff9-3a93832be9c0","Type":"ContainerDied","Data":"c8abf1e233fc56d5170657034946a962fc9fa9e943c9da8f4f14fc879d44496d"} Oct 09 15:37:17 crc kubenswrapper[4719]: I1009 15:37:17.885867 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"884e1376-b069-4cc2-8ff9-3a93832be9c0","Type":"ContainerDied","Data":"7f50f73431a793c6229841375f222cf05c1df3977c86b4fb6b18eccb71944cf1"} Oct 09 15:37:17 crc kubenswrapper[4719]: W1009 15:37:17.991663 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde5609da_6273_4076_9f02_b6c4614ebd07.slice/crio-165a3bd37799abacec9f436b7dc31e88fc8dd78e26b4d541cea0323a2581bdda WatchSource:0}: Error finding container 165a3bd37799abacec9f436b7dc31e88fc8dd78e26b4d541cea0323a2581bdda: Status 404 returned error can't find the container with id 165a3bd37799abacec9f436b7dc31e88fc8dd78e26b4d541cea0323a2581bdda Oct 09 15:37:18 crc kubenswrapper[4719]: I1009 15:37:18.010682 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 15:37:18 crc kubenswrapper[4719]: I1009 15:37:18.197284 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 15:37:18 crc kubenswrapper[4719]: I1009 15:37:18.905802 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b3bab132-2f43-4321-99c6-6164f0f93e86","Type":"ContainerStarted","Data":"6bb4041ac9e1da9e9e3ec8decbd7935542833ad0f64ed04af263a949c3941339"} Oct 09 15:37:18 crc kubenswrapper[4719]: I1009 15:37:18.908220 4719 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"de5609da-6273-4076-9f02-b6c4614ebd07","Type":"ContainerStarted","Data":"79e929812e78d2ba100651386ba6363e596e2d827ded91655dafcd98e06fe83a"} Oct 09 15:37:18 crc kubenswrapper[4719]: I1009 15:37:18.908260 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"de5609da-6273-4076-9f02-b6c4614ebd07","Type":"ContainerStarted","Data":"165a3bd37799abacec9f436b7dc31e88fc8dd78e26b4d541cea0323a2581bdda"} Oct 09 15:37:19 crc kubenswrapper[4719]: I1009 15:37:19.918768 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b3bab132-2f43-4321-99c6-6164f0f93e86","Type":"ContainerStarted","Data":"500d40f67a08439b081228f2dd8b3d697621059c9f8c87d668aa4a8d7228c761"} Oct 09 15:37:19 crc kubenswrapper[4719]: I1009 15:37:19.919504 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b3bab132-2f43-4321-99c6-6164f0f93e86","Type":"ContainerStarted","Data":"d6034ced47565be3c0c3f139d4a5da8fd5cfdae652da89f482b43628bb3a6ed6"} Oct 09 15:37:19 crc kubenswrapper[4719]: I1009 15:37:19.920698 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"de5609da-6273-4076-9f02-b6c4614ebd07","Type":"ContainerStarted","Data":"88216973629fc58013d19f5ee135afb8f84fddf7620488777fe951605c246d4d"} Oct 09 15:37:19 crc kubenswrapper[4719]: I1009 15:37:19.960529 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.960503815 podStartE2EDuration="3.960503815s" podCreationTimestamp="2025-10-09 15:37:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:37:19.93881552 +0000 UTC m=+1145.448526825" 
watchObservedRunningTime="2025-10-09 15:37:19.960503815 +0000 UTC m=+1145.470215100" Oct 09 15:37:19 crc kubenswrapper[4719]: I1009 15:37:19.987066 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.987044496 podStartE2EDuration="3.987044496s" podCreationTimestamp="2025-10-09 15:37:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:37:19.983099939 +0000 UTC m=+1145.492811224" watchObservedRunningTime="2025-10-09 15:37:19.987044496 +0000 UTC m=+1145.496755781" Oct 09 15:37:20 crc kubenswrapper[4719]: I1009 15:37:20.663836 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 15:37:20 crc kubenswrapper[4719]: I1009 15:37:20.740016 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/884e1376-b069-4cc2-8ff9-3a93832be9c0-combined-ca-bundle\") pod \"884e1376-b069-4cc2-8ff9-3a93832be9c0\" (UID: \"884e1376-b069-4cc2-8ff9-3a93832be9c0\") " Oct 09 15:37:20 crc kubenswrapper[4719]: I1009 15:37:20.740068 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/884e1376-b069-4cc2-8ff9-3a93832be9c0-run-httpd\") pod \"884e1376-b069-4cc2-8ff9-3a93832be9c0\" (UID: \"884e1376-b069-4cc2-8ff9-3a93832be9c0\") " Oct 09 15:37:20 crc kubenswrapper[4719]: I1009 15:37:20.740114 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/884e1376-b069-4cc2-8ff9-3a93832be9c0-log-httpd\") pod \"884e1376-b069-4cc2-8ff9-3a93832be9c0\" (UID: \"884e1376-b069-4cc2-8ff9-3a93832be9c0\") " Oct 09 15:37:20 crc kubenswrapper[4719]: I1009 15:37:20.740137 4719 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-p5sc9\" (UniqueName: \"kubernetes.io/projected/884e1376-b069-4cc2-8ff9-3a93832be9c0-kube-api-access-p5sc9\") pod \"884e1376-b069-4cc2-8ff9-3a93832be9c0\" (UID: \"884e1376-b069-4cc2-8ff9-3a93832be9c0\") " Oct 09 15:37:20 crc kubenswrapper[4719]: I1009 15:37:20.740163 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/884e1376-b069-4cc2-8ff9-3a93832be9c0-config-data\") pod \"884e1376-b069-4cc2-8ff9-3a93832be9c0\" (UID: \"884e1376-b069-4cc2-8ff9-3a93832be9c0\") " Oct 09 15:37:20 crc kubenswrapper[4719]: I1009 15:37:20.740179 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/884e1376-b069-4cc2-8ff9-3a93832be9c0-scripts\") pod \"884e1376-b069-4cc2-8ff9-3a93832be9c0\" (UID: \"884e1376-b069-4cc2-8ff9-3a93832be9c0\") " Oct 09 15:37:20 crc kubenswrapper[4719]: I1009 15:37:20.740659 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/884e1376-b069-4cc2-8ff9-3a93832be9c0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "884e1376-b069-4cc2-8ff9-3a93832be9c0" (UID: "884e1376-b069-4cc2-8ff9-3a93832be9c0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:37:20 crc kubenswrapper[4719]: I1009 15:37:20.740782 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/884e1376-b069-4cc2-8ff9-3a93832be9c0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "884e1376-b069-4cc2-8ff9-3a93832be9c0" (UID: "884e1376-b069-4cc2-8ff9-3a93832be9c0"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:37:20 crc kubenswrapper[4719]: I1009 15:37:20.741366 4719 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/884e1376-b069-4cc2-8ff9-3a93832be9c0-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:20 crc kubenswrapper[4719]: I1009 15:37:20.741394 4719 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/884e1376-b069-4cc2-8ff9-3a93832be9c0-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:20 crc kubenswrapper[4719]: I1009 15:37:20.745908 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/884e1376-b069-4cc2-8ff9-3a93832be9c0-kube-api-access-p5sc9" (OuterVolumeSpecName: "kube-api-access-p5sc9") pod "884e1376-b069-4cc2-8ff9-3a93832be9c0" (UID: "884e1376-b069-4cc2-8ff9-3a93832be9c0"). InnerVolumeSpecName "kube-api-access-p5sc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:37:20 crc kubenswrapper[4719]: I1009 15:37:20.748467 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/884e1376-b069-4cc2-8ff9-3a93832be9c0-scripts" (OuterVolumeSpecName: "scripts") pod "884e1376-b069-4cc2-8ff9-3a93832be9c0" (UID: "884e1376-b069-4cc2-8ff9-3a93832be9c0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:37:20 crc kubenswrapper[4719]: I1009 15:37:20.834569 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/884e1376-b069-4cc2-8ff9-3a93832be9c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "884e1376-b069-4cc2-8ff9-3a93832be9c0" (UID: "884e1376-b069-4cc2-8ff9-3a93832be9c0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:37:20 crc kubenswrapper[4719]: I1009 15:37:20.842745 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/884e1376-b069-4cc2-8ff9-3a93832be9c0-sg-core-conf-yaml\") pod \"884e1376-b069-4cc2-8ff9-3a93832be9c0\" (UID: \"884e1376-b069-4cc2-8ff9-3a93832be9c0\") " Oct 09 15:37:20 crc kubenswrapper[4719]: I1009 15:37:20.843477 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/884e1376-b069-4cc2-8ff9-3a93832be9c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:20 crc kubenswrapper[4719]: I1009 15:37:20.843506 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5sc9\" (UniqueName: \"kubernetes.io/projected/884e1376-b069-4cc2-8ff9-3a93832be9c0-kube-api-access-p5sc9\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:20 crc kubenswrapper[4719]: I1009 15:37:20.843520 4719 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/884e1376-b069-4cc2-8ff9-3a93832be9c0-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:20 crc kubenswrapper[4719]: I1009 15:37:20.867816 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/884e1376-b069-4cc2-8ff9-3a93832be9c0-config-data" (OuterVolumeSpecName: "config-data") pod "884e1376-b069-4cc2-8ff9-3a93832be9c0" (UID: "884e1376-b069-4cc2-8ff9-3a93832be9c0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:37:20 crc kubenswrapper[4719]: I1009 15:37:20.869579 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/884e1376-b069-4cc2-8ff9-3a93832be9c0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "884e1376-b069-4cc2-8ff9-3a93832be9c0" (UID: "884e1376-b069-4cc2-8ff9-3a93832be9c0"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:37:20 crc kubenswrapper[4719]: I1009 15:37:20.935773 4719 generic.go:334] "Generic (PLEG): container finished" podID="884e1376-b069-4cc2-8ff9-3a93832be9c0" containerID="f599e284133cf7503eb12f1ae700039756cd0f9f3e32c57a7059aee913297b19" exitCode=0 Oct 09 15:37:20 crc kubenswrapper[4719]: I1009 15:37:20.935881 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"884e1376-b069-4cc2-8ff9-3a93832be9c0","Type":"ContainerDied","Data":"f599e284133cf7503eb12f1ae700039756cd0f9f3e32c57a7059aee913297b19"} Oct 09 15:37:20 crc kubenswrapper[4719]: I1009 15:37:20.935917 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"884e1376-b069-4cc2-8ff9-3a93832be9c0","Type":"ContainerDied","Data":"bfa1b789b2e2f84227dbc59bc494a0f4a8a028ae9e96ef4c14dd830a02f318e4"} Oct 09 15:37:20 crc kubenswrapper[4719]: I1009 15:37:20.935934 4719 scope.go:117] "RemoveContainer" containerID="fba13d50c006f23823b02e15fe2b03ff6a5cd31c69b6fb711e792673f3c2240e" Oct 09 15:37:20 crc kubenswrapper[4719]: I1009 15:37:20.935931 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 15:37:20 crc kubenswrapper[4719]: I1009 15:37:20.945817 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/884e1376-b069-4cc2-8ff9-3a93832be9c0-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:20 crc kubenswrapper[4719]: I1009 15:37:20.945851 4719 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/884e1376-b069-4cc2-8ff9-3a93832be9c0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:20 crc kubenswrapper[4719]: I1009 15:37:20.958646 4719 scope.go:117] "RemoveContainer" containerID="c8abf1e233fc56d5170657034946a962fc9fa9e943c9da8f4f14fc879d44496d" Oct 09 15:37:20 crc kubenswrapper[4719]: I1009 15:37:20.986167 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 15:37:20 crc kubenswrapper[4719]: I1009 15:37:20.997505 4719 scope.go:117] "RemoveContainer" containerID="f599e284133cf7503eb12f1ae700039756cd0f9f3e32c57a7059aee913297b19" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.004931 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.020898 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 09 15:37:21 crc kubenswrapper[4719]: E1009 15:37:21.022266 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="884e1376-b069-4cc2-8ff9-3a93832be9c0" containerName="ceilometer-central-agent" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.022296 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="884e1376-b069-4cc2-8ff9-3a93832be9c0" containerName="ceilometer-central-agent" Oct 09 15:37:21 crc kubenswrapper[4719]: E1009 15:37:21.022316 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="884e1376-b069-4cc2-8ff9-3a93832be9c0" containerName="sg-core" Oct 
09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.022324 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="884e1376-b069-4cc2-8ff9-3a93832be9c0" containerName="sg-core" Oct 09 15:37:21 crc kubenswrapper[4719]: E1009 15:37:21.022345 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="884e1376-b069-4cc2-8ff9-3a93832be9c0" containerName="ceilometer-notification-agent" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.022375 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="884e1376-b069-4cc2-8ff9-3a93832be9c0" containerName="ceilometer-notification-agent" Oct 09 15:37:21 crc kubenswrapper[4719]: E1009 15:37:21.022400 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="884e1376-b069-4cc2-8ff9-3a93832be9c0" containerName="proxy-httpd" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.022407 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="884e1376-b069-4cc2-8ff9-3a93832be9c0" containerName="proxy-httpd" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.022628 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="884e1376-b069-4cc2-8ff9-3a93832be9c0" containerName="proxy-httpd" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.022660 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="884e1376-b069-4cc2-8ff9-3a93832be9c0" containerName="sg-core" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.022673 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="884e1376-b069-4cc2-8ff9-3a93832be9c0" containerName="ceilometer-central-agent" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.022681 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="884e1376-b069-4cc2-8ff9-3a93832be9c0" containerName="ceilometer-notification-agent" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.024903 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.027265 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.027586 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.029522 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.035434 4719 scope.go:117] "RemoveContainer" containerID="7f50f73431a793c6229841375f222cf05c1df3977c86b4fb6b18eccb71944cf1" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.065441 4719 scope.go:117] "RemoveContainer" containerID="fba13d50c006f23823b02e15fe2b03ff6a5cd31c69b6fb711e792673f3c2240e" Oct 09 15:37:21 crc kubenswrapper[4719]: E1009 15:37:21.065996 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fba13d50c006f23823b02e15fe2b03ff6a5cd31c69b6fb711e792673f3c2240e\": container with ID starting with fba13d50c006f23823b02e15fe2b03ff6a5cd31c69b6fb711e792673f3c2240e not found: ID does not exist" containerID="fba13d50c006f23823b02e15fe2b03ff6a5cd31c69b6fb711e792673f3c2240e" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.066054 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fba13d50c006f23823b02e15fe2b03ff6a5cd31c69b6fb711e792673f3c2240e"} err="failed to get container status \"fba13d50c006f23823b02e15fe2b03ff6a5cd31c69b6fb711e792673f3c2240e\": rpc error: code = NotFound desc = could not find container \"fba13d50c006f23823b02e15fe2b03ff6a5cd31c69b6fb711e792673f3c2240e\": container with ID starting with fba13d50c006f23823b02e15fe2b03ff6a5cd31c69b6fb711e792673f3c2240e not found: ID does not exist" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 
15:37:21.066089 4719 scope.go:117] "RemoveContainer" containerID="c8abf1e233fc56d5170657034946a962fc9fa9e943c9da8f4f14fc879d44496d" Oct 09 15:37:21 crc kubenswrapper[4719]: E1009 15:37:21.067860 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8abf1e233fc56d5170657034946a962fc9fa9e943c9da8f4f14fc879d44496d\": container with ID starting with c8abf1e233fc56d5170657034946a962fc9fa9e943c9da8f4f14fc879d44496d not found: ID does not exist" containerID="c8abf1e233fc56d5170657034946a962fc9fa9e943c9da8f4f14fc879d44496d" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.067900 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8abf1e233fc56d5170657034946a962fc9fa9e943c9da8f4f14fc879d44496d"} err="failed to get container status \"c8abf1e233fc56d5170657034946a962fc9fa9e943c9da8f4f14fc879d44496d\": rpc error: code = NotFound desc = could not find container \"c8abf1e233fc56d5170657034946a962fc9fa9e943c9da8f4f14fc879d44496d\": container with ID starting with c8abf1e233fc56d5170657034946a962fc9fa9e943c9da8f4f14fc879d44496d not found: ID does not exist" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.067925 4719 scope.go:117] "RemoveContainer" containerID="f599e284133cf7503eb12f1ae700039756cd0f9f3e32c57a7059aee913297b19" Oct 09 15:37:21 crc kubenswrapper[4719]: E1009 15:37:21.068232 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f599e284133cf7503eb12f1ae700039756cd0f9f3e32c57a7059aee913297b19\": container with ID starting with f599e284133cf7503eb12f1ae700039756cd0f9f3e32c57a7059aee913297b19 not found: ID does not exist" containerID="f599e284133cf7503eb12f1ae700039756cd0f9f3e32c57a7059aee913297b19" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.068253 4719 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f599e284133cf7503eb12f1ae700039756cd0f9f3e32c57a7059aee913297b19"} err="failed to get container status \"f599e284133cf7503eb12f1ae700039756cd0f9f3e32c57a7059aee913297b19\": rpc error: code = NotFound desc = could not find container \"f599e284133cf7503eb12f1ae700039756cd0f9f3e32c57a7059aee913297b19\": container with ID starting with f599e284133cf7503eb12f1ae700039756cd0f9f3e32c57a7059aee913297b19 not found: ID does not exist" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.068267 4719 scope.go:117] "RemoveContainer" containerID="7f50f73431a793c6229841375f222cf05c1df3977c86b4fb6b18eccb71944cf1" Oct 09 15:37:21 crc kubenswrapper[4719]: E1009 15:37:21.068641 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f50f73431a793c6229841375f222cf05c1df3977c86b4fb6b18eccb71944cf1\": container with ID starting with 7f50f73431a793c6229841375f222cf05c1df3977c86b4fb6b18eccb71944cf1 not found: ID does not exist" containerID="7f50f73431a793c6229841375f222cf05c1df3977c86b4fb6b18eccb71944cf1" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.068673 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f50f73431a793c6229841375f222cf05c1df3977c86b4fb6b18eccb71944cf1"} err="failed to get container status \"7f50f73431a793c6229841375f222cf05c1df3977c86b4fb6b18eccb71944cf1\": rpc error: code = NotFound desc = could not find container \"7f50f73431a793c6229841375f222cf05c1df3977c86b4fb6b18eccb71944cf1\": container with ID starting with 7f50f73431a793c6229841375f222cf05c1df3977c86b4fb6b18eccb71944cf1 not found: ID does not exist" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.148598 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61b5a962-57a0-4466-8be1-a849530b1c91-run-httpd\") pod \"ceilometer-0\" (UID: 
\"61b5a962-57a0-4466-8be1-a849530b1c91\") " pod="openstack/ceilometer-0" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.148720 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61b5a962-57a0-4466-8be1-a849530b1c91-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"61b5a962-57a0-4466-8be1-a849530b1c91\") " pod="openstack/ceilometer-0" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.148748 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61b5a962-57a0-4466-8be1-a849530b1c91-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"61b5a962-57a0-4466-8be1-a849530b1c91\") " pod="openstack/ceilometer-0" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.148771 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61b5a962-57a0-4466-8be1-a849530b1c91-config-data\") pod \"ceilometer-0\" (UID: \"61b5a962-57a0-4466-8be1-a849530b1c91\") " pod="openstack/ceilometer-0" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.148796 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61b5a962-57a0-4466-8be1-a849530b1c91-log-httpd\") pod \"ceilometer-0\" (UID: \"61b5a962-57a0-4466-8be1-a849530b1c91\") " pod="openstack/ceilometer-0" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.148820 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61b5a962-57a0-4466-8be1-a849530b1c91-scripts\") pod \"ceilometer-0\" (UID: \"61b5a962-57a0-4466-8be1-a849530b1c91\") " pod="openstack/ceilometer-0" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.148903 4719 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcfcp\" (UniqueName: \"kubernetes.io/projected/61b5a962-57a0-4466-8be1-a849530b1c91-kube-api-access-wcfcp\") pod \"ceilometer-0\" (UID: \"61b5a962-57a0-4466-8be1-a849530b1c91\") " pod="openstack/ceilometer-0" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.180277 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="884e1376-b069-4cc2-8ff9-3a93832be9c0" path="/var/lib/kubelet/pods/884e1376-b069-4cc2-8ff9-3a93832be9c0/volumes" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.250561 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61b5a962-57a0-4466-8be1-a849530b1c91-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"61b5a962-57a0-4466-8be1-a849530b1c91\") " pod="openstack/ceilometer-0" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.250719 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61b5a962-57a0-4466-8be1-a849530b1c91-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"61b5a962-57a0-4466-8be1-a849530b1c91\") " pod="openstack/ceilometer-0" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.250771 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61b5a962-57a0-4466-8be1-a849530b1c91-config-data\") pod \"ceilometer-0\" (UID: \"61b5a962-57a0-4466-8be1-a849530b1c91\") " pod="openstack/ceilometer-0" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.250811 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61b5a962-57a0-4466-8be1-a849530b1c91-log-httpd\") pod \"ceilometer-0\" (UID: \"61b5a962-57a0-4466-8be1-a849530b1c91\") " pod="openstack/ceilometer-0" Oct 09 
15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.250883 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61b5a962-57a0-4466-8be1-a849530b1c91-scripts\") pod \"ceilometer-0\" (UID: \"61b5a962-57a0-4466-8be1-a849530b1c91\") " pod="openstack/ceilometer-0" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.250979 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcfcp\" (UniqueName: \"kubernetes.io/projected/61b5a962-57a0-4466-8be1-a849530b1c91-kube-api-access-wcfcp\") pod \"ceilometer-0\" (UID: \"61b5a962-57a0-4466-8be1-a849530b1c91\") " pod="openstack/ceilometer-0" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.251077 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61b5a962-57a0-4466-8be1-a849530b1c91-run-httpd\") pod \"ceilometer-0\" (UID: \"61b5a962-57a0-4466-8be1-a849530b1c91\") " pod="openstack/ceilometer-0" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.251664 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61b5a962-57a0-4466-8be1-a849530b1c91-log-httpd\") pod \"ceilometer-0\" (UID: \"61b5a962-57a0-4466-8be1-a849530b1c91\") " pod="openstack/ceilometer-0" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.252835 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61b5a962-57a0-4466-8be1-a849530b1c91-run-httpd\") pod \"ceilometer-0\" (UID: \"61b5a962-57a0-4466-8be1-a849530b1c91\") " pod="openstack/ceilometer-0" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.254242 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61b5a962-57a0-4466-8be1-a849530b1c91-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"61b5a962-57a0-4466-8be1-a849530b1c91\") " pod="openstack/ceilometer-0" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.255175 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61b5a962-57a0-4466-8be1-a849530b1c91-scripts\") pod \"ceilometer-0\" (UID: \"61b5a962-57a0-4466-8be1-a849530b1c91\") " pod="openstack/ceilometer-0" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.255449 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61b5a962-57a0-4466-8be1-a849530b1c91-config-data\") pod \"ceilometer-0\" (UID: \"61b5a962-57a0-4466-8be1-a849530b1c91\") " pod="openstack/ceilometer-0" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.256447 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61b5a962-57a0-4466-8be1-a849530b1c91-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"61b5a962-57a0-4466-8be1-a849530b1c91\") " pod="openstack/ceilometer-0" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.274225 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcfcp\" (UniqueName: \"kubernetes.io/projected/61b5a962-57a0-4466-8be1-a849530b1c91-kube-api-access-wcfcp\") pod \"ceilometer-0\" (UID: \"61b5a962-57a0-4466-8be1-a849530b1c91\") " pod="openstack/ceilometer-0" Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.345226 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 15:37:21 crc kubenswrapper[4719]: W1009 15:37:21.787523 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61b5a962_57a0_4466_8be1_a849530b1c91.slice/crio-f35099ec5ce95e25cda08d6be9fe77bbc40549cc4696bc4b175ba17076026ec4 WatchSource:0}: Error finding container f35099ec5ce95e25cda08d6be9fe77bbc40549cc4696bc4b175ba17076026ec4: Status 404 returned error can't find the container with id f35099ec5ce95e25cda08d6be9fe77bbc40549cc4696bc4b175ba17076026ec4 Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.788336 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 15:37:21 crc kubenswrapper[4719]: I1009 15:37:21.952648 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61b5a962-57a0-4466-8be1-a849530b1c91","Type":"ContainerStarted","Data":"f35099ec5ce95e25cda08d6be9fe77bbc40549cc4696bc4b175ba17076026ec4"} Oct 09 15:37:22 crc kubenswrapper[4719]: I1009 15:37:22.965487 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61b5a962-57a0-4466-8be1-a849530b1c91","Type":"ContainerStarted","Data":"80329c8d7a2a1c949a6503e63844db2ff9e23e803b8872107b69dc67114d66ab"} Oct 09 15:37:22 crc kubenswrapper[4719]: I1009 15:37:22.966452 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61b5a962-57a0-4466-8be1-a849530b1c91","Type":"ContainerStarted","Data":"4dcfef3b54b8269d6b3f4510696eff2acdfa0d8cb9ab27306d81732318677ed1"} Oct 09 15:37:22 crc kubenswrapper[4719]: I1009 15:37:22.966493 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61b5a962-57a0-4466-8be1-a849530b1c91","Type":"ContainerStarted","Data":"a2bc5465f33cd28f2c2c69b633e3b88f8f3174bf8700a72320333592a9bb3817"} Oct 09 15:37:24 crc kubenswrapper[4719]: I1009 
15:37:24.982981 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61b5a962-57a0-4466-8be1-a849530b1c91","Type":"ContainerStarted","Data":"f39ff661dd68e44c4e4f94bce7d60cb6704b89dbd836578aeb69849a43ca20eb"} Oct 09 15:37:24 crc kubenswrapper[4719]: I1009 15:37:24.983594 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 09 15:37:25 crc kubenswrapper[4719]: I1009 15:37:25.012069 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.769729129 podStartE2EDuration="5.012052931s" podCreationTimestamp="2025-10-09 15:37:20 +0000 UTC" firstStartedPulling="2025-10-09 15:37:21.789763415 +0000 UTC m=+1147.299474700" lastFinishedPulling="2025-10-09 15:37:24.032087217 +0000 UTC m=+1149.541798502" observedRunningTime="2025-10-09 15:37:25.007393142 +0000 UTC m=+1150.517104437" watchObservedRunningTime="2025-10-09 15:37:25.012052931 +0000 UTC m=+1150.521764216" Oct 09 15:37:26 crc kubenswrapper[4719]: I1009 15:37:26.792187 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 09 15:37:26 crc kubenswrapper[4719]: I1009 15:37:26.792544 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Oct 09 15:37:26 crc kubenswrapper[4719]: I1009 15:37:26.820125 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Oct 09 15:37:27 crc kubenswrapper[4719]: I1009 15:37:27.026460 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Oct 09 15:37:27 crc kubenswrapper[4719]: I1009 15:37:27.394110 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 09 15:37:27 crc kubenswrapper[4719]: I1009 15:37:27.394154 4719 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 09 15:37:27 crc kubenswrapper[4719]: I1009 15:37:27.434082 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 09 15:37:27 crc kubenswrapper[4719]: I1009 15:37:27.439746 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 09 15:37:27 crc kubenswrapper[4719]: I1009 15:37:27.440784 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 09 15:37:27 crc kubenswrapper[4719]: I1009 15:37:27.444157 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 09 15:37:27 crc kubenswrapper[4719]: I1009 15:37:27.493549 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 09 15:37:27 crc kubenswrapper[4719]: I1009 15:37:27.503796 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 09 15:37:28 crc kubenswrapper[4719]: I1009 15:37:28.009045 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 09 15:37:28 crc kubenswrapper[4719]: I1009 15:37:28.009382 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 09 15:37:28 crc kubenswrapper[4719]: I1009 15:37:28.009414 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 09 15:37:28 crc kubenswrapper[4719]: I1009 15:37:28.009426 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 09 15:37:29 crc kubenswrapper[4719]: I1009 15:37:29.018796 4719 
generic.go:334] "Generic (PLEG): container finished" podID="de61779c-4ad9-40bd-908e-27b82b5c82cb" containerID="690296cf308dd5fe94519d81b80bdbbd8feb6d565660c122d20de7a8f1fba837" exitCode=0 Oct 09 15:37:29 crc kubenswrapper[4719]: I1009 15:37:29.019920 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-r57qh" event={"ID":"de61779c-4ad9-40bd-908e-27b82b5c82cb","Type":"ContainerDied","Data":"690296cf308dd5fe94519d81b80bdbbd8feb6d565660c122d20de7a8f1fba837"} Oct 09 15:37:29 crc kubenswrapper[4719]: I1009 15:37:29.872669 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 09 15:37:29 crc kubenswrapper[4719]: I1009 15:37:29.878833 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 09 15:37:29 crc kubenswrapper[4719]: I1009 15:37:29.889003 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 09 15:37:29 crc kubenswrapper[4719]: I1009 15:37:29.998316 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 09 15:37:30 crc kubenswrapper[4719]: I1009 15:37:30.528791 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-r57qh" Oct 09 15:37:30 crc kubenswrapper[4719]: I1009 15:37:30.644086 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de61779c-4ad9-40bd-908e-27b82b5c82cb-scripts\") pod \"de61779c-4ad9-40bd-908e-27b82b5c82cb\" (UID: \"de61779c-4ad9-40bd-908e-27b82b5c82cb\") " Oct 09 15:37:30 crc kubenswrapper[4719]: I1009 15:37:30.644394 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de61779c-4ad9-40bd-908e-27b82b5c82cb-combined-ca-bundle\") pod \"de61779c-4ad9-40bd-908e-27b82b5c82cb\" (UID: \"de61779c-4ad9-40bd-908e-27b82b5c82cb\") " Oct 09 15:37:30 crc kubenswrapper[4719]: I1009 15:37:30.644529 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de61779c-4ad9-40bd-908e-27b82b5c82cb-config-data\") pod \"de61779c-4ad9-40bd-908e-27b82b5c82cb\" (UID: \"de61779c-4ad9-40bd-908e-27b82b5c82cb\") " Oct 09 15:37:30 crc kubenswrapper[4719]: I1009 15:37:30.644703 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lclj4\" (UniqueName: \"kubernetes.io/projected/de61779c-4ad9-40bd-908e-27b82b5c82cb-kube-api-access-lclj4\") pod \"de61779c-4ad9-40bd-908e-27b82b5c82cb\" (UID: \"de61779c-4ad9-40bd-908e-27b82b5c82cb\") " Oct 09 15:37:30 crc kubenswrapper[4719]: I1009 15:37:30.650326 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de61779c-4ad9-40bd-908e-27b82b5c82cb-kube-api-access-lclj4" (OuterVolumeSpecName: "kube-api-access-lclj4") pod "de61779c-4ad9-40bd-908e-27b82b5c82cb" (UID: "de61779c-4ad9-40bd-908e-27b82b5c82cb"). InnerVolumeSpecName "kube-api-access-lclj4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:37:30 crc kubenswrapper[4719]: I1009 15:37:30.664741 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de61779c-4ad9-40bd-908e-27b82b5c82cb-scripts" (OuterVolumeSpecName: "scripts") pod "de61779c-4ad9-40bd-908e-27b82b5c82cb" (UID: "de61779c-4ad9-40bd-908e-27b82b5c82cb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:37:30 crc kubenswrapper[4719]: E1009 15:37:30.679458 4719 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de61779c-4ad9-40bd-908e-27b82b5c82cb-combined-ca-bundle podName:de61779c-4ad9-40bd-908e-27b82b5c82cb nodeName:}" failed. No retries permitted until 2025-10-09 15:37:31.179426954 +0000 UTC m=+1156.689138239 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/de61779c-4ad9-40bd-908e-27b82b5c82cb-combined-ca-bundle") pod "de61779c-4ad9-40bd-908e-27b82b5c82cb" (UID: "de61779c-4ad9-40bd-908e-27b82b5c82cb") : error deleting /var/lib/kubelet/pods/de61779c-4ad9-40bd-908e-27b82b5c82cb/volume-subpaths: remove /var/lib/kubelet/pods/de61779c-4ad9-40bd-908e-27b82b5c82cb/volume-subpaths: no such file or directory Oct 09 15:37:30 crc kubenswrapper[4719]: I1009 15:37:30.682585 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de61779c-4ad9-40bd-908e-27b82b5c82cb-config-data" (OuterVolumeSpecName: "config-data") pod "de61779c-4ad9-40bd-908e-27b82b5c82cb" (UID: "de61779c-4ad9-40bd-908e-27b82b5c82cb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:37:30 crc kubenswrapper[4719]: I1009 15:37:30.746894 4719 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de61779c-4ad9-40bd-908e-27b82b5c82cb-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:30 crc kubenswrapper[4719]: I1009 15:37:30.746929 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de61779c-4ad9-40bd-908e-27b82b5c82cb-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:30 crc kubenswrapper[4719]: I1009 15:37:30.746944 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lclj4\" (UniqueName: \"kubernetes.io/projected/de61779c-4ad9-40bd-908e-27b82b5c82cb-kube-api-access-lclj4\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:31 crc kubenswrapper[4719]: I1009 15:37:31.041503 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-r57qh" event={"ID":"de61779c-4ad9-40bd-908e-27b82b5c82cb","Type":"ContainerDied","Data":"ad9ad03ab0eb9b1d92611361bd2756fe71fdcdcecd78fb6ecf3453568b420a69"} Oct 09 15:37:31 crc kubenswrapper[4719]: I1009 15:37:31.041552 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad9ad03ab0eb9b1d92611361bd2756fe71fdcdcecd78fb6ecf3453568b420a69" Oct 09 15:37:31 crc kubenswrapper[4719]: I1009 15:37:31.041660 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-r57qh" Oct 09 15:37:31 crc kubenswrapper[4719]: I1009 15:37:31.141073 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 09 15:37:31 crc kubenswrapper[4719]: E1009 15:37:31.141581 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de61779c-4ad9-40bd-908e-27b82b5c82cb" containerName="nova-cell0-conductor-db-sync" Oct 09 15:37:31 crc kubenswrapper[4719]: I1009 15:37:31.141605 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="de61779c-4ad9-40bd-908e-27b82b5c82cb" containerName="nova-cell0-conductor-db-sync" Oct 09 15:37:31 crc kubenswrapper[4719]: I1009 15:37:31.141845 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="de61779c-4ad9-40bd-908e-27b82b5c82cb" containerName="nova-cell0-conductor-db-sync" Oct 09 15:37:31 crc kubenswrapper[4719]: I1009 15:37:31.142718 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 09 15:37:31 crc kubenswrapper[4719]: I1009 15:37:31.152535 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 09 15:37:31 crc kubenswrapper[4719]: I1009 15:37:31.256096 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de61779c-4ad9-40bd-908e-27b82b5c82cb-combined-ca-bundle\") pod \"de61779c-4ad9-40bd-908e-27b82b5c82cb\" (UID: \"de61779c-4ad9-40bd-908e-27b82b5c82cb\") " Oct 09 15:37:31 crc kubenswrapper[4719]: I1009 15:37:31.256936 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9773c656-c5bc-46a2-a323-d2d310e6a104-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9773c656-c5bc-46a2-a323-d2d310e6a104\") " pod="openstack/nova-cell0-conductor-0" Oct 09 15:37:31 crc 
kubenswrapper[4719]: I1009 15:37:31.257147 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9773c656-c5bc-46a2-a323-d2d310e6a104-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9773c656-c5bc-46a2-a323-d2d310e6a104\") " pod="openstack/nova-cell0-conductor-0" Oct 09 15:37:31 crc kubenswrapper[4719]: I1009 15:37:31.257211 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq6ql\" (UniqueName: \"kubernetes.io/projected/9773c656-c5bc-46a2-a323-d2d310e6a104-kube-api-access-sq6ql\") pod \"nova-cell0-conductor-0\" (UID: \"9773c656-c5bc-46a2-a323-d2d310e6a104\") " pod="openstack/nova-cell0-conductor-0" Oct 09 15:37:31 crc kubenswrapper[4719]: I1009 15:37:31.259868 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de61779c-4ad9-40bd-908e-27b82b5c82cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de61779c-4ad9-40bd-908e-27b82b5c82cb" (UID: "de61779c-4ad9-40bd-908e-27b82b5c82cb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:37:31 crc kubenswrapper[4719]: I1009 15:37:31.358291 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9773c656-c5bc-46a2-a323-d2d310e6a104-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9773c656-c5bc-46a2-a323-d2d310e6a104\") " pod="openstack/nova-cell0-conductor-0" Oct 09 15:37:31 crc kubenswrapper[4719]: I1009 15:37:31.358511 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq6ql\" (UniqueName: \"kubernetes.io/projected/9773c656-c5bc-46a2-a323-d2d310e6a104-kube-api-access-sq6ql\") pod \"nova-cell0-conductor-0\" (UID: \"9773c656-c5bc-46a2-a323-d2d310e6a104\") " pod="openstack/nova-cell0-conductor-0" Oct 09 15:37:31 crc kubenswrapper[4719]: I1009 15:37:31.358614 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9773c656-c5bc-46a2-a323-d2d310e6a104-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9773c656-c5bc-46a2-a323-d2d310e6a104\") " pod="openstack/nova-cell0-conductor-0" Oct 09 15:37:31 crc kubenswrapper[4719]: I1009 15:37:31.358756 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de61779c-4ad9-40bd-908e-27b82b5c82cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:31 crc kubenswrapper[4719]: I1009 15:37:31.362232 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9773c656-c5bc-46a2-a323-d2d310e6a104-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9773c656-c5bc-46a2-a323-d2d310e6a104\") " pod="openstack/nova-cell0-conductor-0" Oct 09 15:37:31 crc kubenswrapper[4719]: I1009 15:37:31.363108 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9773c656-c5bc-46a2-a323-d2d310e6a104-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9773c656-c5bc-46a2-a323-d2d310e6a104\") " pod="openstack/nova-cell0-conductor-0" Oct 09 15:37:31 crc kubenswrapper[4719]: I1009 15:37:31.375555 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq6ql\" (UniqueName: \"kubernetes.io/projected/9773c656-c5bc-46a2-a323-d2d310e6a104-kube-api-access-sq6ql\") pod \"nova-cell0-conductor-0\" (UID: \"9773c656-c5bc-46a2-a323-d2d310e6a104\") " pod="openstack/nova-cell0-conductor-0" Oct 09 15:37:31 crc kubenswrapper[4719]: I1009 15:37:31.470486 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 09 15:37:31 crc kubenswrapper[4719]: I1009 15:37:31.976144 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 09 15:37:32 crc kubenswrapper[4719]: I1009 15:37:32.052809 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9773c656-c5bc-46a2-a323-d2d310e6a104","Type":"ContainerStarted","Data":"6ec752170645e7205a091d3fe8fe6ddf6897e71f8e9049bd952300772a92c6f8"} Oct 09 15:37:33 crc kubenswrapper[4719]: I1009 15:37:33.081613 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9773c656-c5bc-46a2-a323-d2d310e6a104","Type":"ContainerStarted","Data":"eebce7882d9088534e11c0336746ad7a6869f4d1de7da554fcdd245ba8fa770f"} Oct 09 15:37:33 crc kubenswrapper[4719]: I1009 15:37:33.082489 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 09 15:37:33 crc kubenswrapper[4719]: I1009 15:37:33.102464 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.102443944 podStartE2EDuration="2.102443944s" podCreationTimestamp="2025-10-09 15:37:31 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:37:33.095470071 +0000 UTC m=+1158.605181376" watchObservedRunningTime="2025-10-09 15:37:33.102443944 +0000 UTC m=+1158.612155229" Oct 09 15:37:33 crc kubenswrapper[4719]: I1009 15:37:33.322573 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 09 15:37:33 crc kubenswrapper[4719]: I1009 15:37:33.333433 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 09 15:37:33 crc kubenswrapper[4719]: I1009 15:37:33.333695 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="75999b62-ce1b-4a9b-8507-c8af12441083" containerName="watcher-decision-engine" containerID="cri-o://4c60b2a50df19d61f978d5628ec6dedb3f7671d62836d8027b8f74250a9e3b07" gracePeriod=30 Oct 09 15:37:34 crc kubenswrapper[4719]: I1009 15:37:34.095188 4719 generic.go:334] "Generic (PLEG): container finished" podID="75999b62-ce1b-4a9b-8507-c8af12441083" containerID="4c60b2a50df19d61f978d5628ec6dedb3f7671d62836d8027b8f74250a9e3b07" exitCode=0 Oct 09 15:37:34 crc kubenswrapper[4719]: I1009 15:37:34.095235 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"75999b62-ce1b-4a9b-8507-c8af12441083","Type":"ContainerDied","Data":"4c60b2a50df19d61f978d5628ec6dedb3f7671d62836d8027b8f74250a9e3b07"} Oct 09 15:37:34 crc kubenswrapper[4719]: I1009 15:37:34.095309 4719 scope.go:117] "RemoveContainer" containerID="1b0d8ce3fa12f7379def00ca2131e79585f6bdff6f8c5cc63816c53109df6822" Oct 09 15:37:34 crc kubenswrapper[4719]: I1009 15:37:34.429112 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 09 15:37:34 crc kubenswrapper[4719]: I1009 15:37:34.521562 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9j4r\" (UniqueName: \"kubernetes.io/projected/75999b62-ce1b-4a9b-8507-c8af12441083-kube-api-access-n9j4r\") pod \"75999b62-ce1b-4a9b-8507-c8af12441083\" (UID: \"75999b62-ce1b-4a9b-8507-c8af12441083\") " Oct 09 15:37:34 crc kubenswrapper[4719]: I1009 15:37:34.521723 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75999b62-ce1b-4a9b-8507-c8af12441083-logs\") pod \"75999b62-ce1b-4a9b-8507-c8af12441083\" (UID: \"75999b62-ce1b-4a9b-8507-c8af12441083\") " Oct 09 15:37:34 crc kubenswrapper[4719]: I1009 15:37:34.521780 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/75999b62-ce1b-4a9b-8507-c8af12441083-custom-prometheus-ca\") pod \"75999b62-ce1b-4a9b-8507-c8af12441083\" (UID: \"75999b62-ce1b-4a9b-8507-c8af12441083\") " Oct 09 15:37:34 crc kubenswrapper[4719]: I1009 15:37:34.521870 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75999b62-ce1b-4a9b-8507-c8af12441083-config-data\") pod \"75999b62-ce1b-4a9b-8507-c8af12441083\" (UID: \"75999b62-ce1b-4a9b-8507-c8af12441083\") " Oct 09 15:37:34 crc kubenswrapper[4719]: I1009 15:37:34.522059 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75999b62-ce1b-4a9b-8507-c8af12441083-logs" (OuterVolumeSpecName: "logs") pod "75999b62-ce1b-4a9b-8507-c8af12441083" (UID: "75999b62-ce1b-4a9b-8507-c8af12441083"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:37:34 crc kubenswrapper[4719]: I1009 15:37:34.522114 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75999b62-ce1b-4a9b-8507-c8af12441083-combined-ca-bundle\") pod \"75999b62-ce1b-4a9b-8507-c8af12441083\" (UID: \"75999b62-ce1b-4a9b-8507-c8af12441083\") " Oct 09 15:37:34 crc kubenswrapper[4719]: I1009 15:37:34.522767 4719 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75999b62-ce1b-4a9b-8507-c8af12441083-logs\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:34 crc kubenswrapper[4719]: I1009 15:37:34.528562 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75999b62-ce1b-4a9b-8507-c8af12441083-kube-api-access-n9j4r" (OuterVolumeSpecName: "kube-api-access-n9j4r") pod "75999b62-ce1b-4a9b-8507-c8af12441083" (UID: "75999b62-ce1b-4a9b-8507-c8af12441083"). InnerVolumeSpecName "kube-api-access-n9j4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:37:34 crc kubenswrapper[4719]: I1009 15:37:34.552827 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75999b62-ce1b-4a9b-8507-c8af12441083-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "75999b62-ce1b-4a9b-8507-c8af12441083" (UID: "75999b62-ce1b-4a9b-8507-c8af12441083"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:37:34 crc kubenswrapper[4719]: I1009 15:37:34.558298 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75999b62-ce1b-4a9b-8507-c8af12441083-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "75999b62-ce1b-4a9b-8507-c8af12441083" (UID: "75999b62-ce1b-4a9b-8507-c8af12441083"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:37:34 crc kubenswrapper[4719]: I1009 15:37:34.586090 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75999b62-ce1b-4a9b-8507-c8af12441083-config-data" (OuterVolumeSpecName: "config-data") pod "75999b62-ce1b-4a9b-8507-c8af12441083" (UID: "75999b62-ce1b-4a9b-8507-c8af12441083"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:37:34 crc kubenswrapper[4719]: I1009 15:37:34.625048 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9j4r\" (UniqueName: \"kubernetes.io/projected/75999b62-ce1b-4a9b-8507-c8af12441083-kube-api-access-n9j4r\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:34 crc kubenswrapper[4719]: I1009 15:37:34.625092 4719 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/75999b62-ce1b-4a9b-8507-c8af12441083-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:34 crc kubenswrapper[4719]: I1009 15:37:34.625104 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75999b62-ce1b-4a9b-8507-c8af12441083-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:34 crc kubenswrapper[4719]: I1009 15:37:34.625115 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75999b62-ce1b-4a9b-8507-c8af12441083-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:34 crc kubenswrapper[4719]: I1009 15:37:34.965776 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 15:37:34 crc kubenswrapper[4719]: I1009 15:37:34.966228 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="61b5a962-57a0-4466-8be1-a849530b1c91" containerName="ceilometer-central-agent" 
containerID="cri-o://a2bc5465f33cd28f2c2c69b633e3b88f8f3174bf8700a72320333592a9bb3817" gracePeriod=30 Oct 09 15:37:34 crc kubenswrapper[4719]: I1009 15:37:34.966334 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="61b5a962-57a0-4466-8be1-a849530b1c91" containerName="proxy-httpd" containerID="cri-o://f39ff661dd68e44c4e4f94bce7d60cb6704b89dbd836578aeb69849a43ca20eb" gracePeriod=30 Oct 09 15:37:34 crc kubenswrapper[4719]: I1009 15:37:34.966384 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="61b5a962-57a0-4466-8be1-a849530b1c91" containerName="sg-core" containerID="cri-o://80329c8d7a2a1c949a6503e63844db2ff9e23e803b8872107b69dc67114d66ab" gracePeriod=30 Oct 09 15:37:34 crc kubenswrapper[4719]: I1009 15:37:34.966414 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="61b5a962-57a0-4466-8be1-a849530b1c91" containerName="ceilometer-notification-agent" containerID="cri-o://4dcfef3b54b8269d6b3f4510696eff2acdfa0d8cb9ab27306d81732318677ed1" gracePeriod=30 Oct 09 15:37:35 crc kubenswrapper[4719]: I1009 15:37:35.066375 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="61b5a962-57a0-4466-8be1-a849530b1c91" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.202:3000/\": read tcp 10.217.0.2:40348->10.217.0.202:3000: read: connection reset by peer" Oct 09 15:37:35 crc kubenswrapper[4719]: I1009 15:37:35.108234 4719 generic.go:334] "Generic (PLEG): container finished" podID="61b5a962-57a0-4466-8be1-a849530b1c91" containerID="f39ff661dd68e44c4e4f94bce7d60cb6704b89dbd836578aeb69849a43ca20eb" exitCode=0 Oct 09 15:37:35 crc kubenswrapper[4719]: I1009 15:37:35.109321 4719 generic.go:334] "Generic (PLEG): container finished" podID="61b5a962-57a0-4466-8be1-a849530b1c91" 
containerID="80329c8d7a2a1c949a6503e63844db2ff9e23e803b8872107b69dc67114d66ab" exitCode=2 Oct 09 15:37:35 crc kubenswrapper[4719]: I1009 15:37:35.108389 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61b5a962-57a0-4466-8be1-a849530b1c91","Type":"ContainerDied","Data":"f39ff661dd68e44c4e4f94bce7d60cb6704b89dbd836578aeb69849a43ca20eb"} Oct 09 15:37:35 crc kubenswrapper[4719]: I1009 15:37:35.109521 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61b5a962-57a0-4466-8be1-a849530b1c91","Type":"ContainerDied","Data":"80329c8d7a2a1c949a6503e63844db2ff9e23e803b8872107b69dc67114d66ab"} Oct 09 15:37:35 crc kubenswrapper[4719]: I1009 15:37:35.112459 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="9773c656-c5bc-46a2-a323-d2d310e6a104" containerName="nova-cell0-conductor-conductor" containerID="cri-o://eebce7882d9088534e11c0336746ad7a6869f4d1de7da554fcdd245ba8fa770f" gracePeriod=30 Oct 09 15:37:35 crc kubenswrapper[4719]: I1009 15:37:35.112933 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 09 15:37:35 crc kubenswrapper[4719]: I1009 15:37:35.113129 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"75999b62-ce1b-4a9b-8507-c8af12441083","Type":"ContainerDied","Data":"ed4278e0cbed707901ccadffa63f799e939d112caf148579263c4d77f79e2389"} Oct 09 15:37:35 crc kubenswrapper[4719]: I1009 15:37:35.114523 4719 scope.go:117] "RemoveContainer" containerID="4c60b2a50df19d61f978d5628ec6dedb3f7671d62836d8027b8f74250a9e3b07" Oct 09 15:37:35 crc kubenswrapper[4719]: I1009 15:37:35.201107 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 09 15:37:35 crc kubenswrapper[4719]: I1009 15:37:35.216416 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 09 15:37:35 crc kubenswrapper[4719]: I1009 15:37:35.225004 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 09 15:37:35 crc kubenswrapper[4719]: E1009 15:37:35.225461 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75999b62-ce1b-4a9b-8507-c8af12441083" containerName="watcher-decision-engine" Oct 09 15:37:35 crc kubenswrapper[4719]: I1009 15:37:35.225478 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="75999b62-ce1b-4a9b-8507-c8af12441083" containerName="watcher-decision-engine" Oct 09 15:37:35 crc kubenswrapper[4719]: E1009 15:37:35.225497 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75999b62-ce1b-4a9b-8507-c8af12441083" containerName="watcher-decision-engine" Oct 09 15:37:35 crc kubenswrapper[4719]: I1009 15:37:35.225503 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="75999b62-ce1b-4a9b-8507-c8af12441083" containerName="watcher-decision-engine" Oct 09 15:37:35 crc kubenswrapper[4719]: E1009 15:37:35.225519 4719 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="75999b62-ce1b-4a9b-8507-c8af12441083" containerName="watcher-decision-engine" Oct 09 15:37:35 crc kubenswrapper[4719]: I1009 15:37:35.225525 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="75999b62-ce1b-4a9b-8507-c8af12441083" containerName="watcher-decision-engine" Oct 09 15:37:35 crc kubenswrapper[4719]: I1009 15:37:35.225715 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="75999b62-ce1b-4a9b-8507-c8af12441083" containerName="watcher-decision-engine" Oct 09 15:37:35 crc kubenswrapper[4719]: I1009 15:37:35.225727 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="75999b62-ce1b-4a9b-8507-c8af12441083" containerName="watcher-decision-engine" Oct 09 15:37:35 crc kubenswrapper[4719]: I1009 15:37:35.225736 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="75999b62-ce1b-4a9b-8507-c8af12441083" containerName="watcher-decision-engine" Oct 09 15:37:35 crc kubenswrapper[4719]: I1009 15:37:35.226393 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 09 15:37:35 crc kubenswrapper[4719]: I1009 15:37:35.231811 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Oct 09 15:37:35 crc kubenswrapper[4719]: I1009 15:37:35.239802 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 09 15:37:35 crc kubenswrapper[4719]: I1009 15:37:35.338156 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5d4f8ef-6b73-4d97-9899-49865bf6d744-logs\") pod \"watcher-decision-engine-0\" (UID: \"c5d4f8ef-6b73-4d97-9899-49865bf6d744\") " pod="openstack/watcher-decision-engine-0" Oct 09 15:37:35 crc kubenswrapper[4719]: I1009 15:37:35.338308 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfzfm\" (UniqueName: \"kubernetes.io/projected/c5d4f8ef-6b73-4d97-9899-49865bf6d744-kube-api-access-kfzfm\") pod \"watcher-decision-engine-0\" (UID: \"c5d4f8ef-6b73-4d97-9899-49865bf6d744\") " pod="openstack/watcher-decision-engine-0" Oct 09 15:37:35 crc kubenswrapper[4719]: I1009 15:37:35.338334 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5d4f8ef-6b73-4d97-9899-49865bf6d744-config-data\") pod \"watcher-decision-engine-0\" (UID: \"c5d4f8ef-6b73-4d97-9899-49865bf6d744\") " pod="openstack/watcher-decision-engine-0" Oct 09 15:37:35 crc kubenswrapper[4719]: I1009 15:37:35.338546 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c5d4f8ef-6b73-4d97-9899-49865bf6d744-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"c5d4f8ef-6b73-4d97-9899-49865bf6d744\") " 
pod="openstack/watcher-decision-engine-0" Oct 09 15:37:35 crc kubenswrapper[4719]: I1009 15:37:35.338614 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5d4f8ef-6b73-4d97-9899-49865bf6d744-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"c5d4f8ef-6b73-4d97-9899-49865bf6d744\") " pod="openstack/watcher-decision-engine-0" Oct 09 15:37:35 crc kubenswrapper[4719]: I1009 15:37:35.440581 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfzfm\" (UniqueName: \"kubernetes.io/projected/c5d4f8ef-6b73-4d97-9899-49865bf6d744-kube-api-access-kfzfm\") pod \"watcher-decision-engine-0\" (UID: \"c5d4f8ef-6b73-4d97-9899-49865bf6d744\") " pod="openstack/watcher-decision-engine-0" Oct 09 15:37:35 crc kubenswrapper[4719]: I1009 15:37:35.440637 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5d4f8ef-6b73-4d97-9899-49865bf6d744-config-data\") pod \"watcher-decision-engine-0\" (UID: \"c5d4f8ef-6b73-4d97-9899-49865bf6d744\") " pod="openstack/watcher-decision-engine-0" Oct 09 15:37:35 crc kubenswrapper[4719]: I1009 15:37:35.440745 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c5d4f8ef-6b73-4d97-9899-49865bf6d744-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"c5d4f8ef-6b73-4d97-9899-49865bf6d744\") " pod="openstack/watcher-decision-engine-0" Oct 09 15:37:35 crc kubenswrapper[4719]: I1009 15:37:35.440774 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5d4f8ef-6b73-4d97-9899-49865bf6d744-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"c5d4f8ef-6b73-4d97-9899-49865bf6d744\") " pod="openstack/watcher-decision-engine-0" 
Oct 09 15:37:35 crc kubenswrapper[4719]: I1009 15:37:35.440797 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5d4f8ef-6b73-4d97-9899-49865bf6d744-logs\") pod \"watcher-decision-engine-0\" (UID: \"c5d4f8ef-6b73-4d97-9899-49865bf6d744\") " pod="openstack/watcher-decision-engine-0" Oct 09 15:37:35 crc kubenswrapper[4719]: I1009 15:37:35.441312 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5d4f8ef-6b73-4d97-9899-49865bf6d744-logs\") pod \"watcher-decision-engine-0\" (UID: \"c5d4f8ef-6b73-4d97-9899-49865bf6d744\") " pod="openstack/watcher-decision-engine-0" Oct 09 15:37:35 crc kubenswrapper[4719]: I1009 15:37:35.446320 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5d4f8ef-6b73-4d97-9899-49865bf6d744-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"c5d4f8ef-6b73-4d97-9899-49865bf6d744\") " pod="openstack/watcher-decision-engine-0" Oct 09 15:37:35 crc kubenswrapper[4719]: I1009 15:37:35.448076 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5d4f8ef-6b73-4d97-9899-49865bf6d744-config-data\") pod \"watcher-decision-engine-0\" (UID: \"c5d4f8ef-6b73-4d97-9899-49865bf6d744\") " pod="openstack/watcher-decision-engine-0" Oct 09 15:37:35 crc kubenswrapper[4719]: I1009 15:37:35.457166 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c5d4f8ef-6b73-4d97-9899-49865bf6d744-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"c5d4f8ef-6b73-4d97-9899-49865bf6d744\") " pod="openstack/watcher-decision-engine-0" Oct 09 15:37:35 crc kubenswrapper[4719]: I1009 15:37:35.472041 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-kfzfm\" (UniqueName: \"kubernetes.io/projected/c5d4f8ef-6b73-4d97-9899-49865bf6d744-kube-api-access-kfzfm\") pod \"watcher-decision-engine-0\" (UID: \"c5d4f8ef-6b73-4d97-9899-49865bf6d744\") " pod="openstack/watcher-decision-engine-0" Oct 09 15:37:35 crc kubenswrapper[4719]: I1009 15:37:35.549787 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 09 15:37:36 crc kubenswrapper[4719]: I1009 15:37:36.102983 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 09 15:37:36 crc kubenswrapper[4719]: W1009 15:37:36.105802 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5d4f8ef_6b73_4d97_9899_49865bf6d744.slice/crio-976a8776818cd3882fba67371706909c555733f169c1acc9055f60c599f7ab38 WatchSource:0}: Error finding container 976a8776818cd3882fba67371706909c555733f169c1acc9055f60c599f7ab38: Status 404 returned error can't find the container with id 976a8776818cd3882fba67371706909c555733f169c1acc9055f60c599f7ab38 Oct 09 15:37:36 crc kubenswrapper[4719]: I1009 15:37:36.125648 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"c5d4f8ef-6b73-4d97-9899-49865bf6d744","Type":"ContainerStarted","Data":"976a8776818cd3882fba67371706909c555733f169c1acc9055f60c599f7ab38"} Oct 09 15:37:36 crc kubenswrapper[4719]: I1009 15:37:36.130066 4719 generic.go:334] "Generic (PLEG): container finished" podID="61b5a962-57a0-4466-8be1-a849530b1c91" containerID="a2bc5465f33cd28f2c2c69b633e3b88f8f3174bf8700a72320333592a9bb3817" exitCode=0 Oct 09 15:37:36 crc kubenswrapper[4719]: I1009 15:37:36.130125 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61b5a962-57a0-4466-8be1-a849530b1c91","Type":"ContainerDied","Data":"a2bc5465f33cd28f2c2c69b633e3b88f8f3174bf8700a72320333592a9bb3817"} 
Oct 09 15:37:37 crc kubenswrapper[4719]: I1009 15:37:37.141773 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"c5d4f8ef-6b73-4d97-9899-49865bf6d744","Type":"ContainerStarted","Data":"8d6f2e8e76cbe0f3bc727976ccce0ff4a94cfbfe21aff8dafe848dc86dfafaf7"} Oct 09 15:37:37 crc kubenswrapper[4719]: I1009 15:37:37.166639 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.166622218 podStartE2EDuration="2.166622218s" podCreationTimestamp="2025-10-09 15:37:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:37:37.161108361 +0000 UTC m=+1162.670819636" watchObservedRunningTime="2025-10-09 15:37:37.166622218 +0000 UTC m=+1162.676333503" Oct 09 15:37:37 crc kubenswrapper[4719]: I1009 15:37:37.174382 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75999b62-ce1b-4a9b-8507-c8af12441083" path="/var/lib/kubelet/pods/75999b62-ce1b-4a9b-8507-c8af12441083/volumes" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.054503 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.093655 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61b5a962-57a0-4466-8be1-a849530b1c91-scripts\") pod \"61b5a962-57a0-4466-8be1-a849530b1c91\" (UID: \"61b5a962-57a0-4466-8be1-a849530b1c91\") " Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.093696 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcfcp\" (UniqueName: \"kubernetes.io/projected/61b5a962-57a0-4466-8be1-a849530b1c91-kube-api-access-wcfcp\") pod \"61b5a962-57a0-4466-8be1-a849530b1c91\" (UID: \"61b5a962-57a0-4466-8be1-a849530b1c91\") " Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.093727 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61b5a962-57a0-4466-8be1-a849530b1c91-config-data\") pod \"61b5a962-57a0-4466-8be1-a849530b1c91\" (UID: \"61b5a962-57a0-4466-8be1-a849530b1c91\") " Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.093778 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61b5a962-57a0-4466-8be1-a849530b1c91-run-httpd\") pod \"61b5a962-57a0-4466-8be1-a849530b1c91\" (UID: \"61b5a962-57a0-4466-8be1-a849530b1c91\") " Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.093799 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61b5a962-57a0-4466-8be1-a849530b1c91-sg-core-conf-yaml\") pod \"61b5a962-57a0-4466-8be1-a849530b1c91\" (UID: \"61b5a962-57a0-4466-8be1-a849530b1c91\") " Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.093896 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/61b5a962-57a0-4466-8be1-a849530b1c91-log-httpd\") pod \"61b5a962-57a0-4466-8be1-a849530b1c91\" (UID: \"61b5a962-57a0-4466-8be1-a849530b1c91\") " Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.093923 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61b5a962-57a0-4466-8be1-a849530b1c91-combined-ca-bundle\") pod \"61b5a962-57a0-4466-8be1-a849530b1c91\" (UID: \"61b5a962-57a0-4466-8be1-a849530b1c91\") " Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.094556 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61b5a962-57a0-4466-8be1-a849530b1c91-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "61b5a962-57a0-4466-8be1-a849530b1c91" (UID: "61b5a962-57a0-4466-8be1-a849530b1c91"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.094870 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61b5a962-57a0-4466-8be1-a849530b1c91-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "61b5a962-57a0-4466-8be1-a849530b1c91" (UID: "61b5a962-57a0-4466-8be1-a849530b1c91"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.099130 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61b5a962-57a0-4466-8be1-a849530b1c91-scripts" (OuterVolumeSpecName: "scripts") pod "61b5a962-57a0-4466-8be1-a849530b1c91" (UID: "61b5a962-57a0-4466-8be1-a849530b1c91"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.099336 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61b5a962-57a0-4466-8be1-a849530b1c91-kube-api-access-wcfcp" (OuterVolumeSpecName: "kube-api-access-wcfcp") pod "61b5a962-57a0-4466-8be1-a849530b1c91" (UID: "61b5a962-57a0-4466-8be1-a849530b1c91"). InnerVolumeSpecName "kube-api-access-wcfcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.122050 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61b5a962-57a0-4466-8be1-a849530b1c91-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "61b5a962-57a0-4466-8be1-a849530b1c91" (UID: "61b5a962-57a0-4466-8be1-a849530b1c91"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.153513 4719 generic.go:334] "Generic (PLEG): container finished" podID="61b5a962-57a0-4466-8be1-a849530b1c91" containerID="4dcfef3b54b8269d6b3f4510696eff2acdfa0d8cb9ab27306d81732318677ed1" exitCode=0 Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.153572 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.153608 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61b5a962-57a0-4466-8be1-a849530b1c91","Type":"ContainerDied","Data":"4dcfef3b54b8269d6b3f4510696eff2acdfa0d8cb9ab27306d81732318677ed1"} Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.153667 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61b5a962-57a0-4466-8be1-a849530b1c91","Type":"ContainerDied","Data":"f35099ec5ce95e25cda08d6be9fe77bbc40549cc4696bc4b175ba17076026ec4"} Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.153690 4719 scope.go:117] "RemoveContainer" containerID="f39ff661dd68e44c4e4f94bce7d60cb6704b89dbd836578aeb69849a43ca20eb" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.171275 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61b5a962-57a0-4466-8be1-a849530b1c91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61b5a962-57a0-4466-8be1-a849530b1c91" (UID: "61b5a962-57a0-4466-8be1-a849530b1c91"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.196126 4719 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61b5a962-57a0-4466-8be1-a849530b1c91-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.196162 4719 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61b5a962-57a0-4466-8be1-a849530b1c91-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.196173 4719 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61b5a962-57a0-4466-8be1-a849530b1c91-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.196186 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61b5a962-57a0-4466-8be1-a849530b1c91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.196198 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcfcp\" (UniqueName: \"kubernetes.io/projected/61b5a962-57a0-4466-8be1-a849530b1c91-kube-api-access-wcfcp\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.196210 4719 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61b5a962-57a0-4466-8be1-a849530b1c91-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.204456 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61b5a962-57a0-4466-8be1-a849530b1c91-config-data" (OuterVolumeSpecName: "config-data") pod "61b5a962-57a0-4466-8be1-a849530b1c91" (UID: "61b5a962-57a0-4466-8be1-a849530b1c91"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.216279 4719 scope.go:117] "RemoveContainer" containerID="80329c8d7a2a1c949a6503e63844db2ff9e23e803b8872107b69dc67114d66ab" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.234736 4719 scope.go:117] "RemoveContainer" containerID="4dcfef3b54b8269d6b3f4510696eff2acdfa0d8cb9ab27306d81732318677ed1" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.251228 4719 scope.go:117] "RemoveContainer" containerID="a2bc5465f33cd28f2c2c69b633e3b88f8f3174bf8700a72320333592a9bb3817" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.269162 4719 scope.go:117] "RemoveContainer" containerID="f39ff661dd68e44c4e4f94bce7d60cb6704b89dbd836578aeb69849a43ca20eb" Oct 09 15:37:38 crc kubenswrapper[4719]: E1009 15:37:38.269544 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f39ff661dd68e44c4e4f94bce7d60cb6704b89dbd836578aeb69849a43ca20eb\": container with ID starting with f39ff661dd68e44c4e4f94bce7d60cb6704b89dbd836578aeb69849a43ca20eb not found: ID does not exist" containerID="f39ff661dd68e44c4e4f94bce7d60cb6704b89dbd836578aeb69849a43ca20eb" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.269572 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f39ff661dd68e44c4e4f94bce7d60cb6704b89dbd836578aeb69849a43ca20eb"} err="failed to get container status \"f39ff661dd68e44c4e4f94bce7d60cb6704b89dbd836578aeb69849a43ca20eb\": rpc error: code = NotFound desc = could not find container \"f39ff661dd68e44c4e4f94bce7d60cb6704b89dbd836578aeb69849a43ca20eb\": container with ID starting with f39ff661dd68e44c4e4f94bce7d60cb6704b89dbd836578aeb69849a43ca20eb not found: ID does not exist" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.269593 4719 scope.go:117] "RemoveContainer" 
containerID="80329c8d7a2a1c949a6503e63844db2ff9e23e803b8872107b69dc67114d66ab" Oct 09 15:37:38 crc kubenswrapper[4719]: E1009 15:37:38.269909 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80329c8d7a2a1c949a6503e63844db2ff9e23e803b8872107b69dc67114d66ab\": container with ID starting with 80329c8d7a2a1c949a6503e63844db2ff9e23e803b8872107b69dc67114d66ab not found: ID does not exist" containerID="80329c8d7a2a1c949a6503e63844db2ff9e23e803b8872107b69dc67114d66ab" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.269949 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80329c8d7a2a1c949a6503e63844db2ff9e23e803b8872107b69dc67114d66ab"} err="failed to get container status \"80329c8d7a2a1c949a6503e63844db2ff9e23e803b8872107b69dc67114d66ab\": rpc error: code = NotFound desc = could not find container \"80329c8d7a2a1c949a6503e63844db2ff9e23e803b8872107b69dc67114d66ab\": container with ID starting with 80329c8d7a2a1c949a6503e63844db2ff9e23e803b8872107b69dc67114d66ab not found: ID does not exist" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.269977 4719 scope.go:117] "RemoveContainer" containerID="4dcfef3b54b8269d6b3f4510696eff2acdfa0d8cb9ab27306d81732318677ed1" Oct 09 15:37:38 crc kubenswrapper[4719]: E1009 15:37:38.270490 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dcfef3b54b8269d6b3f4510696eff2acdfa0d8cb9ab27306d81732318677ed1\": container with ID starting with 4dcfef3b54b8269d6b3f4510696eff2acdfa0d8cb9ab27306d81732318677ed1 not found: ID does not exist" containerID="4dcfef3b54b8269d6b3f4510696eff2acdfa0d8cb9ab27306d81732318677ed1" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.270526 4719 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4dcfef3b54b8269d6b3f4510696eff2acdfa0d8cb9ab27306d81732318677ed1"} err="failed to get container status \"4dcfef3b54b8269d6b3f4510696eff2acdfa0d8cb9ab27306d81732318677ed1\": rpc error: code = NotFound desc = could not find container \"4dcfef3b54b8269d6b3f4510696eff2acdfa0d8cb9ab27306d81732318677ed1\": container with ID starting with 4dcfef3b54b8269d6b3f4510696eff2acdfa0d8cb9ab27306d81732318677ed1 not found: ID does not exist" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.270544 4719 scope.go:117] "RemoveContainer" containerID="a2bc5465f33cd28f2c2c69b633e3b88f8f3174bf8700a72320333592a9bb3817" Oct 09 15:37:38 crc kubenswrapper[4719]: E1009 15:37:38.270851 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2bc5465f33cd28f2c2c69b633e3b88f8f3174bf8700a72320333592a9bb3817\": container with ID starting with a2bc5465f33cd28f2c2c69b633e3b88f8f3174bf8700a72320333592a9bb3817 not found: ID does not exist" containerID="a2bc5465f33cd28f2c2c69b633e3b88f8f3174bf8700a72320333592a9bb3817" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.270904 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2bc5465f33cd28f2c2c69b633e3b88f8f3174bf8700a72320333592a9bb3817"} err="failed to get container status \"a2bc5465f33cd28f2c2c69b633e3b88f8f3174bf8700a72320333592a9bb3817\": rpc error: code = NotFound desc = could not find container \"a2bc5465f33cd28f2c2c69b633e3b88f8f3174bf8700a72320333592a9bb3817\": container with ID starting with a2bc5465f33cd28f2c2c69b633e3b88f8f3174bf8700a72320333592a9bb3817 not found: ID does not exist" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.298706 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61b5a962-57a0-4466-8be1-a849530b1c91-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:37:38 crc kubenswrapper[4719]: 
I1009 15:37:38.484070 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.492336 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.511989 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 09 15:37:38 crc kubenswrapper[4719]: E1009 15:37:38.513017 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61b5a962-57a0-4466-8be1-a849530b1c91" containerName="proxy-httpd" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.513044 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="61b5a962-57a0-4466-8be1-a849530b1c91" containerName="proxy-httpd" Oct 09 15:37:38 crc kubenswrapper[4719]: E1009 15:37:38.513060 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75999b62-ce1b-4a9b-8507-c8af12441083" containerName="watcher-decision-engine" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.513069 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="75999b62-ce1b-4a9b-8507-c8af12441083" containerName="watcher-decision-engine" Oct 09 15:37:38 crc kubenswrapper[4719]: E1009 15:37:38.513095 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61b5a962-57a0-4466-8be1-a849530b1c91" containerName="ceilometer-central-agent" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.513103 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="61b5a962-57a0-4466-8be1-a849530b1c91" containerName="ceilometer-central-agent" Oct 09 15:37:38 crc kubenswrapper[4719]: E1009 15:37:38.513118 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61b5a962-57a0-4466-8be1-a849530b1c91" containerName="ceilometer-notification-agent" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.513125 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="61b5a962-57a0-4466-8be1-a849530b1c91" 
containerName="ceilometer-notification-agent" Oct 09 15:37:38 crc kubenswrapper[4719]: E1009 15:37:38.513138 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61b5a962-57a0-4466-8be1-a849530b1c91" containerName="sg-core" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.513143 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="61b5a962-57a0-4466-8be1-a849530b1c91" containerName="sg-core" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.513325 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="75999b62-ce1b-4a9b-8507-c8af12441083" containerName="watcher-decision-engine" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.513340 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="61b5a962-57a0-4466-8be1-a849530b1c91" containerName="ceilometer-notification-agent" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.513373 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="61b5a962-57a0-4466-8be1-a849530b1c91" containerName="sg-core" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.513385 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="61b5a962-57a0-4466-8be1-a849530b1c91" containerName="proxy-httpd" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.513394 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="61b5a962-57a0-4466-8be1-a849530b1c91" containerName="ceilometer-central-agent" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.515482 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.521550 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.521550 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.531849 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.602798 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cd5h\" (UniqueName: \"kubernetes.io/projected/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-kube-api-access-8cd5h\") pod \"ceilometer-0\" (UID: \"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac\") " pod="openstack/ceilometer-0" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.602910 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-log-httpd\") pod \"ceilometer-0\" (UID: \"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac\") " pod="openstack/ceilometer-0" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.602936 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac\") " pod="openstack/ceilometer-0" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.602953 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-config-data\") pod \"ceilometer-0\" (UID: 
\"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac\") " pod="openstack/ceilometer-0" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.602980 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-run-httpd\") pod \"ceilometer-0\" (UID: \"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac\") " pod="openstack/ceilometer-0" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.603024 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac\") " pod="openstack/ceilometer-0" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.603050 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-scripts\") pod \"ceilometer-0\" (UID: \"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac\") " pod="openstack/ceilometer-0" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.704524 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-run-httpd\") pod \"ceilometer-0\" (UID: \"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac\") " pod="openstack/ceilometer-0" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.704609 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac\") " pod="openstack/ceilometer-0" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.704652 4719 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-scripts\") pod \"ceilometer-0\" (UID: \"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac\") " pod="openstack/ceilometer-0" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.704682 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cd5h\" (UniqueName: \"kubernetes.io/projected/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-kube-api-access-8cd5h\") pod \"ceilometer-0\" (UID: \"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac\") " pod="openstack/ceilometer-0" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.704790 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-log-httpd\") pod \"ceilometer-0\" (UID: \"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac\") " pod="openstack/ceilometer-0" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.704817 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac\") " pod="openstack/ceilometer-0" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.704861 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-config-data\") pod \"ceilometer-0\" (UID: \"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac\") " pod="openstack/ceilometer-0" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.705190 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-log-httpd\") pod \"ceilometer-0\" (UID: \"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac\") " 
pod="openstack/ceilometer-0" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.705471 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-run-httpd\") pod \"ceilometer-0\" (UID: \"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac\") " pod="openstack/ceilometer-0" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.708751 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac\") " pod="openstack/ceilometer-0" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.708912 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac\") " pod="openstack/ceilometer-0" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.709138 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-config-data\") pod \"ceilometer-0\" (UID: \"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac\") " pod="openstack/ceilometer-0" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.709245 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-scripts\") pod \"ceilometer-0\" (UID: \"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac\") " pod="openstack/ceilometer-0" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.723972 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cd5h\" (UniqueName: 
\"kubernetes.io/projected/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-kube-api-access-8cd5h\") pod \"ceilometer-0\" (UID: \"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac\") " pod="openstack/ceilometer-0" Oct 09 15:37:38 crc kubenswrapper[4719]: I1009 15:37:38.834192 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 15:37:39 crc kubenswrapper[4719]: I1009 15:37:39.184587 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61b5a962-57a0-4466-8be1-a849530b1c91" path="/var/lib/kubelet/pods/61b5a962-57a0-4466-8be1-a849530b1c91/volumes" Oct 09 15:37:39 crc kubenswrapper[4719]: I1009 15:37:39.264909 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 15:37:40 crc kubenswrapper[4719]: I1009 15:37:40.178069 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac","Type":"ContainerStarted","Data":"fd0da4fa978485b84d3fe1df33b378fadca4abcda86474c9e8497fe666c6a6b2"} Oct 09 15:37:40 crc kubenswrapper[4719]: I1009 15:37:40.178449 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac","Type":"ContainerStarted","Data":"568061fd1344d0ed2923d54e50ae80d71d28d0e14a068820f5b86efe7fe5ccd5"} Oct 09 15:37:40 crc kubenswrapper[4719]: I1009 15:37:40.178464 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac","Type":"ContainerStarted","Data":"121136afb18934c1d1c11c4dd6ca17a8890db9ae4cdb667a53f0f899a5b2289c"} Oct 09 15:37:41 crc kubenswrapper[4719]: I1009 15:37:41.188933 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac","Type":"ContainerStarted","Data":"29c519552f94b7ac4278dd882695b28b9ce3f8f5723bc75f81a2242c39a20575"} Oct 09 15:37:41 crc kubenswrapper[4719]: E1009 
15:37:41.472790 4719 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="eebce7882d9088534e11c0336746ad7a6869f4d1de7da554fcdd245ba8fa770f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 09 15:37:41 crc kubenswrapper[4719]: E1009 15:37:41.474761 4719 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="eebce7882d9088534e11c0336746ad7a6869f4d1de7da554fcdd245ba8fa770f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 09 15:37:41 crc kubenswrapper[4719]: E1009 15:37:41.478671 4719 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="eebce7882d9088534e11c0336746ad7a6869f4d1de7da554fcdd245ba8fa770f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 09 15:37:41 crc kubenswrapper[4719]: E1009 15:37:41.478717 4719 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="9773c656-c5bc-46a2-a323-d2d310e6a104" containerName="nova-cell0-conductor-conductor" Oct 09 15:37:43 crc kubenswrapper[4719]: I1009 15:37:43.219451 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac","Type":"ContainerStarted","Data":"21eb77e81e5c5c5e92cee62aa6fea977469a904b378cbe5b5b62b7b1d513d52d"} Oct 09 15:37:43 crc kubenswrapper[4719]: I1009 15:37:43.222625 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 09 15:37:43 
crc kubenswrapper[4719]: I1009 15:37:43.255029 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.356372271 podStartE2EDuration="5.255009523s" podCreationTimestamp="2025-10-09 15:37:38 +0000 UTC" firstStartedPulling="2025-10-09 15:37:39.272968325 +0000 UTC m=+1164.782679610" lastFinishedPulling="2025-10-09 15:37:42.171605577 +0000 UTC m=+1167.681316862" observedRunningTime="2025-10-09 15:37:43.242470542 +0000 UTC m=+1168.752181847" watchObservedRunningTime="2025-10-09 15:37:43.255009523 +0000 UTC m=+1168.764720808" Oct 09 15:37:45 crc kubenswrapper[4719]: I1009 15:37:45.551556 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 09 15:37:45 crc kubenswrapper[4719]: I1009 15:37:45.583716 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Oct 09 15:37:46 crc kubenswrapper[4719]: I1009 15:37:46.243800 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Oct 09 15:37:46 crc kubenswrapper[4719]: I1009 15:37:46.267880 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Oct 09 15:37:46 crc kubenswrapper[4719]: E1009 15:37:46.473810 4719 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="eebce7882d9088534e11c0336746ad7a6869f4d1de7da554fcdd245ba8fa770f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 09 15:37:46 crc kubenswrapper[4719]: E1009 15:37:46.479809 4719 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="eebce7882d9088534e11c0336746ad7a6869f4d1de7da554fcdd245ba8fa770f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 09 15:37:46 crc kubenswrapper[4719]: E1009 15:37:46.481786 4719 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="eebce7882d9088534e11c0336746ad7a6869f4d1de7da554fcdd245ba8fa770f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 09 15:37:46 crc kubenswrapper[4719]: E1009 15:37:46.481844 4719 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="9773c656-c5bc-46a2-a323-d2d310e6a104" containerName="nova-cell0-conductor-conductor" Oct 09 15:37:51 crc kubenswrapper[4719]: E1009 15:37:51.473811 4719 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="eebce7882d9088534e11c0336746ad7a6869f4d1de7da554fcdd245ba8fa770f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 09 15:37:51 crc kubenswrapper[4719]: E1009 15:37:51.475912 4719 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="eebce7882d9088534e11c0336746ad7a6869f4d1de7da554fcdd245ba8fa770f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 09 15:37:51 crc kubenswrapper[4719]: E1009 15:37:51.477853 4719 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="eebce7882d9088534e11c0336746ad7a6869f4d1de7da554fcdd245ba8fa770f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 09 15:37:51 crc kubenswrapper[4719]: E1009 15:37:51.477946 4719 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="9773c656-c5bc-46a2-a323-d2d310e6a104" containerName="nova-cell0-conductor-conductor" Oct 09 15:37:56 crc kubenswrapper[4719]: E1009 15:37:56.473114 4719 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="eebce7882d9088534e11c0336746ad7a6869f4d1de7da554fcdd245ba8fa770f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 09 15:37:56 crc kubenswrapper[4719]: E1009 15:37:56.475181 4719 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="eebce7882d9088534e11c0336746ad7a6869f4d1de7da554fcdd245ba8fa770f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 09 15:37:56 crc kubenswrapper[4719]: E1009 15:37:56.476645 4719 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="eebce7882d9088534e11c0336746ad7a6869f4d1de7da554fcdd245ba8fa770f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 09 15:37:56 crc kubenswrapper[4719]: E1009 15:37:56.476702 4719 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" 
podUID="9773c656-c5bc-46a2-a323-d2d310e6a104" containerName="nova-cell0-conductor-conductor" Oct 09 15:38:01 crc kubenswrapper[4719]: E1009 15:38:01.472560 4719 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="eebce7882d9088534e11c0336746ad7a6869f4d1de7da554fcdd245ba8fa770f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 09 15:38:01 crc kubenswrapper[4719]: E1009 15:38:01.474777 4719 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="eebce7882d9088534e11c0336746ad7a6869f4d1de7da554fcdd245ba8fa770f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 09 15:38:01 crc kubenswrapper[4719]: E1009 15:38:01.475768 4719 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="eebce7882d9088534e11c0336746ad7a6869f4d1de7da554fcdd245ba8fa770f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 09 15:38:01 crc kubenswrapper[4719]: E1009 15:38:01.475843 4719 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="9773c656-c5bc-46a2-a323-d2d310e6a104" containerName="nova-cell0-conductor-conductor" Oct 09 15:38:05 crc kubenswrapper[4719]: I1009 15:38:05.454316 4719 generic.go:334] "Generic (PLEG): container finished" podID="9773c656-c5bc-46a2-a323-d2d310e6a104" containerID="eebce7882d9088534e11c0336746ad7a6869f4d1de7da554fcdd245ba8fa770f" exitCode=137 Oct 09 15:38:05 crc kubenswrapper[4719]: I1009 15:38:05.454390 4719 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9773c656-c5bc-46a2-a323-d2d310e6a104","Type":"ContainerDied","Data":"eebce7882d9088534e11c0336746ad7a6869f4d1de7da554fcdd245ba8fa770f"} Oct 09 15:38:05 crc kubenswrapper[4719]: I1009 15:38:05.454911 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9773c656-c5bc-46a2-a323-d2d310e6a104","Type":"ContainerDied","Data":"6ec752170645e7205a091d3fe8fe6ddf6897e71f8e9049bd952300772a92c6f8"} Oct 09 15:38:05 crc kubenswrapper[4719]: I1009 15:38:05.454927 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ec752170645e7205a091d3fe8fe6ddf6897e71f8e9049bd952300772a92c6f8" Oct 09 15:38:05 crc kubenswrapper[4719]: I1009 15:38:05.521033 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 09 15:38:05 crc kubenswrapper[4719]: I1009 15:38:05.615088 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sq6ql\" (UniqueName: \"kubernetes.io/projected/9773c656-c5bc-46a2-a323-d2d310e6a104-kube-api-access-sq6ql\") pod \"9773c656-c5bc-46a2-a323-d2d310e6a104\" (UID: \"9773c656-c5bc-46a2-a323-d2d310e6a104\") " Oct 09 15:38:05 crc kubenswrapper[4719]: I1009 15:38:05.615237 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9773c656-c5bc-46a2-a323-d2d310e6a104-config-data\") pod \"9773c656-c5bc-46a2-a323-d2d310e6a104\" (UID: \"9773c656-c5bc-46a2-a323-d2d310e6a104\") " Oct 09 15:38:05 crc kubenswrapper[4719]: I1009 15:38:05.615301 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9773c656-c5bc-46a2-a323-d2d310e6a104-combined-ca-bundle\") pod \"9773c656-c5bc-46a2-a323-d2d310e6a104\" (UID: 
\"9773c656-c5bc-46a2-a323-d2d310e6a104\") " Oct 09 15:38:05 crc kubenswrapper[4719]: I1009 15:38:05.621982 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9773c656-c5bc-46a2-a323-d2d310e6a104-kube-api-access-sq6ql" (OuterVolumeSpecName: "kube-api-access-sq6ql") pod "9773c656-c5bc-46a2-a323-d2d310e6a104" (UID: "9773c656-c5bc-46a2-a323-d2d310e6a104"). InnerVolumeSpecName "kube-api-access-sq6ql". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:38:05 crc kubenswrapper[4719]: I1009 15:38:05.645279 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9773c656-c5bc-46a2-a323-d2d310e6a104-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9773c656-c5bc-46a2-a323-d2d310e6a104" (UID: "9773c656-c5bc-46a2-a323-d2d310e6a104"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:38:05 crc kubenswrapper[4719]: I1009 15:38:05.646532 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9773c656-c5bc-46a2-a323-d2d310e6a104-config-data" (OuterVolumeSpecName: "config-data") pod "9773c656-c5bc-46a2-a323-d2d310e6a104" (UID: "9773c656-c5bc-46a2-a323-d2d310e6a104"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:38:05 crc kubenswrapper[4719]: I1009 15:38:05.717922 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9773c656-c5bc-46a2-a323-d2d310e6a104-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:05 crc kubenswrapper[4719]: I1009 15:38:05.718250 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sq6ql\" (UniqueName: \"kubernetes.io/projected/9773c656-c5bc-46a2-a323-d2d310e6a104-kube-api-access-sq6ql\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:05 crc kubenswrapper[4719]: I1009 15:38:05.718262 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9773c656-c5bc-46a2-a323-d2d310e6a104-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:06 crc kubenswrapper[4719]: I1009 15:38:06.463713 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 09 15:38:06 crc kubenswrapper[4719]: I1009 15:38:06.496331 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 09 15:38:06 crc kubenswrapper[4719]: I1009 15:38:06.505421 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 09 15:38:06 crc kubenswrapper[4719]: I1009 15:38:06.526423 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 09 15:38:06 crc kubenswrapper[4719]: E1009 15:38:06.526974 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9773c656-c5bc-46a2-a323-d2d310e6a104" containerName="nova-cell0-conductor-conductor" Oct 09 15:38:06 crc kubenswrapper[4719]: I1009 15:38:06.526995 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="9773c656-c5bc-46a2-a323-d2d310e6a104" containerName="nova-cell0-conductor-conductor" Oct 09 15:38:06 crc kubenswrapper[4719]: I1009 
15:38:06.527236 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="9773c656-c5bc-46a2-a323-d2d310e6a104" containerName="nova-cell0-conductor-conductor" Oct 09 15:38:06 crc kubenswrapper[4719]: I1009 15:38:06.528033 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 09 15:38:06 crc kubenswrapper[4719]: I1009 15:38:06.530856 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-nvqwf" Oct 09 15:38:06 crc kubenswrapper[4719]: I1009 15:38:06.531188 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 09 15:38:06 crc kubenswrapper[4719]: I1009 15:38:06.534466 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 09 15:38:06 crc kubenswrapper[4719]: I1009 15:38:06.551827 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a46282a-5af1-483f-97a4-b96fd855dc00-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1a46282a-5af1-483f-97a4-b96fd855dc00\") " pod="openstack/nova-cell0-conductor-0" Oct 09 15:38:06 crc kubenswrapper[4719]: I1009 15:38:06.551867 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hfgj\" (UniqueName: \"kubernetes.io/projected/1a46282a-5af1-483f-97a4-b96fd855dc00-kube-api-access-8hfgj\") pod \"nova-cell0-conductor-0\" (UID: \"1a46282a-5af1-483f-97a4-b96fd855dc00\") " pod="openstack/nova-cell0-conductor-0" Oct 09 15:38:06 crc kubenswrapper[4719]: I1009 15:38:06.552003 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a46282a-5af1-483f-97a4-b96fd855dc00-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: 
\"1a46282a-5af1-483f-97a4-b96fd855dc00\") " pod="openstack/nova-cell0-conductor-0" Oct 09 15:38:06 crc kubenswrapper[4719]: I1009 15:38:06.654320 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a46282a-5af1-483f-97a4-b96fd855dc00-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1a46282a-5af1-483f-97a4-b96fd855dc00\") " pod="openstack/nova-cell0-conductor-0" Oct 09 15:38:06 crc kubenswrapper[4719]: I1009 15:38:06.654392 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hfgj\" (UniqueName: \"kubernetes.io/projected/1a46282a-5af1-483f-97a4-b96fd855dc00-kube-api-access-8hfgj\") pod \"nova-cell0-conductor-0\" (UID: \"1a46282a-5af1-483f-97a4-b96fd855dc00\") " pod="openstack/nova-cell0-conductor-0" Oct 09 15:38:06 crc kubenswrapper[4719]: I1009 15:38:06.654507 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a46282a-5af1-483f-97a4-b96fd855dc00-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1a46282a-5af1-483f-97a4-b96fd855dc00\") " pod="openstack/nova-cell0-conductor-0" Oct 09 15:38:06 crc kubenswrapper[4719]: I1009 15:38:06.659543 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a46282a-5af1-483f-97a4-b96fd855dc00-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1a46282a-5af1-483f-97a4-b96fd855dc00\") " pod="openstack/nova-cell0-conductor-0" Oct 09 15:38:06 crc kubenswrapper[4719]: I1009 15:38:06.662069 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a46282a-5af1-483f-97a4-b96fd855dc00-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1a46282a-5af1-483f-97a4-b96fd855dc00\") " pod="openstack/nova-cell0-conductor-0" Oct 09 15:38:06 crc kubenswrapper[4719]: 
I1009 15:38:06.671918 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hfgj\" (UniqueName: \"kubernetes.io/projected/1a46282a-5af1-483f-97a4-b96fd855dc00-kube-api-access-8hfgj\") pod \"nova-cell0-conductor-0\" (UID: \"1a46282a-5af1-483f-97a4-b96fd855dc00\") " pod="openstack/nova-cell0-conductor-0" Oct 09 15:38:06 crc kubenswrapper[4719]: I1009 15:38:06.858088 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 09 15:38:06 crc kubenswrapper[4719]: I1009 15:38:06.976833 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 15:38:06 crc kubenswrapper[4719]: I1009 15:38:06.977407 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 15:38:07 crc kubenswrapper[4719]: I1009 15:38:07.172652 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9773c656-c5bc-46a2-a323-d2d310e6a104" path="/var/lib/kubelet/pods/9773c656-c5bc-46a2-a323-d2d310e6a104/volumes" Oct 09 15:38:07 crc kubenswrapper[4719]: I1009 15:38:07.306886 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 09 15:38:07 crc kubenswrapper[4719]: W1009 15:38:07.307156 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a46282a_5af1_483f_97a4_b96fd855dc00.slice/crio-565a416bc075f364c95ca23582bdf7543e48e1b9fdc0937032634cac1bb83d6c 
WatchSource:0}: Error finding container 565a416bc075f364c95ca23582bdf7543e48e1b9fdc0937032634cac1bb83d6c: Status 404 returned error can't find the container with id 565a416bc075f364c95ca23582bdf7543e48e1b9fdc0937032634cac1bb83d6c Oct 09 15:38:07 crc kubenswrapper[4719]: I1009 15:38:07.474772 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1a46282a-5af1-483f-97a4-b96fd855dc00","Type":"ContainerStarted","Data":"565a416bc075f364c95ca23582bdf7543e48e1b9fdc0937032634cac1bb83d6c"} Oct 09 15:38:08 crc kubenswrapper[4719]: I1009 15:38:08.483552 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1a46282a-5af1-483f-97a4-b96fd855dc00","Type":"ContainerStarted","Data":"bddb5a4a500a732f6f5a8d6666418d1eb9cce5a4a932d8a961c4ac7754e09a1e"} Oct 09 15:38:08 crc kubenswrapper[4719]: I1009 15:38:08.483866 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 09 15:38:08 crc kubenswrapper[4719]: I1009 15:38:08.498012 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.497993158 podStartE2EDuration="2.497993158s" podCreationTimestamp="2025-10-09 15:38:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:38:08.496752208 +0000 UTC m=+1194.006463493" watchObservedRunningTime="2025-10-09 15:38:08.497993158 +0000 UTC m=+1194.007704443" Oct 09 15:38:08 crc kubenswrapper[4719]: I1009 15:38:08.841972 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 09 15:38:12 crc kubenswrapper[4719]: I1009 15:38:12.689139 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 15:38:12 crc kubenswrapper[4719]: I1009 15:38:12.689908 4719 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="6b96d4d6-d2f4-4e82-9a9e-4c95e6f5389a" containerName="kube-state-metrics" containerID="cri-o://ef9b73cf72dbe624f8649238f78b7a283967c35b1b38f59eb3ceb64324f2b069" gracePeriod=30 Oct 09 15:38:12 crc kubenswrapper[4719]: I1009 15:38:12.796095 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="6b96d4d6-d2f4-4e82-9a9e-4c95e6f5389a" containerName="kube-state-metrics" probeResult="failure" output="Get \"http://10.217.0.114:8081/readyz\": dial tcp 10.217.0.114:8081: connect: connection refused" Oct 09 15:38:13 crc kubenswrapper[4719]: I1009 15:38:13.163469 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 09 15:38:13 crc kubenswrapper[4719]: I1009 15:38:13.214310 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7hlp\" (UniqueName: \"kubernetes.io/projected/6b96d4d6-d2f4-4e82-9a9e-4c95e6f5389a-kube-api-access-x7hlp\") pod \"6b96d4d6-d2f4-4e82-9a9e-4c95e6f5389a\" (UID: \"6b96d4d6-d2f4-4e82-9a9e-4c95e6f5389a\") " Oct 09 15:38:13 crc kubenswrapper[4719]: I1009 15:38:13.223126 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b96d4d6-d2f4-4e82-9a9e-4c95e6f5389a-kube-api-access-x7hlp" (OuterVolumeSpecName: "kube-api-access-x7hlp") pod "6b96d4d6-d2f4-4e82-9a9e-4c95e6f5389a" (UID: "6b96d4d6-d2f4-4e82-9a9e-4c95e6f5389a"). InnerVolumeSpecName "kube-api-access-x7hlp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:38:13 crc kubenswrapper[4719]: I1009 15:38:13.316214 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7hlp\" (UniqueName: \"kubernetes.io/projected/6b96d4d6-d2f4-4e82-9a9e-4c95e6f5389a-kube-api-access-x7hlp\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:13 crc kubenswrapper[4719]: I1009 15:38:13.534544 4719 generic.go:334] "Generic (PLEG): container finished" podID="6b96d4d6-d2f4-4e82-9a9e-4c95e6f5389a" containerID="ef9b73cf72dbe624f8649238f78b7a283967c35b1b38f59eb3ceb64324f2b069" exitCode=2 Oct 09 15:38:13 crc kubenswrapper[4719]: I1009 15:38:13.534601 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6b96d4d6-d2f4-4e82-9a9e-4c95e6f5389a","Type":"ContainerDied","Data":"ef9b73cf72dbe624f8649238f78b7a283967c35b1b38f59eb3ceb64324f2b069"} Oct 09 15:38:13 crc kubenswrapper[4719]: I1009 15:38:13.534633 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6b96d4d6-d2f4-4e82-9a9e-4c95e6f5389a","Type":"ContainerDied","Data":"54cbd68263ae2fc30f4e03beb2be53a91d482606d5e9f2298c0a7e21e5f0c884"} Oct 09 15:38:13 crc kubenswrapper[4719]: I1009 15:38:13.534653 4719 scope.go:117] "RemoveContainer" containerID="ef9b73cf72dbe624f8649238f78b7a283967c35b1b38f59eb3ceb64324f2b069" Oct 09 15:38:13 crc kubenswrapper[4719]: I1009 15:38:13.534823 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 09 15:38:13 crc kubenswrapper[4719]: I1009 15:38:13.564040 4719 scope.go:117] "RemoveContainer" containerID="ef9b73cf72dbe624f8649238f78b7a283967c35b1b38f59eb3ceb64324f2b069" Oct 09 15:38:13 crc kubenswrapper[4719]: E1009 15:38:13.565649 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef9b73cf72dbe624f8649238f78b7a283967c35b1b38f59eb3ceb64324f2b069\": container with ID starting with ef9b73cf72dbe624f8649238f78b7a283967c35b1b38f59eb3ceb64324f2b069 not found: ID does not exist" containerID="ef9b73cf72dbe624f8649238f78b7a283967c35b1b38f59eb3ceb64324f2b069" Oct 09 15:38:13 crc kubenswrapper[4719]: I1009 15:38:13.565681 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef9b73cf72dbe624f8649238f78b7a283967c35b1b38f59eb3ceb64324f2b069"} err="failed to get container status \"ef9b73cf72dbe624f8649238f78b7a283967c35b1b38f59eb3ceb64324f2b069\": rpc error: code = NotFound desc = could not find container \"ef9b73cf72dbe624f8649238f78b7a283967c35b1b38f59eb3ceb64324f2b069\": container with ID starting with ef9b73cf72dbe624f8649238f78b7a283967c35b1b38f59eb3ceb64324f2b069 not found: ID does not exist" Oct 09 15:38:13 crc kubenswrapper[4719]: I1009 15:38:13.593396 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 15:38:13 crc kubenswrapper[4719]: I1009 15:38:13.609851 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 15:38:13 crc kubenswrapper[4719]: I1009 15:38:13.621338 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 15:38:13 crc kubenswrapper[4719]: E1009 15:38:13.621760 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b96d4d6-d2f4-4e82-9a9e-4c95e6f5389a" containerName="kube-state-metrics" Oct 09 15:38:13 crc 
kubenswrapper[4719]: I1009 15:38:13.621779 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b96d4d6-d2f4-4e82-9a9e-4c95e6f5389a" containerName="kube-state-metrics" Oct 09 15:38:13 crc kubenswrapper[4719]: I1009 15:38:13.621972 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b96d4d6-d2f4-4e82-9a9e-4c95e6f5389a" containerName="kube-state-metrics" Oct 09 15:38:13 crc kubenswrapper[4719]: I1009 15:38:13.622579 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 09 15:38:13 crc kubenswrapper[4719]: I1009 15:38:13.630761 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 15:38:13 crc kubenswrapper[4719]: I1009 15:38:13.631848 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 09 15:38:13 crc kubenswrapper[4719]: I1009 15:38:13.632237 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 09 15:38:13 crc kubenswrapper[4719]: I1009 15:38:13.722506 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8n5c\" (UniqueName: \"kubernetes.io/projected/1f083e47-9fa6-462e-b596-8665719a2e4f-kube-api-access-t8n5c\") pod \"kube-state-metrics-0\" (UID: \"1f083e47-9fa6-462e-b596-8665719a2e4f\") " pod="openstack/kube-state-metrics-0" Oct 09 15:38:13 crc kubenswrapper[4719]: I1009 15:38:13.722900 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f083e47-9fa6-462e-b596-8665719a2e4f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1f083e47-9fa6-462e-b596-8665719a2e4f\") " pod="openstack/kube-state-metrics-0" Oct 09 15:38:13 crc kubenswrapper[4719]: I1009 15:38:13.722941 4719 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f083e47-9fa6-462e-b596-8665719a2e4f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1f083e47-9fa6-462e-b596-8665719a2e4f\") " pod="openstack/kube-state-metrics-0" Oct 09 15:38:13 crc kubenswrapper[4719]: I1009 15:38:13.723131 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1f083e47-9fa6-462e-b596-8665719a2e4f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1f083e47-9fa6-462e-b596-8665719a2e4f\") " pod="openstack/kube-state-metrics-0" Oct 09 15:38:13 crc kubenswrapper[4719]: I1009 15:38:13.825099 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8n5c\" (UniqueName: \"kubernetes.io/projected/1f083e47-9fa6-462e-b596-8665719a2e4f-kube-api-access-t8n5c\") pod \"kube-state-metrics-0\" (UID: \"1f083e47-9fa6-462e-b596-8665719a2e4f\") " pod="openstack/kube-state-metrics-0" Oct 09 15:38:13 crc kubenswrapper[4719]: I1009 15:38:13.825183 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f083e47-9fa6-462e-b596-8665719a2e4f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1f083e47-9fa6-462e-b596-8665719a2e4f\") " pod="openstack/kube-state-metrics-0" Oct 09 15:38:13 crc kubenswrapper[4719]: I1009 15:38:13.825210 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f083e47-9fa6-462e-b596-8665719a2e4f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1f083e47-9fa6-462e-b596-8665719a2e4f\") " pod="openstack/kube-state-metrics-0" Oct 09 15:38:13 crc kubenswrapper[4719]: I1009 15:38:13.825284 4719 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1f083e47-9fa6-462e-b596-8665719a2e4f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1f083e47-9fa6-462e-b596-8665719a2e4f\") " pod="openstack/kube-state-metrics-0" Oct 09 15:38:13 crc kubenswrapper[4719]: I1009 15:38:13.835217 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1f083e47-9fa6-462e-b596-8665719a2e4f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1f083e47-9fa6-462e-b596-8665719a2e4f\") " pod="openstack/kube-state-metrics-0" Oct 09 15:38:13 crc kubenswrapper[4719]: I1009 15:38:13.835370 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f083e47-9fa6-462e-b596-8665719a2e4f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1f083e47-9fa6-462e-b596-8665719a2e4f\") " pod="openstack/kube-state-metrics-0" Oct 09 15:38:13 crc kubenswrapper[4719]: I1009 15:38:13.835852 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f083e47-9fa6-462e-b596-8665719a2e4f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1f083e47-9fa6-462e-b596-8665719a2e4f\") " pod="openstack/kube-state-metrics-0" Oct 09 15:38:13 crc kubenswrapper[4719]: I1009 15:38:13.844930 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8n5c\" (UniqueName: \"kubernetes.io/projected/1f083e47-9fa6-462e-b596-8665719a2e4f-kube-api-access-t8n5c\") pod \"kube-state-metrics-0\" (UID: \"1f083e47-9fa6-462e-b596-8665719a2e4f\") " pod="openstack/kube-state-metrics-0" Oct 09 15:38:13 crc kubenswrapper[4719]: I1009 15:38:13.938536 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 09 15:38:14 crc kubenswrapper[4719]: I1009 15:38:14.445090 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 15:38:14 crc kubenswrapper[4719]: W1009 15:38:14.449304 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f083e47_9fa6_462e_b596_8665719a2e4f.slice/crio-07a6522f8214200b859137aa4bd4e31a4cb016d7139054d7777d2d8efa9dfc8c WatchSource:0}: Error finding container 07a6522f8214200b859137aa4bd4e31a4cb016d7139054d7777d2d8efa9dfc8c: Status 404 returned error can't find the container with id 07a6522f8214200b859137aa4bd4e31a4cb016d7139054d7777d2d8efa9dfc8c Oct 09 15:38:14 crc kubenswrapper[4719]: I1009 15:38:14.452646 4719 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 15:38:14 crc kubenswrapper[4719]: I1009 15:38:14.544923 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1f083e47-9fa6-462e-b596-8665719a2e4f","Type":"ContainerStarted","Data":"07a6522f8214200b859137aa4bd4e31a4cb016d7139054d7777d2d8efa9dfc8c"} Oct 09 15:38:14 crc kubenswrapper[4719]: I1009 15:38:14.939948 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 15:38:14 crc kubenswrapper[4719]: I1009 15:38:14.940582 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4369d7cd-4d06-428f-aa0e-fd0a1ed300ac" containerName="ceilometer-central-agent" containerID="cri-o://568061fd1344d0ed2923d54e50ae80d71d28d0e14a068820f5b86efe7fe5ccd5" gracePeriod=30 Oct 09 15:38:14 crc kubenswrapper[4719]: I1009 15:38:14.941063 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4369d7cd-4d06-428f-aa0e-fd0a1ed300ac" containerName="proxy-httpd" 
containerID="cri-o://21eb77e81e5c5c5e92cee62aa6fea977469a904b378cbe5b5b62b7b1d513d52d" gracePeriod=30 Oct 09 15:38:14 crc kubenswrapper[4719]: I1009 15:38:14.941124 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4369d7cd-4d06-428f-aa0e-fd0a1ed300ac" containerName="sg-core" containerID="cri-o://29c519552f94b7ac4278dd882695b28b9ce3f8f5723bc75f81a2242c39a20575" gracePeriod=30 Oct 09 15:38:14 crc kubenswrapper[4719]: I1009 15:38:14.941174 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4369d7cd-4d06-428f-aa0e-fd0a1ed300ac" containerName="ceilometer-notification-agent" containerID="cri-o://fd0da4fa978485b84d3fe1df33b378fadca4abcda86474c9e8497fe666c6a6b2" gracePeriod=30 Oct 09 15:38:15 crc kubenswrapper[4719]: I1009 15:38:15.181440 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b96d4d6-d2f4-4e82-9a9e-4c95e6f5389a" path="/var/lib/kubelet/pods/6b96d4d6-d2f4-4e82-9a9e-4c95e6f5389a/volumes" Oct 09 15:38:15 crc kubenswrapper[4719]: I1009 15:38:15.559096 4719 generic.go:334] "Generic (PLEG): container finished" podID="4369d7cd-4d06-428f-aa0e-fd0a1ed300ac" containerID="21eb77e81e5c5c5e92cee62aa6fea977469a904b378cbe5b5b62b7b1d513d52d" exitCode=0 Oct 09 15:38:15 crc kubenswrapper[4719]: I1009 15:38:15.559371 4719 generic.go:334] "Generic (PLEG): container finished" podID="4369d7cd-4d06-428f-aa0e-fd0a1ed300ac" containerID="29c519552f94b7ac4278dd882695b28b9ce3f8f5723bc75f81a2242c39a20575" exitCode=2 Oct 09 15:38:15 crc kubenswrapper[4719]: I1009 15:38:15.559305 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac","Type":"ContainerDied","Data":"21eb77e81e5c5c5e92cee62aa6fea977469a904b378cbe5b5b62b7b1d513d52d"} Oct 09 15:38:15 crc kubenswrapper[4719]: I1009 15:38:15.559443 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac","Type":"ContainerDied","Data":"29c519552f94b7ac4278dd882695b28b9ce3f8f5723bc75f81a2242c39a20575"} Oct 09 15:38:15 crc kubenswrapper[4719]: I1009 15:38:15.561539 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1f083e47-9fa6-462e-b596-8665719a2e4f","Type":"ContainerStarted","Data":"c1c32a942ba925001cda3af5bae4fa96a7a85c38e104239a6d69b9f972dcfbc5"} Oct 09 15:38:15 crc kubenswrapper[4719]: I1009 15:38:15.562397 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 09 15:38:15 crc kubenswrapper[4719]: I1009 15:38:15.592452 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.230659248 podStartE2EDuration="2.592429724s" podCreationTimestamp="2025-10-09 15:38:13 +0000 UTC" firstStartedPulling="2025-10-09 15:38:14.452453062 +0000 UTC m=+1199.962164347" lastFinishedPulling="2025-10-09 15:38:14.814223538 +0000 UTC m=+1200.323934823" observedRunningTime="2025-10-09 15:38:15.580262076 +0000 UTC m=+1201.089973361" watchObservedRunningTime="2025-10-09 15:38:15.592429724 +0000 UTC m=+1201.102141019" Oct 09 15:38:16 crc kubenswrapper[4719]: I1009 15:38:16.573719 4719 generic.go:334] "Generic (PLEG): container finished" podID="4369d7cd-4d06-428f-aa0e-fd0a1ed300ac" containerID="568061fd1344d0ed2923d54e50ae80d71d28d0e14a068820f5b86efe7fe5ccd5" exitCode=0 Oct 09 15:38:16 crc kubenswrapper[4719]: I1009 15:38:16.573790 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac","Type":"ContainerDied","Data":"568061fd1344d0ed2923d54e50ae80d71d28d0e14a068820f5b86efe7fe5ccd5"} Oct 09 15:38:16 crc kubenswrapper[4719]: I1009 15:38:16.889700 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" 
Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.057429 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.100211 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-run-httpd\") pod \"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac\" (UID: \"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac\") " Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.100281 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-scripts\") pod \"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac\" (UID: \"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac\") " Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.100374 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-sg-core-conf-yaml\") pod \"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac\" (UID: \"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac\") " Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.100402 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-combined-ca-bundle\") pod \"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac\" (UID: \"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac\") " Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.100460 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-config-data\") pod \"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac\" (UID: \"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac\") " Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.100491 4719 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cd5h\" (UniqueName: \"kubernetes.io/projected/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-kube-api-access-8cd5h\") pod \"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac\" (UID: \"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac\") " Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.100570 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-log-httpd\") pod \"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac\" (UID: \"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac\") " Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.100571 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4369d7cd-4d06-428f-aa0e-fd0a1ed300ac" (UID: "4369d7cd-4d06-428f-aa0e-fd0a1ed300ac"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.112590 4719 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.113747 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4369d7cd-4d06-428f-aa0e-fd0a1ed300ac" (UID: "4369d7cd-4d06-428f-aa0e-fd0a1ed300ac"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.129635 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-scripts" (OuterVolumeSpecName: "scripts") pod "4369d7cd-4d06-428f-aa0e-fd0a1ed300ac" (UID: "4369d7cd-4d06-428f-aa0e-fd0a1ed300ac"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.130703 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-kube-api-access-8cd5h" (OuterVolumeSpecName: "kube-api-access-8cd5h") pod "4369d7cd-4d06-428f-aa0e-fd0a1ed300ac" (UID: "4369d7cd-4d06-428f-aa0e-fd0a1ed300ac"). InnerVolumeSpecName "kube-api-access-8cd5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.221750 4719 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.221784 4719 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.221797 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cd5h\" (UniqueName: \"kubernetes.io/projected/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-kube-api-access-8cd5h\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.247570 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod 
"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac" (UID: "4369d7cd-4d06-428f-aa0e-fd0a1ed300ac"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.325385 4719 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.372118 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-config-data" (OuterVolumeSpecName: "config-data") pod "4369d7cd-4d06-428f-aa0e-fd0a1ed300ac" (UID: "4369d7cd-4d06-428f-aa0e-fd0a1ed300ac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.388053 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4369d7cd-4d06-428f-aa0e-fd0a1ed300ac" (UID: "4369d7cd-4d06-428f-aa0e-fd0a1ed300ac"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.426606 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.426643 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.542036 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-9pqdd"] Oct 09 15:38:17 crc kubenswrapper[4719]: E1009 15:38:17.542414 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4369d7cd-4d06-428f-aa0e-fd0a1ed300ac" containerName="ceilometer-central-agent" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.542430 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="4369d7cd-4d06-428f-aa0e-fd0a1ed300ac" containerName="ceilometer-central-agent" Oct 09 15:38:17 crc kubenswrapper[4719]: E1009 15:38:17.542457 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4369d7cd-4d06-428f-aa0e-fd0a1ed300ac" containerName="sg-core" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.542463 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="4369d7cd-4d06-428f-aa0e-fd0a1ed300ac" containerName="sg-core" Oct 09 15:38:17 crc kubenswrapper[4719]: E1009 15:38:17.542474 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4369d7cd-4d06-428f-aa0e-fd0a1ed300ac" containerName="proxy-httpd" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.542481 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="4369d7cd-4d06-428f-aa0e-fd0a1ed300ac" containerName="proxy-httpd" Oct 09 15:38:17 crc kubenswrapper[4719]: E1009 15:38:17.542490 4719 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4369d7cd-4d06-428f-aa0e-fd0a1ed300ac" containerName="ceilometer-notification-agent" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.542496 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="4369d7cd-4d06-428f-aa0e-fd0a1ed300ac" containerName="ceilometer-notification-agent" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.542678 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="4369d7cd-4d06-428f-aa0e-fd0a1ed300ac" containerName="ceilometer-central-agent" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.542692 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="4369d7cd-4d06-428f-aa0e-fd0a1ed300ac" containerName="sg-core" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.542706 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="4369d7cd-4d06-428f-aa0e-fd0a1ed300ac" containerName="proxy-httpd" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.542719 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="4369d7cd-4d06-428f-aa0e-fd0a1ed300ac" containerName="ceilometer-notification-agent" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.543365 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9pqdd" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.545674 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.547130 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.557071 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-9pqdd"] Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.593845 4719 generic.go:334] "Generic (PLEG): container finished" podID="4369d7cd-4d06-428f-aa0e-fd0a1ed300ac" containerID="fd0da4fa978485b84d3fe1df33b378fadca4abcda86474c9e8497fe666c6a6b2" exitCode=0 Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.593916 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac","Type":"ContainerDied","Data":"fd0da4fa978485b84d3fe1df33b378fadca4abcda86474c9e8497fe666c6a6b2"} Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.593940 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4369d7cd-4d06-428f-aa0e-fd0a1ed300ac","Type":"ContainerDied","Data":"121136afb18934c1d1c11c4dd6ca17a8890db9ae4cdb667a53f0f899a5b2289c"} Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.593962 4719 scope.go:117] "RemoveContainer" containerID="21eb77e81e5c5c5e92cee62aa6fea977469a904b378cbe5b5b62b7b1d513d52d" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.594107 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.628337 4719 scope.go:117] "RemoveContainer" containerID="29c519552f94b7ac4278dd882695b28b9ce3f8f5723bc75f81a2242c39a20575" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.631550 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04e99f7a-bb5e-41c0-a55a-02b671a69ad8-config-data\") pod \"nova-cell0-cell-mapping-9pqdd\" (UID: \"04e99f7a-bb5e-41c0-a55a-02b671a69ad8\") " pod="openstack/nova-cell0-cell-mapping-9pqdd" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.631607 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04e99f7a-bb5e-41c0-a55a-02b671a69ad8-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9pqdd\" (UID: \"04e99f7a-bb5e-41c0-a55a-02b671a69ad8\") " pod="openstack/nova-cell0-cell-mapping-9pqdd" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.631776 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04e99f7a-bb5e-41c0-a55a-02b671a69ad8-scripts\") pod \"nova-cell0-cell-mapping-9pqdd\" (UID: \"04e99f7a-bb5e-41c0-a55a-02b671a69ad8\") " pod="openstack/nova-cell0-cell-mapping-9pqdd" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.631833 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smmbn\" (UniqueName: \"kubernetes.io/projected/04e99f7a-bb5e-41c0-a55a-02b671a69ad8-kube-api-access-smmbn\") pod \"nova-cell0-cell-mapping-9pqdd\" (UID: \"04e99f7a-bb5e-41c0-a55a-02b671a69ad8\") " pod="openstack/nova-cell0-cell-mapping-9pqdd" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.644368 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ceilometer-0"] Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.662890 4719 scope.go:117] "RemoveContainer" containerID="fd0da4fa978485b84d3fe1df33b378fadca4abcda86474c9e8497fe666c6a6b2" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.666407 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.699577 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.702182 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.712670 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.712762 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.712971 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.724634 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.726982 4719 scope.go:117] "RemoveContainer" containerID="568061fd1344d0ed2923d54e50ae80d71d28d0e14a068820f5b86efe7fe5ccd5" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.733166 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04e99f7a-bb5e-41c0-a55a-02b671a69ad8-config-data\") pod \"nova-cell0-cell-mapping-9pqdd\" (UID: \"04e99f7a-bb5e-41c0-a55a-02b671a69ad8\") " pod="openstack/nova-cell0-cell-mapping-9pqdd" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.733204 4719 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04e99f7a-bb5e-41c0-a55a-02b671a69ad8-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9pqdd\" (UID: \"04e99f7a-bb5e-41c0-a55a-02b671a69ad8\") " pod="openstack/nova-cell0-cell-mapping-9pqdd" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.733319 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04e99f7a-bb5e-41c0-a55a-02b671a69ad8-scripts\") pod \"nova-cell0-cell-mapping-9pqdd\" (UID: \"04e99f7a-bb5e-41c0-a55a-02b671a69ad8\") " pod="openstack/nova-cell0-cell-mapping-9pqdd" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.733375 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smmbn\" (UniqueName: \"kubernetes.io/projected/04e99f7a-bb5e-41c0-a55a-02b671a69ad8-kube-api-access-smmbn\") pod \"nova-cell0-cell-mapping-9pqdd\" (UID: \"04e99f7a-bb5e-41c0-a55a-02b671a69ad8\") " pod="openstack/nova-cell0-cell-mapping-9pqdd" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.739263 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04e99f7a-bb5e-41c0-a55a-02b671a69ad8-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9pqdd\" (UID: \"04e99f7a-bb5e-41c0-a55a-02b671a69ad8\") " pod="openstack/nova-cell0-cell-mapping-9pqdd" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.741367 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04e99f7a-bb5e-41c0-a55a-02b671a69ad8-config-data\") pod \"nova-cell0-cell-mapping-9pqdd\" (UID: \"04e99f7a-bb5e-41c0-a55a-02b671a69ad8\") " pod="openstack/nova-cell0-cell-mapping-9pqdd" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.742611 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/04e99f7a-bb5e-41c0-a55a-02b671a69ad8-scripts\") pod \"nova-cell0-cell-mapping-9pqdd\" (UID: \"04e99f7a-bb5e-41c0-a55a-02b671a69ad8\") " pod="openstack/nova-cell0-cell-mapping-9pqdd" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.757504 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.760308 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.768268 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.791401 4719 scope.go:117] "RemoveContainer" containerID="21eb77e81e5c5c5e92cee62aa6fea977469a904b378cbe5b5b62b7b1d513d52d" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.791864 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smmbn\" (UniqueName: \"kubernetes.io/projected/04e99f7a-bb5e-41c0-a55a-02b671a69ad8-kube-api-access-smmbn\") pod \"nova-cell0-cell-mapping-9pqdd\" (UID: \"04e99f7a-bb5e-41c0-a55a-02b671a69ad8\") " pod="openstack/nova-cell0-cell-mapping-9pqdd" Oct 09 15:38:17 crc kubenswrapper[4719]: E1009 15:38:17.793004 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21eb77e81e5c5c5e92cee62aa6fea977469a904b378cbe5b5b62b7b1d513d52d\": container with ID starting with 21eb77e81e5c5c5e92cee62aa6fea977469a904b378cbe5b5b62b7b1d513d52d not found: ID does not exist" containerID="21eb77e81e5c5c5e92cee62aa6fea977469a904b378cbe5b5b62b7b1d513d52d" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.793035 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21eb77e81e5c5c5e92cee62aa6fea977469a904b378cbe5b5b62b7b1d513d52d"} err="failed to get container status 
\"21eb77e81e5c5c5e92cee62aa6fea977469a904b378cbe5b5b62b7b1d513d52d\": rpc error: code = NotFound desc = could not find container \"21eb77e81e5c5c5e92cee62aa6fea977469a904b378cbe5b5b62b7b1d513d52d\": container with ID starting with 21eb77e81e5c5c5e92cee62aa6fea977469a904b378cbe5b5b62b7b1d513d52d not found: ID does not exist" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.793061 4719 scope.go:117] "RemoveContainer" containerID="29c519552f94b7ac4278dd882695b28b9ce3f8f5723bc75f81a2242c39a20575" Oct 09 15:38:17 crc kubenswrapper[4719]: E1009 15:38:17.793293 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29c519552f94b7ac4278dd882695b28b9ce3f8f5723bc75f81a2242c39a20575\": container with ID starting with 29c519552f94b7ac4278dd882695b28b9ce3f8f5723bc75f81a2242c39a20575 not found: ID does not exist" containerID="29c519552f94b7ac4278dd882695b28b9ce3f8f5723bc75f81a2242c39a20575" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.793394 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29c519552f94b7ac4278dd882695b28b9ce3f8f5723bc75f81a2242c39a20575"} err="failed to get container status \"29c519552f94b7ac4278dd882695b28b9ce3f8f5723bc75f81a2242c39a20575\": rpc error: code = NotFound desc = could not find container \"29c519552f94b7ac4278dd882695b28b9ce3f8f5723bc75f81a2242c39a20575\": container with ID starting with 29c519552f94b7ac4278dd882695b28b9ce3f8f5723bc75f81a2242c39a20575 not found: ID does not exist" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.793472 4719 scope.go:117] "RemoveContainer" containerID="fd0da4fa978485b84d3fe1df33b378fadca4abcda86474c9e8497fe666c6a6b2" Oct 09 15:38:17 crc kubenswrapper[4719]: E1009 15:38:17.793758 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"fd0da4fa978485b84d3fe1df33b378fadca4abcda86474c9e8497fe666c6a6b2\": container with ID starting with fd0da4fa978485b84d3fe1df33b378fadca4abcda86474c9e8497fe666c6a6b2 not found: ID does not exist" containerID="fd0da4fa978485b84d3fe1df33b378fadca4abcda86474c9e8497fe666c6a6b2" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.793836 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd0da4fa978485b84d3fe1df33b378fadca4abcda86474c9e8497fe666c6a6b2"} err="failed to get container status \"fd0da4fa978485b84d3fe1df33b378fadca4abcda86474c9e8497fe666c6a6b2\": rpc error: code = NotFound desc = could not find container \"fd0da4fa978485b84d3fe1df33b378fadca4abcda86474c9e8497fe666c6a6b2\": container with ID starting with fd0da4fa978485b84d3fe1df33b378fadca4abcda86474c9e8497fe666c6a6b2 not found: ID does not exist" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.793907 4719 scope.go:117] "RemoveContainer" containerID="568061fd1344d0ed2923d54e50ae80d71d28d0e14a068820f5b86efe7fe5ccd5" Oct 09 15:38:17 crc kubenswrapper[4719]: E1009 15:38:17.794141 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"568061fd1344d0ed2923d54e50ae80d71d28d0e14a068820f5b86efe7fe5ccd5\": container with ID starting with 568061fd1344d0ed2923d54e50ae80d71d28d0e14a068820f5b86efe7fe5ccd5 not found: ID does not exist" containerID="568061fd1344d0ed2923d54e50ae80d71d28d0e14a068820f5b86efe7fe5ccd5" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.794165 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"568061fd1344d0ed2923d54e50ae80d71d28d0e14a068820f5b86efe7fe5ccd5"} err="failed to get container status \"568061fd1344d0ed2923d54e50ae80d71d28d0e14a068820f5b86efe7fe5ccd5\": rpc error: code = NotFound desc = could not find container \"568061fd1344d0ed2923d54e50ae80d71d28d0e14a068820f5b86efe7fe5ccd5\": container with ID 
starting with 568061fd1344d0ed2923d54e50ae80d71d28d0e14a068820f5b86efe7fe5ccd5 not found: ID does not exist" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.838167 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45da18fe-c307-40ff-8a91-c86ba32c516d-logs\") pod \"nova-api-0\" (UID: \"45da18fe-c307-40ff-8a91-c86ba32c516d\") " pod="openstack/nova-api-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.838251 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43123f9f-e7f7-4d3a-8856-18d324f55254-config-data\") pod \"ceilometer-0\" (UID: \"43123f9f-e7f7-4d3a-8856-18d324f55254\") " pod="openstack/ceilometer-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.838288 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/43123f9f-e7f7-4d3a-8856-18d324f55254-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"43123f9f-e7f7-4d3a-8856-18d324f55254\") " pod="openstack/ceilometer-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.838335 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43123f9f-e7f7-4d3a-8856-18d324f55254-log-httpd\") pod \"ceilometer-0\" (UID: \"43123f9f-e7f7-4d3a-8856-18d324f55254\") " pod="openstack/ceilometer-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.838471 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwrsr\" (UniqueName: \"kubernetes.io/projected/45da18fe-c307-40ff-8a91-c86ba32c516d-kube-api-access-kwrsr\") pod \"nova-api-0\" (UID: \"45da18fe-c307-40ff-8a91-c86ba32c516d\") " pod="openstack/nova-api-0" Oct 09 15:38:17 crc kubenswrapper[4719]: 
I1009 15:38:17.838513 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45da18fe-c307-40ff-8a91-c86ba32c516d-config-data\") pod \"nova-api-0\" (UID: \"45da18fe-c307-40ff-8a91-c86ba32c516d\") " pod="openstack/nova-api-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.838716 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43123f9f-e7f7-4d3a-8856-18d324f55254-run-httpd\") pod \"ceilometer-0\" (UID: \"43123f9f-e7f7-4d3a-8856-18d324f55254\") " pod="openstack/ceilometer-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.838738 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/43123f9f-e7f7-4d3a-8856-18d324f55254-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"43123f9f-e7f7-4d3a-8856-18d324f55254\") " pod="openstack/ceilometer-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.838925 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43123f9f-e7f7-4d3a-8856-18d324f55254-scripts\") pod \"ceilometer-0\" (UID: \"43123f9f-e7f7-4d3a-8856-18d324f55254\") " pod="openstack/ceilometer-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.839040 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43123f9f-e7f7-4d3a-8856-18d324f55254-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"43123f9f-e7f7-4d3a-8856-18d324f55254\") " pod="openstack/ceilometer-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.839062 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-kjj8t\" (UniqueName: \"kubernetes.io/projected/43123f9f-e7f7-4d3a-8856-18d324f55254-kube-api-access-kjj8t\") pod \"ceilometer-0\" (UID: \"43123f9f-e7f7-4d3a-8856-18d324f55254\") " pod="openstack/ceilometer-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.839105 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45da18fe-c307-40ff-8a91-c86ba32c516d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"45da18fe-c307-40ff-8a91-c86ba32c516d\") " pod="openstack/nova-api-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.840935 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.884279 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9pqdd" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.894399 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.895728 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.905779 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.906620 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.940967 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwrsr\" (UniqueName: \"kubernetes.io/projected/45da18fe-c307-40ff-8a91-c86ba32c516d-kube-api-access-kwrsr\") pod \"nova-api-0\" (UID: \"45da18fe-c307-40ff-8a91-c86ba32c516d\") " pod="openstack/nova-api-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.941230 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45da18fe-c307-40ff-8a91-c86ba32c516d-config-data\") pod \"nova-api-0\" (UID: \"45da18fe-c307-40ff-8a91-c86ba32c516d\") " pod="openstack/nova-api-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.941466 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43123f9f-e7f7-4d3a-8856-18d324f55254-run-httpd\") pod \"ceilometer-0\" (UID: \"43123f9f-e7f7-4d3a-8856-18d324f55254\") " pod="openstack/ceilometer-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.941574 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/43123f9f-e7f7-4d3a-8856-18d324f55254-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"43123f9f-e7f7-4d3a-8856-18d324f55254\") " pod="openstack/ceilometer-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.941701 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc447\" 
(UniqueName: \"kubernetes.io/projected/fee8c430-e8cc-47a8-8ec3-4594e0500a4f-kube-api-access-hc447\") pod \"nova-scheduler-0\" (UID: \"fee8c430-e8cc-47a8-8ec3-4594e0500a4f\") " pod="openstack/nova-scheduler-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.941794 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43123f9f-e7f7-4d3a-8856-18d324f55254-scripts\") pod \"ceilometer-0\" (UID: \"43123f9f-e7f7-4d3a-8856-18d324f55254\") " pod="openstack/ceilometer-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.941895 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee8c430-e8cc-47a8-8ec3-4594e0500a4f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fee8c430-e8cc-47a8-8ec3-4594e0500a4f\") " pod="openstack/nova-scheduler-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.941996 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43123f9f-e7f7-4d3a-8856-18d324f55254-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"43123f9f-e7f7-4d3a-8856-18d324f55254\") " pod="openstack/ceilometer-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.942081 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjj8t\" (UniqueName: \"kubernetes.io/projected/43123f9f-e7f7-4d3a-8856-18d324f55254-kube-api-access-kjj8t\") pod \"ceilometer-0\" (UID: \"43123f9f-e7f7-4d3a-8856-18d324f55254\") " pod="openstack/ceilometer-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.942179 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45da18fe-c307-40ff-8a91-c86ba32c516d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"45da18fe-c307-40ff-8a91-c86ba32c516d\") 
" pod="openstack/nova-api-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.942284 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45da18fe-c307-40ff-8a91-c86ba32c516d-logs\") pod \"nova-api-0\" (UID: \"45da18fe-c307-40ff-8a91-c86ba32c516d\") " pod="openstack/nova-api-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.942396 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43123f9f-e7f7-4d3a-8856-18d324f55254-config-data\") pod \"ceilometer-0\" (UID: \"43123f9f-e7f7-4d3a-8856-18d324f55254\") " pod="openstack/ceilometer-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.942496 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/43123f9f-e7f7-4d3a-8856-18d324f55254-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"43123f9f-e7f7-4d3a-8856-18d324f55254\") " pod="openstack/ceilometer-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.942574 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43123f9f-e7f7-4d3a-8856-18d324f55254-log-httpd\") pod \"ceilometer-0\" (UID: \"43123f9f-e7f7-4d3a-8856-18d324f55254\") " pod="openstack/ceilometer-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.942657 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fee8c430-e8cc-47a8-8ec3-4594e0500a4f-config-data\") pod \"nova-scheduler-0\" (UID: \"fee8c430-e8cc-47a8-8ec3-4594e0500a4f\") " pod="openstack/nova-scheduler-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.947992 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/43123f9f-e7f7-4d3a-8856-18d324f55254-log-httpd\") pod \"ceilometer-0\" (UID: \"43123f9f-e7f7-4d3a-8856-18d324f55254\") " pod="openstack/ceilometer-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.960768 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43123f9f-e7f7-4d3a-8856-18d324f55254-run-httpd\") pod \"ceilometer-0\" (UID: \"43123f9f-e7f7-4d3a-8856-18d324f55254\") " pod="openstack/ceilometer-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.961119 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45da18fe-c307-40ff-8a91-c86ba32c516d-logs\") pod \"nova-api-0\" (UID: \"45da18fe-c307-40ff-8a91-c86ba32c516d\") " pod="openstack/nova-api-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.981206 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45da18fe-c307-40ff-8a91-c86ba32c516d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"45da18fe-c307-40ff-8a91-c86ba32c516d\") " pod="openstack/nova-api-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.981564 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43123f9f-e7f7-4d3a-8856-18d324f55254-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"43123f9f-e7f7-4d3a-8856-18d324f55254\") " pod="openstack/ceilometer-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.982155 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwrsr\" (UniqueName: \"kubernetes.io/projected/45da18fe-c307-40ff-8a91-c86ba32c516d-kube-api-access-kwrsr\") pod \"nova-api-0\" (UID: \"45da18fe-c307-40ff-8a91-c86ba32c516d\") " pod="openstack/nova-api-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.982747 4719 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43123f9f-e7f7-4d3a-8856-18d324f55254-scripts\") pod \"ceilometer-0\" (UID: \"43123f9f-e7f7-4d3a-8856-18d324f55254\") " pod="openstack/ceilometer-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.986930 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43123f9f-e7f7-4d3a-8856-18d324f55254-config-data\") pod \"ceilometer-0\" (UID: \"43123f9f-e7f7-4d3a-8856-18d324f55254\") " pod="openstack/ceilometer-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.987039 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45da18fe-c307-40ff-8a91-c86ba32c516d-config-data\") pod \"nova-api-0\" (UID: \"45da18fe-c307-40ff-8a91-c86ba32c516d\") " pod="openstack/nova-api-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.987321 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/43123f9f-e7f7-4d3a-8856-18d324f55254-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"43123f9f-e7f7-4d3a-8856-18d324f55254\") " pod="openstack/ceilometer-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.988814 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjj8t\" (UniqueName: \"kubernetes.io/projected/43123f9f-e7f7-4d3a-8856-18d324f55254-kube-api-access-kjj8t\") pod \"ceilometer-0\" (UID: \"43123f9f-e7f7-4d3a-8856-18d324f55254\") " pod="openstack/ceilometer-0" Oct 09 15:38:17 crc kubenswrapper[4719]: I1009 15:38:17.990618 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/43123f9f-e7f7-4d3a-8856-18d324f55254-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"43123f9f-e7f7-4d3a-8856-18d324f55254\") " pod="openstack/ceilometer-0" Oct 09 
15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.043632 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.045249 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc447\" (UniqueName: \"kubernetes.io/projected/fee8c430-e8cc-47a8-8ec3-4594e0500a4f-kube-api-access-hc447\") pod \"nova-scheduler-0\" (UID: \"fee8c430-e8cc-47a8-8ec3-4594e0500a4f\") " pod="openstack/nova-scheduler-0" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.045328 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee8c430-e8cc-47a8-8ec3-4594e0500a4f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fee8c430-e8cc-47a8-8ec3-4594e0500a4f\") " pod="openstack/nova-scheduler-0" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.045413 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fee8c430-e8cc-47a8-8ec3-4594e0500a4f-config-data\") pod \"nova-scheduler-0\" (UID: \"fee8c430-e8cc-47a8-8ec3-4594e0500a4f\") " pod="openstack/nova-scheduler-0" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.052036 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fee8c430-e8cc-47a8-8ec3-4594e0500a4f-config-data\") pod \"nova-scheduler-0\" (UID: \"fee8c430-e8cc-47a8-8ec3-4594e0500a4f\") " pod="openstack/nova-scheduler-0" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.058078 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee8c430-e8cc-47a8-8ec3-4594e0500a4f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fee8c430-e8cc-47a8-8ec3-4594e0500a4f\") " pod="openstack/nova-scheduler-0" Oct 09 15:38:18 crc 
kubenswrapper[4719]: I1009 15:38:18.093949 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc447\" (UniqueName: \"kubernetes.io/projected/fee8c430-e8cc-47a8-8ec3-4594e0500a4f-kube-api-access-hc447\") pod \"nova-scheduler-0\" (UID: \"fee8c430-e8cc-47a8-8ec3-4594e0500a4f\") " pod="openstack/nova-scheduler-0" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.100422 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.102083 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.103592 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.105139 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.117337 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.124564 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.126748 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.130611 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.138826 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.150867 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb00a64b-377d-450c-81df-408b0790b21f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"eb00a64b-377d-450c-81df-408b0790b21f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.150933 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4hxq\" (UniqueName: \"kubernetes.io/projected/21278307-5a9e-4e08-92ef-c542da277f23-kube-api-access-s4hxq\") pod \"nova-metadata-0\" (UID: \"21278307-5a9e-4e08-92ef-c542da277f23\") " pod="openstack/nova-metadata-0" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.150958 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21278307-5a9e-4e08-92ef-c542da277f23-logs\") pod \"nova-metadata-0\" (UID: \"21278307-5a9e-4e08-92ef-c542da277f23\") " pod="openstack/nova-metadata-0" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.150977 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf4lb\" (UniqueName: \"kubernetes.io/projected/eb00a64b-377d-450c-81df-408b0790b21f-kube-api-access-cf4lb\") pod \"nova-cell1-novncproxy-0\" (UID: \"eb00a64b-377d-450c-81df-408b0790b21f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 
15:38:18.151001 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb00a64b-377d-450c-81df-408b0790b21f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"eb00a64b-377d-450c-81df-408b0790b21f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.151019 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21278307-5a9e-4e08-92ef-c542da277f23-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"21278307-5a9e-4e08-92ef-c542da277f23\") " pod="openstack/nova-metadata-0" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.151067 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21278307-5a9e-4e08-92ef-c542da277f23-config-data\") pod \"nova-metadata-0\" (UID: \"21278307-5a9e-4e08-92ef-c542da277f23\") " pod="openstack/nova-metadata-0" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.163295 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58b454788c-qkz7q"] Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.165202 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58b454788c-qkz7q" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.200525 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58b454788c-qkz7q"] Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.238130 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.256332 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21278307-5a9e-4e08-92ef-c542da277f23-config-data\") pod \"nova-metadata-0\" (UID: \"21278307-5a9e-4e08-92ef-c542da277f23\") " pod="openstack/nova-metadata-0" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.256399 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa17763b-47cf-4305-a42e-4f43fa08e189-ovsdbserver-sb\") pod \"dnsmasq-dns-58b454788c-qkz7q\" (UID: \"aa17763b-47cf-4305-a42e-4f43fa08e189\") " pod="openstack/dnsmasq-dns-58b454788c-qkz7q" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.256455 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa17763b-47cf-4305-a42e-4f43fa08e189-dns-svc\") pod \"dnsmasq-dns-58b454788c-qkz7q\" (UID: \"aa17763b-47cf-4305-a42e-4f43fa08e189\") " pod="openstack/dnsmasq-dns-58b454788c-qkz7q" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.256487 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa17763b-47cf-4305-a42e-4f43fa08e189-ovsdbserver-nb\") pod \"dnsmasq-dns-58b454788c-qkz7q\" (UID: \"aa17763b-47cf-4305-a42e-4f43fa08e189\") " pod="openstack/dnsmasq-dns-58b454788c-qkz7q" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.256506 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa17763b-47cf-4305-a42e-4f43fa08e189-config\") pod \"dnsmasq-dns-58b454788c-qkz7q\" (UID: \"aa17763b-47cf-4305-a42e-4f43fa08e189\") " 
pod="openstack/dnsmasq-dns-58b454788c-qkz7q" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.262315 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb00a64b-377d-450c-81df-408b0790b21f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"eb00a64b-377d-450c-81df-408b0790b21f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.262441 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa17763b-47cf-4305-a42e-4f43fa08e189-dns-swift-storage-0\") pod \"dnsmasq-dns-58b454788c-qkz7q\" (UID: \"aa17763b-47cf-4305-a42e-4f43fa08e189\") " pod="openstack/dnsmasq-dns-58b454788c-qkz7q" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.262502 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4hxq\" (UniqueName: \"kubernetes.io/projected/21278307-5a9e-4e08-92ef-c542da277f23-kube-api-access-s4hxq\") pod \"nova-metadata-0\" (UID: \"21278307-5a9e-4e08-92ef-c542da277f23\") " pod="openstack/nova-metadata-0" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.262533 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21278307-5a9e-4e08-92ef-c542da277f23-logs\") pod \"nova-metadata-0\" (UID: \"21278307-5a9e-4e08-92ef-c542da277f23\") " pod="openstack/nova-metadata-0" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.262561 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cf4lb\" (UniqueName: \"kubernetes.io/projected/eb00a64b-377d-450c-81df-408b0790b21f-kube-api-access-cf4lb\") pod \"nova-cell1-novncproxy-0\" (UID: \"eb00a64b-377d-450c-81df-408b0790b21f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 15:38:18 crc 
kubenswrapper[4719]: I1009 15:38:18.262579 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkkmc\" (UniqueName: \"kubernetes.io/projected/aa17763b-47cf-4305-a42e-4f43fa08e189-kube-api-access-pkkmc\") pod \"dnsmasq-dns-58b454788c-qkz7q\" (UID: \"aa17763b-47cf-4305-a42e-4f43fa08e189\") " pod="openstack/dnsmasq-dns-58b454788c-qkz7q" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.264295 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21278307-5a9e-4e08-92ef-c542da277f23-logs\") pod \"nova-metadata-0\" (UID: \"21278307-5a9e-4e08-92ef-c542da277f23\") " pod="openstack/nova-metadata-0" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.269499 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb00a64b-377d-450c-81df-408b0790b21f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"eb00a64b-377d-450c-81df-408b0790b21f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.269546 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21278307-5a9e-4e08-92ef-c542da277f23-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"21278307-5a9e-4e08-92ef-c542da277f23\") " pod="openstack/nova-metadata-0" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.282192 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21278307-5a9e-4e08-92ef-c542da277f23-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"21278307-5a9e-4e08-92ef-c542da277f23\") " pod="openstack/nova-metadata-0" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.282234 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/eb00a64b-377d-450c-81df-408b0790b21f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"eb00a64b-377d-450c-81df-408b0790b21f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.283644 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21278307-5a9e-4e08-92ef-c542da277f23-config-data\") pod \"nova-metadata-0\" (UID: \"21278307-5a9e-4e08-92ef-c542da277f23\") " pod="openstack/nova-metadata-0" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.290040 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4hxq\" (UniqueName: \"kubernetes.io/projected/21278307-5a9e-4e08-92ef-c542da277f23-kube-api-access-s4hxq\") pod \"nova-metadata-0\" (UID: \"21278307-5a9e-4e08-92ef-c542da277f23\") " pod="openstack/nova-metadata-0" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.290822 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb00a64b-377d-450c-81df-408b0790b21f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"eb00a64b-377d-450c-81df-408b0790b21f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.291176 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf4lb\" (UniqueName: \"kubernetes.io/projected/eb00a64b-377d-450c-81df-408b0790b21f-kube-api-access-cf4lb\") pod \"nova-cell1-novncproxy-0\" (UID: \"eb00a64b-377d-450c-81df-408b0790b21f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.371017 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa17763b-47cf-4305-a42e-4f43fa08e189-dns-swift-storage-0\") pod \"dnsmasq-dns-58b454788c-qkz7q\" (UID: 
\"aa17763b-47cf-4305-a42e-4f43fa08e189\") " pod="openstack/dnsmasq-dns-58b454788c-qkz7q" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.371110 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkkmc\" (UniqueName: \"kubernetes.io/projected/aa17763b-47cf-4305-a42e-4f43fa08e189-kube-api-access-pkkmc\") pod \"dnsmasq-dns-58b454788c-qkz7q\" (UID: \"aa17763b-47cf-4305-a42e-4f43fa08e189\") " pod="openstack/dnsmasq-dns-58b454788c-qkz7q" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.371210 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa17763b-47cf-4305-a42e-4f43fa08e189-ovsdbserver-sb\") pod \"dnsmasq-dns-58b454788c-qkz7q\" (UID: \"aa17763b-47cf-4305-a42e-4f43fa08e189\") " pod="openstack/dnsmasq-dns-58b454788c-qkz7q" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.371244 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa17763b-47cf-4305-a42e-4f43fa08e189-dns-svc\") pod \"dnsmasq-dns-58b454788c-qkz7q\" (UID: \"aa17763b-47cf-4305-a42e-4f43fa08e189\") " pod="openstack/dnsmasq-dns-58b454788c-qkz7q" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.371265 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa17763b-47cf-4305-a42e-4f43fa08e189-config\") pod \"dnsmasq-dns-58b454788c-qkz7q\" (UID: \"aa17763b-47cf-4305-a42e-4f43fa08e189\") " pod="openstack/dnsmasq-dns-58b454788c-qkz7q" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.371280 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa17763b-47cf-4305-a42e-4f43fa08e189-ovsdbserver-nb\") pod \"dnsmasq-dns-58b454788c-qkz7q\" (UID: \"aa17763b-47cf-4305-a42e-4f43fa08e189\") " 
pod="openstack/dnsmasq-dns-58b454788c-qkz7q" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.372629 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa17763b-47cf-4305-a42e-4f43fa08e189-dns-swift-storage-0\") pod \"dnsmasq-dns-58b454788c-qkz7q\" (UID: \"aa17763b-47cf-4305-a42e-4f43fa08e189\") " pod="openstack/dnsmasq-dns-58b454788c-qkz7q" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.375539 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa17763b-47cf-4305-a42e-4f43fa08e189-ovsdbserver-sb\") pod \"dnsmasq-dns-58b454788c-qkz7q\" (UID: \"aa17763b-47cf-4305-a42e-4f43fa08e189\") " pod="openstack/dnsmasq-dns-58b454788c-qkz7q" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.376472 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa17763b-47cf-4305-a42e-4f43fa08e189-dns-svc\") pod \"dnsmasq-dns-58b454788c-qkz7q\" (UID: \"aa17763b-47cf-4305-a42e-4f43fa08e189\") " pod="openstack/dnsmasq-dns-58b454788c-qkz7q" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.381163 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa17763b-47cf-4305-a42e-4f43fa08e189-config\") pod \"dnsmasq-dns-58b454788c-qkz7q\" (UID: \"aa17763b-47cf-4305-a42e-4f43fa08e189\") " pod="openstack/dnsmasq-dns-58b454788c-qkz7q" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.381396 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa17763b-47cf-4305-a42e-4f43fa08e189-ovsdbserver-nb\") pod \"dnsmasq-dns-58b454788c-qkz7q\" (UID: \"aa17763b-47cf-4305-a42e-4f43fa08e189\") " pod="openstack/dnsmasq-dns-58b454788c-qkz7q" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.397327 4719 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkkmc\" (UniqueName: \"kubernetes.io/projected/aa17763b-47cf-4305-a42e-4f43fa08e189-kube-api-access-pkkmc\") pod \"dnsmasq-dns-58b454788c-qkz7q\" (UID: \"aa17763b-47cf-4305-a42e-4f43fa08e189\") " pod="openstack/dnsmasq-dns-58b454788c-qkz7q" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.564385 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.576810 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-9pqdd"] Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.589796 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.598598 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58b454788c-qkz7q" Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.611011 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9pqdd" event={"ID":"04e99f7a-bb5e-41c0-a55a-02b671a69ad8","Type":"ContainerStarted","Data":"44bf0082f6ddd2fdbd9d9410befc95e8dad19acfcf8d0d9b49e4a0eef826f136"} Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.745989 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.786925 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 15:38:18 crc kubenswrapper[4719]: I1009 15:38:18.938469 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 15:38:19 crc kubenswrapper[4719]: I1009 15:38:19.068017 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bfqtw"] Oct 09 15:38:19 crc kubenswrapper[4719]: 
I1009 15:38:19.075941 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bfqtw" Oct 09 15:38:19 crc kubenswrapper[4719]: I1009 15:38:19.086175 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 09 15:38:19 crc kubenswrapper[4719]: I1009 15:38:19.086244 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 09 15:38:19 crc kubenswrapper[4719]: I1009 15:38:19.091425 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bfqtw"] Oct 09 15:38:19 crc kubenswrapper[4719]: I1009 15:38:19.235319 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4369d7cd-4d06-428f-aa0e-fd0a1ed300ac" path="/var/lib/kubelet/pods/4369d7cd-4d06-428f-aa0e-fd0a1ed300ac/volumes" Oct 09 15:38:19 crc kubenswrapper[4719]: I1009 15:38:19.305179 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lss7f\" (UniqueName: \"kubernetes.io/projected/2e7ae899-fa79-4024-a512-6a7648d7fd6a-kube-api-access-lss7f\") pod \"nova-cell1-conductor-db-sync-bfqtw\" (UID: \"2e7ae899-fa79-4024-a512-6a7648d7fd6a\") " pod="openstack/nova-cell1-conductor-db-sync-bfqtw" Oct 09 15:38:19 crc kubenswrapper[4719]: I1009 15:38:19.305318 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e7ae899-fa79-4024-a512-6a7648d7fd6a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bfqtw\" (UID: \"2e7ae899-fa79-4024-a512-6a7648d7fd6a\") " pod="openstack/nova-cell1-conductor-db-sync-bfqtw" Oct 09 15:38:19 crc kubenswrapper[4719]: I1009 15:38:19.305342 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2e7ae899-fa79-4024-a512-6a7648d7fd6a-config-data\") pod \"nova-cell1-conductor-db-sync-bfqtw\" (UID: \"2e7ae899-fa79-4024-a512-6a7648d7fd6a\") " pod="openstack/nova-cell1-conductor-db-sync-bfqtw" Oct 09 15:38:19 crc kubenswrapper[4719]: I1009 15:38:19.305420 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e7ae899-fa79-4024-a512-6a7648d7fd6a-scripts\") pod \"nova-cell1-conductor-db-sync-bfqtw\" (UID: \"2e7ae899-fa79-4024-a512-6a7648d7fd6a\") " pod="openstack/nova-cell1-conductor-db-sync-bfqtw" Oct 09 15:38:19 crc kubenswrapper[4719]: W1009 15:38:19.317472 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21278307_5a9e_4e08_92ef_c542da277f23.slice/crio-459c970de7830a03005b97cbffec325ae7a0361be5b6757b22e14b2b43518df3 WatchSource:0}: Error finding container 459c970de7830a03005b97cbffec325ae7a0361be5b6757b22e14b2b43518df3: Status 404 returned error can't find the container with id 459c970de7830a03005b97cbffec325ae7a0361be5b6757b22e14b2b43518df3 Oct 09 15:38:19 crc kubenswrapper[4719]: I1009 15:38:19.333948 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 09 15:38:19 crc kubenswrapper[4719]: I1009 15:38:19.352272 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 15:38:19 crc kubenswrapper[4719]: I1009 15:38:19.408565 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e7ae899-fa79-4024-a512-6a7648d7fd6a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bfqtw\" (UID: \"2e7ae899-fa79-4024-a512-6a7648d7fd6a\") " pod="openstack/nova-cell1-conductor-db-sync-bfqtw" Oct 09 15:38:19 crc kubenswrapper[4719]: I1009 15:38:19.408616 4719 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e7ae899-fa79-4024-a512-6a7648d7fd6a-config-data\") pod \"nova-cell1-conductor-db-sync-bfqtw\" (UID: \"2e7ae899-fa79-4024-a512-6a7648d7fd6a\") " pod="openstack/nova-cell1-conductor-db-sync-bfqtw" Oct 09 15:38:19 crc kubenswrapper[4719]: I1009 15:38:19.408682 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e7ae899-fa79-4024-a512-6a7648d7fd6a-scripts\") pod \"nova-cell1-conductor-db-sync-bfqtw\" (UID: \"2e7ae899-fa79-4024-a512-6a7648d7fd6a\") " pod="openstack/nova-cell1-conductor-db-sync-bfqtw" Oct 09 15:38:19 crc kubenswrapper[4719]: I1009 15:38:19.408781 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lss7f\" (UniqueName: \"kubernetes.io/projected/2e7ae899-fa79-4024-a512-6a7648d7fd6a-kube-api-access-lss7f\") pod \"nova-cell1-conductor-db-sync-bfqtw\" (UID: \"2e7ae899-fa79-4024-a512-6a7648d7fd6a\") " pod="openstack/nova-cell1-conductor-db-sync-bfqtw" Oct 09 15:38:19 crc kubenswrapper[4719]: I1009 15:38:19.420599 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e7ae899-fa79-4024-a512-6a7648d7fd6a-config-data\") pod \"nova-cell1-conductor-db-sync-bfqtw\" (UID: \"2e7ae899-fa79-4024-a512-6a7648d7fd6a\") " pod="openstack/nova-cell1-conductor-db-sync-bfqtw" Oct 09 15:38:19 crc kubenswrapper[4719]: I1009 15:38:19.432367 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lss7f\" (UniqueName: \"kubernetes.io/projected/2e7ae899-fa79-4024-a512-6a7648d7fd6a-kube-api-access-lss7f\") pod \"nova-cell1-conductor-db-sync-bfqtw\" (UID: \"2e7ae899-fa79-4024-a512-6a7648d7fd6a\") " pod="openstack/nova-cell1-conductor-db-sync-bfqtw" Oct 09 15:38:19 crc kubenswrapper[4719]: I1009 15:38:19.434395 4719 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e7ae899-fa79-4024-a512-6a7648d7fd6a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bfqtw\" (UID: \"2e7ae899-fa79-4024-a512-6a7648d7fd6a\") " pod="openstack/nova-cell1-conductor-db-sync-bfqtw" Oct 09 15:38:19 crc kubenswrapper[4719]: I1009 15:38:19.436011 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e7ae899-fa79-4024-a512-6a7648d7fd6a-scripts\") pod \"nova-cell1-conductor-db-sync-bfqtw\" (UID: \"2e7ae899-fa79-4024-a512-6a7648d7fd6a\") " pod="openstack/nova-cell1-conductor-db-sync-bfqtw" Oct 09 15:38:19 crc kubenswrapper[4719]: I1009 15:38:19.532933 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bfqtw" Oct 09 15:38:19 crc kubenswrapper[4719]: I1009 15:38:19.586405 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58b454788c-qkz7q"] Oct 09 15:38:19 crc kubenswrapper[4719]: W1009 15:38:19.599827 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa17763b_47cf_4305_a42e_4f43fa08e189.slice/crio-435dc4f0c0b7855343d5257551ce797d9fae620a87a79f9c0709ece61a6b4abf WatchSource:0}: Error finding container 435dc4f0c0b7855343d5257551ce797d9fae620a87a79f9c0709ece61a6b4abf: Status 404 returned error can't find the container with id 435dc4f0c0b7855343d5257551ce797d9fae620a87a79f9c0709ece61a6b4abf Oct 09 15:38:19 crc kubenswrapper[4719]: I1009 15:38:19.659720 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b454788c-qkz7q" event={"ID":"aa17763b-47cf-4305-a42e-4f43fa08e189","Type":"ContainerStarted","Data":"435dc4f0c0b7855343d5257551ce797d9fae620a87a79f9c0709ece61a6b4abf"} Oct 09 15:38:19 crc kubenswrapper[4719]: I1009 15:38:19.660671 4719 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-api-0" event={"ID":"45da18fe-c307-40ff-8a91-c86ba32c516d","Type":"ContainerStarted","Data":"13c844d2ad4e758817c0fe19ff41898a4586b9b2a8e12c8ebca6a4629ad693aa"} Oct 09 15:38:19 crc kubenswrapper[4719]: I1009 15:38:19.671400 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"21278307-5a9e-4e08-92ef-c542da277f23","Type":"ContainerStarted","Data":"459c970de7830a03005b97cbffec325ae7a0361be5b6757b22e14b2b43518df3"} Oct 09 15:38:19 crc kubenswrapper[4719]: I1009 15:38:19.700210 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9pqdd" event={"ID":"04e99f7a-bb5e-41c0-a55a-02b671a69ad8","Type":"ContainerStarted","Data":"5922ebc8350beb92166ff13c09d24c6d1827a56668004ecfe76dccc58e0473c4"} Oct 09 15:38:19 crc kubenswrapper[4719]: I1009 15:38:19.720590 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fee8c430-e8cc-47a8-8ec3-4594e0500a4f","Type":"ContainerStarted","Data":"0ca752f031ea6c3a75bc90752442b9d61eff4aa50fa837891ab74762dedb8395"} Oct 09 15:38:19 crc kubenswrapper[4719]: I1009 15:38:19.735610 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-9pqdd" podStartSLOduration=2.735581672 podStartE2EDuration="2.735581672s" podCreationTimestamp="2025-10-09 15:38:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:38:19.727517875 +0000 UTC m=+1205.237229180" watchObservedRunningTime="2025-10-09 15:38:19.735581672 +0000 UTC m=+1205.245292957" Oct 09 15:38:19 crc kubenswrapper[4719]: I1009 15:38:19.770103 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43123f9f-e7f7-4d3a-8856-18d324f55254","Type":"ContainerStarted","Data":"1a77d63c6cc6dce02e2f31d05cf6a08df242847bdb12cb57cb976fc66a6b931b"} Oct 09 15:38:19 crc 
kubenswrapper[4719]: I1009 15:38:19.770545 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43123f9f-e7f7-4d3a-8856-18d324f55254","Type":"ContainerStarted","Data":"13683196bbc089128ffba8f7c3a522318bfacba03a462aa20a60501f2af95675"} Oct 09 15:38:19 crc kubenswrapper[4719]: I1009 15:38:19.774699 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"eb00a64b-377d-450c-81df-408b0790b21f","Type":"ContainerStarted","Data":"da767d8723596c6f0975a6248ab155721be50db2f13ee34758a1f0399b909010"} Oct 09 15:38:20 crc kubenswrapper[4719]: I1009 15:38:20.149344 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bfqtw"] Oct 09 15:38:20 crc kubenswrapper[4719]: W1009 15:38:20.186716 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e7ae899_fa79_4024_a512_6a7648d7fd6a.slice/crio-73936239a3e6201f63dfe553b4abab9e05a110219312981d2931645e7b66fc05 WatchSource:0}: Error finding container 73936239a3e6201f63dfe553b4abab9e05a110219312981d2931645e7b66fc05: Status 404 returned error can't find the container with id 73936239a3e6201f63dfe553b4abab9e05a110219312981d2931645e7b66fc05 Oct 09 15:38:20 crc kubenswrapper[4719]: I1009 15:38:20.819918 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43123f9f-e7f7-4d3a-8856-18d324f55254","Type":"ContainerStarted","Data":"ccd3ef7533995ff84e77db8707b05438e1cd30d5f79a8ec98eeeda690097c829"} Oct 09 15:38:20 crc kubenswrapper[4719]: I1009 15:38:20.828436 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bfqtw" event={"ID":"2e7ae899-fa79-4024-a512-6a7648d7fd6a","Type":"ContainerStarted","Data":"53b22611344ea6f695a15115e5a926986cee107c3833356c129aded91a400f7f"} Oct 09 15:38:20 crc kubenswrapper[4719]: I1009 15:38:20.828477 4719 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bfqtw" event={"ID":"2e7ae899-fa79-4024-a512-6a7648d7fd6a","Type":"ContainerStarted","Data":"73936239a3e6201f63dfe553b4abab9e05a110219312981d2931645e7b66fc05"} Oct 09 15:38:20 crc kubenswrapper[4719]: I1009 15:38:20.840312 4719 generic.go:334] "Generic (PLEG): container finished" podID="aa17763b-47cf-4305-a42e-4f43fa08e189" containerID="4f30eeb640cf2e01c51d40c18b2e0cae635f32af4dedfa28679ca4200968a0b1" exitCode=0 Oct 09 15:38:20 crc kubenswrapper[4719]: I1009 15:38:20.841902 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b454788c-qkz7q" event={"ID":"aa17763b-47cf-4305-a42e-4f43fa08e189","Type":"ContainerDied","Data":"4f30eeb640cf2e01c51d40c18b2e0cae635f32af4dedfa28679ca4200968a0b1"} Oct 09 15:38:20 crc kubenswrapper[4719]: I1009 15:38:20.883550 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-bfqtw" podStartSLOduration=1.8835253600000001 podStartE2EDuration="1.88352536s" podCreationTimestamp="2025-10-09 15:38:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:38:20.848551152 +0000 UTC m=+1206.358262437" watchObservedRunningTime="2025-10-09 15:38:20.88352536 +0000 UTC m=+1206.393236645" Oct 09 15:38:21 crc kubenswrapper[4719]: I1009 15:38:21.362171 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 15:38:21 crc kubenswrapper[4719]: I1009 15:38:21.388350 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 09 15:38:21 crc kubenswrapper[4719]: I1009 15:38:21.876666 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"43123f9f-e7f7-4d3a-8856-18d324f55254","Type":"ContainerStarted","Data":"71eccb3d761539feb14aee5d5a1a2bb43c9087261b73c09d77e0970f1eab8b52"} Oct 09 15:38:21 crc kubenswrapper[4719]: I1009 15:38:21.900733 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b454788c-qkz7q" event={"ID":"aa17763b-47cf-4305-a42e-4f43fa08e189","Type":"ContainerStarted","Data":"71045742e21a3be979a2a83e281e587df446cbee999bbdead78cb9ad7112f296"} Oct 09 15:38:21 crc kubenswrapper[4719]: I1009 15:38:21.901256 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58b454788c-qkz7q" Oct 09 15:38:21 crc kubenswrapper[4719]: I1009 15:38:21.938258 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58b454788c-qkz7q" podStartSLOduration=3.9382431479999997 podStartE2EDuration="3.938243148s" podCreationTimestamp="2025-10-09 15:38:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:38:21.934947303 +0000 UTC m=+1207.444658588" watchObservedRunningTime="2025-10-09 15:38:21.938243148 +0000 UTC m=+1207.447954443" Oct 09 15:38:23 crc kubenswrapper[4719]: I1009 15:38:23.953236 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 09 15:38:24 crc kubenswrapper[4719]: I1009 15:38:24.948382 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fee8c430-e8cc-47a8-8ec3-4594e0500a4f","Type":"ContainerStarted","Data":"1538201b0a110728260c6210d70d6a16c704d9a2341a2279330f4860bb0caabf"} Oct 09 15:38:24 crc kubenswrapper[4719]: I1009 15:38:24.951123 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43123f9f-e7f7-4d3a-8856-18d324f55254","Type":"ContainerStarted","Data":"e74d2203f8861cc51352ee49abbd21fbe35685c4741856ce50c7ade9ba91c969"} 
Oct 09 15:38:24 crc kubenswrapper[4719]: I1009 15:38:24.951257 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 09 15:38:24 crc kubenswrapper[4719]: I1009 15:38:24.953045 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"eb00a64b-377d-450c-81df-408b0790b21f","Type":"ContainerStarted","Data":"9f4263077898c99175651f39fa6fa5bf6e0de796ee696a530c40b8c9b7f37380"} Oct 09 15:38:24 crc kubenswrapper[4719]: I1009 15:38:24.953272 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="eb00a64b-377d-450c-81df-408b0790b21f" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://9f4263077898c99175651f39fa6fa5bf6e0de796ee696a530c40b8c9b7f37380" gracePeriod=30 Oct 09 15:38:24 crc kubenswrapper[4719]: I1009 15:38:24.973197 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"45da18fe-c307-40ff-8a91-c86ba32c516d","Type":"ContainerStarted","Data":"1940216b7a6b1e443f7eddc56eca44a07a08dad184661ad4a7ebc644e5318ffd"} Oct 09 15:38:24 crc kubenswrapper[4719]: I1009 15:38:24.973256 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"45da18fe-c307-40ff-8a91-c86ba32c516d","Type":"ContainerStarted","Data":"53a0cd3d06859214603d7e8e6f0377e80471d431dc3e8a7ff2dd4ca28a46a880"} Oct 09 15:38:24 crc kubenswrapper[4719]: I1009 15:38:24.977472 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.06113752 podStartE2EDuration="7.977449504s" podCreationTimestamp="2025-10-09 15:38:17 +0000 UTC" firstStartedPulling="2025-10-09 15:38:18.981144164 +0000 UTC m=+1204.490855449" lastFinishedPulling="2025-10-09 15:38:23.897456148 +0000 UTC m=+1209.407167433" observedRunningTime="2025-10-09 15:38:24.973280992 +0000 UTC m=+1210.482992277" 
watchObservedRunningTime="2025-10-09 15:38:24.977449504 +0000 UTC m=+1210.487160789" Oct 09 15:38:24 crc kubenswrapper[4719]: I1009 15:38:24.978946 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"21278307-5a9e-4e08-92ef-c542da277f23","Type":"ContainerStarted","Data":"fe036faec1168d63b99ee7c9da0117b4cfefc47e3dfe304584c94cb2f3ef828c"} Oct 09 15:38:24 crc kubenswrapper[4719]: I1009 15:38:24.978986 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"21278307-5a9e-4e08-92ef-c542da277f23","Type":"ContainerStarted","Data":"d1ae465431419c4b408d4b934d1f179ed83049a463947c8d966c2e403836d160"} Oct 09 15:38:24 crc kubenswrapper[4719]: I1009 15:38:24.979111 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="21278307-5a9e-4e08-92ef-c542da277f23" containerName="nova-metadata-log" containerID="cri-o://d1ae465431419c4b408d4b934d1f179ed83049a463947c8d966c2e403836d160" gracePeriod=30 Oct 09 15:38:24 crc kubenswrapper[4719]: I1009 15:38:24.979221 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="21278307-5a9e-4e08-92ef-c542da277f23" containerName="nova-metadata-metadata" containerID="cri-o://fe036faec1168d63b99ee7c9da0117b4cfefc47e3dfe304584c94cb2f3ef828c" gracePeriod=30 Oct 09 15:38:25 crc kubenswrapper[4719]: I1009 15:38:25.011128 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.902092841 podStartE2EDuration="8.01110457s" podCreationTimestamp="2025-10-09 15:38:17 +0000 UTC" firstStartedPulling="2025-10-09 15:38:18.842030551 +0000 UTC m=+1204.351741836" lastFinishedPulling="2025-10-09 15:38:23.95104228 +0000 UTC m=+1209.460753565" observedRunningTime="2025-10-09 15:38:24.991153943 +0000 UTC m=+1210.500865248" watchObservedRunningTime="2025-10-09 15:38:25.01110457 +0000 UTC 
m=+1210.520815855" Oct 09 15:38:25 crc kubenswrapper[4719]: I1009 15:38:25.020256 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.880538673 podStartE2EDuration="8.020233901s" podCreationTimestamp="2025-10-09 15:38:17 +0000 UTC" firstStartedPulling="2025-10-09 15:38:18.776067175 +0000 UTC m=+1204.285778460" lastFinishedPulling="2025-10-09 15:38:23.915762413 +0000 UTC m=+1209.425473688" observedRunningTime="2025-10-09 15:38:25.018831277 +0000 UTC m=+1210.528542572" watchObservedRunningTime="2025-10-09 15:38:25.020233901 +0000 UTC m=+1210.529945206" Oct 09 15:38:25 crc kubenswrapper[4719]: I1009 15:38:25.045777 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.485198687 podStartE2EDuration="8.045754847s" podCreationTimestamp="2025-10-09 15:38:17 +0000 UTC" firstStartedPulling="2025-10-09 15:38:19.331734163 +0000 UTC m=+1204.841445458" lastFinishedPulling="2025-10-09 15:38:23.892290333 +0000 UTC m=+1209.402001618" observedRunningTime="2025-10-09 15:38:25.036582913 +0000 UTC m=+1210.546294208" watchObservedRunningTime="2025-10-09 15:38:25.045754847 +0000 UTC m=+1210.555466132" Oct 09 15:38:25 crc kubenswrapper[4719]: I1009 15:38:25.062806 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.471900421 podStartE2EDuration="8.062780501s" podCreationTimestamp="2025-10-09 15:38:17 +0000 UTC" firstStartedPulling="2025-10-09 15:38:19.324871943 +0000 UTC m=+1204.834583228" lastFinishedPulling="2025-10-09 15:38:23.915752023 +0000 UTC m=+1209.425463308" observedRunningTime="2025-10-09 15:38:25.052782872 +0000 UTC m=+1210.562494157" watchObservedRunningTime="2025-10-09 15:38:25.062780501 +0000 UTC m=+1210.572491796" Oct 09 15:38:25 crc kubenswrapper[4719]: I1009 15:38:25.734287 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 15:38:25 crc kubenswrapper[4719]: I1009 15:38:25.779231 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21278307-5a9e-4e08-92ef-c542da277f23-config-data\") pod \"21278307-5a9e-4e08-92ef-c542da277f23\" (UID: \"21278307-5a9e-4e08-92ef-c542da277f23\") " Oct 09 15:38:25 crc kubenswrapper[4719]: I1009 15:38:25.779789 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4hxq\" (UniqueName: \"kubernetes.io/projected/21278307-5a9e-4e08-92ef-c542da277f23-kube-api-access-s4hxq\") pod \"21278307-5a9e-4e08-92ef-c542da277f23\" (UID: \"21278307-5a9e-4e08-92ef-c542da277f23\") " Oct 09 15:38:25 crc kubenswrapper[4719]: I1009 15:38:25.779861 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21278307-5a9e-4e08-92ef-c542da277f23-combined-ca-bundle\") pod \"21278307-5a9e-4e08-92ef-c542da277f23\" (UID: \"21278307-5a9e-4e08-92ef-c542da277f23\") " Oct 09 15:38:25 crc kubenswrapper[4719]: I1009 15:38:25.779994 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21278307-5a9e-4e08-92ef-c542da277f23-logs\") pod \"21278307-5a9e-4e08-92ef-c542da277f23\" (UID: \"21278307-5a9e-4e08-92ef-c542da277f23\") " Oct 09 15:38:25 crc kubenswrapper[4719]: I1009 15:38:25.780378 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21278307-5a9e-4e08-92ef-c542da277f23-logs" (OuterVolumeSpecName: "logs") pod "21278307-5a9e-4e08-92ef-c542da277f23" (UID: "21278307-5a9e-4e08-92ef-c542da277f23"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:38:25 crc kubenswrapper[4719]: I1009 15:38:25.780802 4719 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21278307-5a9e-4e08-92ef-c542da277f23-logs\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:25 crc kubenswrapper[4719]: I1009 15:38:25.796771 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21278307-5a9e-4e08-92ef-c542da277f23-kube-api-access-s4hxq" (OuterVolumeSpecName: "kube-api-access-s4hxq") pod "21278307-5a9e-4e08-92ef-c542da277f23" (UID: "21278307-5a9e-4e08-92ef-c542da277f23"). InnerVolumeSpecName "kube-api-access-s4hxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:38:25 crc kubenswrapper[4719]: I1009 15:38:25.828724 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21278307-5a9e-4e08-92ef-c542da277f23-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21278307-5a9e-4e08-92ef-c542da277f23" (UID: "21278307-5a9e-4e08-92ef-c542da277f23"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:38:25 crc kubenswrapper[4719]: I1009 15:38:25.881933 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21278307-5a9e-4e08-92ef-c542da277f23-config-data" (OuterVolumeSpecName: "config-data") pod "21278307-5a9e-4e08-92ef-c542da277f23" (UID: "21278307-5a9e-4e08-92ef-c542da277f23"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:38:25 crc kubenswrapper[4719]: I1009 15:38:25.882041 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21278307-5a9e-4e08-92ef-c542da277f23-config-data\") pod \"21278307-5a9e-4e08-92ef-c542da277f23\" (UID: \"21278307-5a9e-4e08-92ef-c542da277f23\") " Oct 09 15:38:25 crc kubenswrapper[4719]: I1009 15:38:25.882889 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4hxq\" (UniqueName: \"kubernetes.io/projected/21278307-5a9e-4e08-92ef-c542da277f23-kube-api-access-s4hxq\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:25 crc kubenswrapper[4719]: I1009 15:38:25.882917 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21278307-5a9e-4e08-92ef-c542da277f23-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:25 crc kubenswrapper[4719]: W1009 15:38:25.883013 4719 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/21278307-5a9e-4e08-92ef-c542da277f23/volumes/kubernetes.io~secret/config-data Oct 09 15:38:25 crc kubenswrapper[4719]: I1009 15:38:25.883030 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21278307-5a9e-4e08-92ef-c542da277f23-config-data" (OuterVolumeSpecName: "config-data") pod "21278307-5a9e-4e08-92ef-c542da277f23" (UID: "21278307-5a9e-4e08-92ef-c542da277f23"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:38:25 crc kubenswrapper[4719]: I1009 15:38:25.986818 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21278307-5a9e-4e08-92ef-c542da277f23-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:25.999978 4719 generic.go:334] "Generic (PLEG): container finished" podID="21278307-5a9e-4e08-92ef-c542da277f23" containerID="fe036faec1168d63b99ee7c9da0117b4cfefc47e3dfe304584c94cb2f3ef828c" exitCode=0 Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.000021 4719 generic.go:334] "Generic (PLEG): container finished" podID="21278307-5a9e-4e08-92ef-c542da277f23" containerID="d1ae465431419c4b408d4b934d1f179ed83049a463947c8d966c2e403836d160" exitCode=143 Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.000329 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"21278307-5a9e-4e08-92ef-c542da277f23","Type":"ContainerDied","Data":"fe036faec1168d63b99ee7c9da0117b4cfefc47e3dfe304584c94cb2f3ef828c"} Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.000461 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"21278307-5a9e-4e08-92ef-c542da277f23","Type":"ContainerDied","Data":"d1ae465431419c4b408d4b934d1f179ed83049a463947c8d966c2e403836d160"} Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.000521 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"21278307-5a9e-4e08-92ef-c542da277f23","Type":"ContainerDied","Data":"459c970de7830a03005b97cbffec325ae7a0361be5b6757b22e14b2b43518df3"} Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.000603 4719 scope.go:117] "RemoveContainer" containerID="fe036faec1168d63b99ee7c9da0117b4cfefc47e3dfe304584c94cb2f3ef828c" Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.000851 4719 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.026134 4719 scope.go:117] "RemoveContainer" containerID="d1ae465431419c4b408d4b934d1f179ed83049a463947c8d966c2e403836d160" Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.050463 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.056386 4719 scope.go:117] "RemoveContainer" containerID="fe036faec1168d63b99ee7c9da0117b4cfefc47e3dfe304584c94cb2f3ef828c" Oct 09 15:38:26 crc kubenswrapper[4719]: E1009 15:38:26.059862 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe036faec1168d63b99ee7c9da0117b4cfefc47e3dfe304584c94cb2f3ef828c\": container with ID starting with fe036faec1168d63b99ee7c9da0117b4cfefc47e3dfe304584c94cb2f3ef828c not found: ID does not exist" containerID="fe036faec1168d63b99ee7c9da0117b4cfefc47e3dfe304584c94cb2f3ef828c" Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.059906 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe036faec1168d63b99ee7c9da0117b4cfefc47e3dfe304584c94cb2f3ef828c"} err="failed to get container status \"fe036faec1168d63b99ee7c9da0117b4cfefc47e3dfe304584c94cb2f3ef828c\": rpc error: code = NotFound desc = could not find container \"fe036faec1168d63b99ee7c9da0117b4cfefc47e3dfe304584c94cb2f3ef828c\": container with ID starting with fe036faec1168d63b99ee7c9da0117b4cfefc47e3dfe304584c94cb2f3ef828c not found: ID does not exist" Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.059929 4719 scope.go:117] "RemoveContainer" containerID="d1ae465431419c4b408d4b934d1f179ed83049a463947c8d966c2e403836d160" Oct 09 15:38:26 crc kubenswrapper[4719]: E1009 15:38:26.062452 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"d1ae465431419c4b408d4b934d1f179ed83049a463947c8d966c2e403836d160\": container with ID starting with d1ae465431419c4b408d4b934d1f179ed83049a463947c8d966c2e403836d160 not found: ID does not exist" containerID="d1ae465431419c4b408d4b934d1f179ed83049a463947c8d966c2e403836d160" Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.062488 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1ae465431419c4b408d4b934d1f179ed83049a463947c8d966c2e403836d160"} err="failed to get container status \"d1ae465431419c4b408d4b934d1f179ed83049a463947c8d966c2e403836d160\": rpc error: code = NotFound desc = could not find container \"d1ae465431419c4b408d4b934d1f179ed83049a463947c8d966c2e403836d160\": container with ID starting with d1ae465431419c4b408d4b934d1f179ed83049a463947c8d966c2e403836d160 not found: ID does not exist" Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.062516 4719 scope.go:117] "RemoveContainer" containerID="fe036faec1168d63b99ee7c9da0117b4cfefc47e3dfe304584c94cb2f3ef828c" Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.066512 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.066740 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe036faec1168d63b99ee7c9da0117b4cfefc47e3dfe304584c94cb2f3ef828c"} err="failed to get container status \"fe036faec1168d63b99ee7c9da0117b4cfefc47e3dfe304584c94cb2f3ef828c\": rpc error: code = NotFound desc = could not find container \"fe036faec1168d63b99ee7c9da0117b4cfefc47e3dfe304584c94cb2f3ef828c\": container with ID starting with fe036faec1168d63b99ee7c9da0117b4cfefc47e3dfe304584c94cb2f3ef828c not found: ID does not exist" Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.066856 4719 scope.go:117] "RemoveContainer" containerID="d1ae465431419c4b408d4b934d1f179ed83049a463947c8d966c2e403836d160" Oct 09 
15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.067287 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1ae465431419c4b408d4b934d1f179ed83049a463947c8d966c2e403836d160"} err="failed to get container status \"d1ae465431419c4b408d4b934d1f179ed83049a463947c8d966c2e403836d160\": rpc error: code = NotFound desc = could not find container \"d1ae465431419c4b408d4b934d1f179ed83049a463947c8d966c2e403836d160\": container with ID starting with d1ae465431419c4b408d4b934d1f179ed83049a463947c8d966c2e403836d160 not found: ID does not exist" Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.080999 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 09 15:38:26 crc kubenswrapper[4719]: E1009 15:38:26.081695 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21278307-5a9e-4e08-92ef-c542da277f23" containerName="nova-metadata-metadata" Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.081783 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="21278307-5a9e-4e08-92ef-c542da277f23" containerName="nova-metadata-metadata" Oct 09 15:38:26 crc kubenswrapper[4719]: E1009 15:38:26.081862 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21278307-5a9e-4e08-92ef-c542da277f23" containerName="nova-metadata-log" Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.081934 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="21278307-5a9e-4e08-92ef-c542da277f23" containerName="nova-metadata-log" Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.082216 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="21278307-5a9e-4e08-92ef-c542da277f23" containerName="nova-metadata-log" Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.082301 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="21278307-5a9e-4e08-92ef-c542da277f23" containerName="nova-metadata-metadata" Oct 09 15:38:26 crc 
kubenswrapper[4719]: I1009 15:38:26.083474 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.090414 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.094190 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.096068 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.190730 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0\") " pod="openstack/nova-metadata-0" Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.190861 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qv75\" (UniqueName: \"kubernetes.io/projected/77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0-kube-api-access-8qv75\") pod \"nova-metadata-0\" (UID: \"77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0\") " pod="openstack/nova-metadata-0" Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.190907 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0-logs\") pod \"nova-metadata-0\" (UID: \"77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0\") " pod="openstack/nova-metadata-0" Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.190924 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0-config-data\") pod \"nova-metadata-0\" (UID: \"77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0\") " pod="openstack/nova-metadata-0" Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.190948 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0\") " pod="openstack/nova-metadata-0" Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.292078 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0-logs\") pod \"nova-metadata-0\" (UID: \"77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0\") " pod="openstack/nova-metadata-0" Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.292121 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0-config-data\") pod \"nova-metadata-0\" (UID: \"77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0\") " pod="openstack/nova-metadata-0" Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.292158 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0\") " pod="openstack/nova-metadata-0" Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.292245 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0\") " 
pod="openstack/nova-metadata-0" Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.292369 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qv75\" (UniqueName: \"kubernetes.io/projected/77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0-kube-api-access-8qv75\") pod \"nova-metadata-0\" (UID: \"77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0\") " pod="openstack/nova-metadata-0" Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.292644 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0-logs\") pod \"nova-metadata-0\" (UID: \"77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0\") " pod="openstack/nova-metadata-0" Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.296036 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0\") " pod="openstack/nova-metadata-0" Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.296167 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0\") " pod="openstack/nova-metadata-0" Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.296871 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0-config-data\") pod \"nova-metadata-0\" (UID: \"77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0\") " pod="openstack/nova-metadata-0" Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.311981 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qv75\" (UniqueName: 
\"kubernetes.io/projected/77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0-kube-api-access-8qv75\") pod \"nova-metadata-0\" (UID: \"77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0\") " pod="openstack/nova-metadata-0" Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.414806 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 15:38:26 crc kubenswrapper[4719]: I1009 15:38:26.923232 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 15:38:27 crc kubenswrapper[4719]: I1009 15:38:27.013582 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0","Type":"ContainerStarted","Data":"3050d018a0cfb0fe999eddba0f3210a8d5ac9008e2acdd89e59270f9bf206019"} Oct 09 15:38:27 crc kubenswrapper[4719]: I1009 15:38:27.176965 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21278307-5a9e-4e08-92ef-c542da277f23" path="/var/lib/kubelet/pods/21278307-5a9e-4e08-92ef-c542da277f23/volumes" Oct 09 15:38:28 crc kubenswrapper[4719]: I1009 15:38:28.024971 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0","Type":"ContainerStarted","Data":"98c0cb9ce5a63804e16eb0d5836f16910e27ea620086bbedaad8f96eec006f7b"} Oct 09 15:38:28 crc kubenswrapper[4719]: I1009 15:38:28.025290 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0","Type":"ContainerStarted","Data":"aa213c549cc003ac52e3c26bab0c3ba0f9d52c6af9cdbcae00bf6f9e6a11647c"} Oct 09 15:38:28 crc kubenswrapper[4719]: I1009 15:38:28.045790 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.045769901 podStartE2EDuration="2.045769901s" podCreationTimestamp="2025-10-09 15:38:26 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:38:28.044961106 +0000 UTC m=+1213.554672411" watchObservedRunningTime="2025-10-09 15:38:28.045769901 +0000 UTC m=+1213.555481206" Oct 09 15:38:28 crc kubenswrapper[4719]: I1009 15:38:28.104965 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 09 15:38:28 crc kubenswrapper[4719]: I1009 15:38:28.105024 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 09 15:38:28 crc kubenswrapper[4719]: I1009 15:38:28.239012 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 09 15:38:28 crc kubenswrapper[4719]: I1009 15:38:28.240835 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 09 15:38:28 crc kubenswrapper[4719]: I1009 15:38:28.269985 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 09 15:38:28 crc kubenswrapper[4719]: I1009 15:38:28.564746 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 09 15:38:28 crc kubenswrapper[4719]: I1009 15:38:28.600526 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58b454788c-qkz7q" Oct 09 15:38:28 crc kubenswrapper[4719]: I1009 15:38:28.674150 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77b756999f-5ptkd"] Oct 09 15:38:28 crc kubenswrapper[4719]: I1009 15:38:28.674594 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77b756999f-5ptkd" podUID="f7540714-0deb-4ba6-8709-846457e19966" containerName="dnsmasq-dns" containerID="cri-o://583be44f9b958a7b642c9052de211535ca718085fd44ce13e329ad6434301c54" gracePeriod=10 Oct 09 15:38:29 crc kubenswrapper[4719]: I1009 
15:38:29.060146 4719 generic.go:334] "Generic (PLEG): container finished" podID="f7540714-0deb-4ba6-8709-846457e19966" containerID="583be44f9b958a7b642c9052de211535ca718085fd44ce13e329ad6434301c54" exitCode=0 Oct 09 15:38:29 crc kubenswrapper[4719]: I1009 15:38:29.060299 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77b756999f-5ptkd" event={"ID":"f7540714-0deb-4ba6-8709-846457e19966","Type":"ContainerDied","Data":"583be44f9b958a7b642c9052de211535ca718085fd44ce13e329ad6434301c54"} Oct 09 15:38:29 crc kubenswrapper[4719]: I1009 15:38:29.101579 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 09 15:38:29 crc kubenswrapper[4719]: I1009 15:38:29.191492 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="45da18fe-c307-40ff-8a91-c86ba32c516d" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.210:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 09 15:38:29 crc kubenswrapper[4719]: I1009 15:38:29.191582 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="45da18fe-c307-40ff-8a91-c86ba32c516d" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.210:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 09 15:38:29 crc kubenswrapper[4719]: I1009 15:38:29.249055 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77b756999f-5ptkd" Oct 09 15:38:29 crc kubenswrapper[4719]: I1009 15:38:29.359633 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7540714-0deb-4ba6-8709-846457e19966-ovsdbserver-sb\") pod \"f7540714-0deb-4ba6-8709-846457e19966\" (UID: \"f7540714-0deb-4ba6-8709-846457e19966\") " Oct 09 15:38:29 crc kubenswrapper[4719]: I1009 15:38:29.359793 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f7540714-0deb-4ba6-8709-846457e19966-dns-swift-storage-0\") pod \"f7540714-0deb-4ba6-8709-846457e19966\" (UID: \"f7540714-0deb-4ba6-8709-846457e19966\") " Oct 09 15:38:29 crc kubenswrapper[4719]: I1009 15:38:29.359859 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7540714-0deb-4ba6-8709-846457e19966-ovsdbserver-nb\") pod \"f7540714-0deb-4ba6-8709-846457e19966\" (UID: \"f7540714-0deb-4ba6-8709-846457e19966\") " Oct 09 15:38:29 crc kubenswrapper[4719]: I1009 15:38:29.359938 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-df5zw\" (UniqueName: \"kubernetes.io/projected/f7540714-0deb-4ba6-8709-846457e19966-kube-api-access-df5zw\") pod \"f7540714-0deb-4ba6-8709-846457e19966\" (UID: \"f7540714-0deb-4ba6-8709-846457e19966\") " Oct 09 15:38:29 crc kubenswrapper[4719]: I1009 15:38:29.359962 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7540714-0deb-4ba6-8709-846457e19966-dns-svc\") pod \"f7540714-0deb-4ba6-8709-846457e19966\" (UID: \"f7540714-0deb-4ba6-8709-846457e19966\") " Oct 09 15:38:29 crc kubenswrapper[4719]: I1009 15:38:29.360001 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/f7540714-0deb-4ba6-8709-846457e19966-config\") pod \"f7540714-0deb-4ba6-8709-846457e19966\" (UID: \"f7540714-0deb-4ba6-8709-846457e19966\") " Oct 09 15:38:29 crc kubenswrapper[4719]: I1009 15:38:29.367261 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7540714-0deb-4ba6-8709-846457e19966-kube-api-access-df5zw" (OuterVolumeSpecName: "kube-api-access-df5zw") pod "f7540714-0deb-4ba6-8709-846457e19966" (UID: "f7540714-0deb-4ba6-8709-846457e19966"). InnerVolumeSpecName "kube-api-access-df5zw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:38:29 crc kubenswrapper[4719]: I1009 15:38:29.432007 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7540714-0deb-4ba6-8709-846457e19966-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f7540714-0deb-4ba6-8709-846457e19966" (UID: "f7540714-0deb-4ba6-8709-846457e19966"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:38:29 crc kubenswrapper[4719]: I1009 15:38:29.470741 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-df5zw\" (UniqueName: \"kubernetes.io/projected/f7540714-0deb-4ba6-8709-846457e19966-kube-api-access-df5zw\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:29 crc kubenswrapper[4719]: I1009 15:38:29.470777 4719 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7540714-0deb-4ba6-8709-846457e19966-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:29 crc kubenswrapper[4719]: I1009 15:38:29.479992 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7540714-0deb-4ba6-8709-846457e19966-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f7540714-0deb-4ba6-8709-846457e19966" (UID: "f7540714-0deb-4ba6-8709-846457e19966"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:38:29 crc kubenswrapper[4719]: I1009 15:38:29.520928 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7540714-0deb-4ba6-8709-846457e19966-config" (OuterVolumeSpecName: "config") pod "f7540714-0deb-4ba6-8709-846457e19966" (UID: "f7540714-0deb-4ba6-8709-846457e19966"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:38:29 crc kubenswrapper[4719]: I1009 15:38:29.535169 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7540714-0deb-4ba6-8709-846457e19966-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f7540714-0deb-4ba6-8709-846457e19966" (UID: "f7540714-0deb-4ba6-8709-846457e19966"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:38:29 crc kubenswrapper[4719]: I1009 15:38:29.538296 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7540714-0deb-4ba6-8709-846457e19966-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f7540714-0deb-4ba6-8709-846457e19966" (UID: "f7540714-0deb-4ba6-8709-846457e19966"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:38:29 crc kubenswrapper[4719]: I1009 15:38:29.572642 4719 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7540714-0deb-4ba6-8709-846457e19966-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:29 crc kubenswrapper[4719]: I1009 15:38:29.572677 4719 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7540714-0deb-4ba6-8709-846457e19966-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:29 crc kubenswrapper[4719]: I1009 15:38:29.572688 4719 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f7540714-0deb-4ba6-8709-846457e19966-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:29 crc kubenswrapper[4719]: I1009 15:38:29.572697 4719 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7540714-0deb-4ba6-8709-846457e19966-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:30 crc kubenswrapper[4719]: I1009 15:38:30.070998 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77b756999f-5ptkd" Oct 09 15:38:30 crc kubenswrapper[4719]: I1009 15:38:30.071582 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77b756999f-5ptkd" event={"ID":"f7540714-0deb-4ba6-8709-846457e19966","Type":"ContainerDied","Data":"d8521268a7bed8a58a83e7f4b2e0bb8e6726efbeceff9348c0e2edea9734ea88"} Oct 09 15:38:30 crc kubenswrapper[4719]: I1009 15:38:30.071630 4719 scope.go:117] "RemoveContainer" containerID="583be44f9b958a7b642c9052de211535ca718085fd44ce13e329ad6434301c54" Oct 09 15:38:30 crc kubenswrapper[4719]: I1009 15:38:30.095988 4719 scope.go:117] "RemoveContainer" containerID="8acc01f937a88ec70793212478f7312dc7f746927a49e045013394d8fa0b2e6c" Oct 09 15:38:30 crc kubenswrapper[4719]: I1009 15:38:30.125415 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77b756999f-5ptkd"] Oct 09 15:38:30 crc kubenswrapper[4719]: I1009 15:38:30.130760 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77b756999f-5ptkd"] Oct 09 15:38:31 crc kubenswrapper[4719]: I1009 15:38:31.083022 4719 generic.go:334] "Generic (PLEG): container finished" podID="04e99f7a-bb5e-41c0-a55a-02b671a69ad8" containerID="5922ebc8350beb92166ff13c09d24c6d1827a56668004ecfe76dccc58e0473c4" exitCode=0 Oct 09 15:38:31 crc kubenswrapper[4719]: I1009 15:38:31.083117 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9pqdd" event={"ID":"04e99f7a-bb5e-41c0-a55a-02b671a69ad8","Type":"ContainerDied","Data":"5922ebc8350beb92166ff13c09d24c6d1827a56668004ecfe76dccc58e0473c4"} Oct 09 15:38:31 crc kubenswrapper[4719]: I1009 15:38:31.173879 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7540714-0deb-4ba6-8709-846457e19966" path="/var/lib/kubelet/pods/f7540714-0deb-4ba6-8709-846457e19966/volumes" Oct 09 15:38:31 crc kubenswrapper[4719]: I1009 15:38:31.416515 4719 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 09 15:38:31 crc kubenswrapper[4719]: I1009 15:38:31.416567 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 09 15:38:32 crc kubenswrapper[4719]: I1009 15:38:32.093854 4719 generic.go:334] "Generic (PLEG): container finished" podID="2e7ae899-fa79-4024-a512-6a7648d7fd6a" containerID="53b22611344ea6f695a15115e5a926986cee107c3833356c129aded91a400f7f" exitCode=0 Oct 09 15:38:32 crc kubenswrapper[4719]: I1009 15:38:32.094029 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bfqtw" event={"ID":"2e7ae899-fa79-4024-a512-6a7648d7fd6a","Type":"ContainerDied","Data":"53b22611344ea6f695a15115e5a926986cee107c3833356c129aded91a400f7f"} Oct 09 15:38:32 crc kubenswrapper[4719]: I1009 15:38:32.551845 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9pqdd" Oct 09 15:38:32 crc kubenswrapper[4719]: I1009 15:38:32.627407 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04e99f7a-bb5e-41c0-a55a-02b671a69ad8-config-data\") pod \"04e99f7a-bb5e-41c0-a55a-02b671a69ad8\" (UID: \"04e99f7a-bb5e-41c0-a55a-02b671a69ad8\") " Oct 09 15:38:32 crc kubenswrapper[4719]: I1009 15:38:32.627730 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smmbn\" (UniqueName: \"kubernetes.io/projected/04e99f7a-bb5e-41c0-a55a-02b671a69ad8-kube-api-access-smmbn\") pod \"04e99f7a-bb5e-41c0-a55a-02b671a69ad8\" (UID: \"04e99f7a-bb5e-41c0-a55a-02b671a69ad8\") " Oct 09 15:38:32 crc kubenswrapper[4719]: I1009 15:38:32.627804 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04e99f7a-bb5e-41c0-a55a-02b671a69ad8-combined-ca-bundle\") pod 
\"04e99f7a-bb5e-41c0-a55a-02b671a69ad8\" (UID: \"04e99f7a-bb5e-41c0-a55a-02b671a69ad8\") " Oct 09 15:38:32 crc kubenswrapper[4719]: I1009 15:38:32.627835 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04e99f7a-bb5e-41c0-a55a-02b671a69ad8-scripts\") pod \"04e99f7a-bb5e-41c0-a55a-02b671a69ad8\" (UID: \"04e99f7a-bb5e-41c0-a55a-02b671a69ad8\") " Oct 09 15:38:32 crc kubenswrapper[4719]: I1009 15:38:32.634378 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04e99f7a-bb5e-41c0-a55a-02b671a69ad8-scripts" (OuterVolumeSpecName: "scripts") pod "04e99f7a-bb5e-41c0-a55a-02b671a69ad8" (UID: "04e99f7a-bb5e-41c0-a55a-02b671a69ad8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:38:32 crc kubenswrapper[4719]: I1009 15:38:32.634528 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04e99f7a-bb5e-41c0-a55a-02b671a69ad8-kube-api-access-smmbn" (OuterVolumeSpecName: "kube-api-access-smmbn") pod "04e99f7a-bb5e-41c0-a55a-02b671a69ad8" (UID: "04e99f7a-bb5e-41c0-a55a-02b671a69ad8"). InnerVolumeSpecName "kube-api-access-smmbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:38:32 crc kubenswrapper[4719]: I1009 15:38:32.657207 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04e99f7a-bb5e-41c0-a55a-02b671a69ad8-config-data" (OuterVolumeSpecName: "config-data") pod "04e99f7a-bb5e-41c0-a55a-02b671a69ad8" (UID: "04e99f7a-bb5e-41c0-a55a-02b671a69ad8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:38:32 crc kubenswrapper[4719]: I1009 15:38:32.660028 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04e99f7a-bb5e-41c0-a55a-02b671a69ad8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04e99f7a-bb5e-41c0-a55a-02b671a69ad8" (UID: "04e99f7a-bb5e-41c0-a55a-02b671a69ad8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:38:32 crc kubenswrapper[4719]: I1009 15:38:32.729804 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04e99f7a-bb5e-41c0-a55a-02b671a69ad8-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:32 crc kubenswrapper[4719]: I1009 15:38:32.729845 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smmbn\" (UniqueName: \"kubernetes.io/projected/04e99f7a-bb5e-41c0-a55a-02b671a69ad8-kube-api-access-smmbn\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:32 crc kubenswrapper[4719]: I1009 15:38:32.729863 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04e99f7a-bb5e-41c0-a55a-02b671a69ad8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:32 crc kubenswrapper[4719]: I1009 15:38:32.729874 4719 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04e99f7a-bb5e-41c0-a55a-02b671a69ad8-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:33 crc kubenswrapper[4719]: I1009 15:38:33.103013 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9pqdd" Oct 09 15:38:33 crc kubenswrapper[4719]: I1009 15:38:33.103009 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9pqdd" event={"ID":"04e99f7a-bb5e-41c0-a55a-02b671a69ad8","Type":"ContainerDied","Data":"44bf0082f6ddd2fdbd9d9410befc95e8dad19acfcf8d0d9b49e4a0eef826f136"} Oct 09 15:38:33 crc kubenswrapper[4719]: I1009 15:38:33.103048 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44bf0082f6ddd2fdbd9d9410befc95e8dad19acfcf8d0d9b49e4a0eef826f136" Oct 09 15:38:33 crc kubenswrapper[4719]: I1009 15:38:33.316425 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 09 15:38:33 crc kubenswrapper[4719]: I1009 15:38:33.316983 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="45da18fe-c307-40ff-8a91-c86ba32c516d" containerName="nova-api-log" containerID="cri-o://53a0cd3d06859214603d7e8e6f0377e80471d431dc3e8a7ff2dd4ca28a46a880" gracePeriod=30 Oct 09 15:38:33 crc kubenswrapper[4719]: I1009 15:38:33.317514 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="45da18fe-c307-40ff-8a91-c86ba32c516d" containerName="nova-api-api" containerID="cri-o://1940216b7a6b1e443f7eddc56eca44a07a08dad184661ad4a7ebc644e5318ffd" gracePeriod=30 Oct 09 15:38:33 crc kubenswrapper[4719]: I1009 15:38:33.352460 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 15:38:33 crc kubenswrapper[4719]: I1009 15:38:33.352706 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="fee8c430-e8cc-47a8-8ec3-4594e0500a4f" containerName="nova-scheduler-scheduler" containerID="cri-o://1538201b0a110728260c6210d70d6a16c704d9a2341a2279330f4860bb0caabf" gracePeriod=30 Oct 09 15:38:33 crc kubenswrapper[4719]: I1009 
15:38:33.373893 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 15:38:33 crc kubenswrapper[4719]: I1009 15:38:33.374165 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0" containerName="nova-metadata-metadata" containerID="cri-o://98c0cb9ce5a63804e16eb0d5836f16910e27ea620086bbedaad8f96eec006f7b" gracePeriod=30 Oct 09 15:38:33 crc kubenswrapper[4719]: I1009 15:38:33.374320 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0" containerName="nova-metadata-log" containerID="cri-o://aa213c549cc003ac52e3c26bab0c3ba0f9d52c6af9cdbcae00bf6f9e6a11647c" gracePeriod=30 Oct 09 15:38:33 crc kubenswrapper[4719]: I1009 15:38:33.668041 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bfqtw" Oct 09 15:38:33 crc kubenswrapper[4719]: I1009 15:38:33.758043 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e7ae899-fa79-4024-a512-6a7648d7fd6a-config-data\") pod \"2e7ae899-fa79-4024-a512-6a7648d7fd6a\" (UID: \"2e7ae899-fa79-4024-a512-6a7648d7fd6a\") " Oct 09 15:38:33 crc kubenswrapper[4719]: I1009 15:38:33.758932 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e7ae899-fa79-4024-a512-6a7648d7fd6a-combined-ca-bundle\") pod \"2e7ae899-fa79-4024-a512-6a7648d7fd6a\" (UID: \"2e7ae899-fa79-4024-a512-6a7648d7fd6a\") " Oct 09 15:38:33 crc kubenswrapper[4719]: I1009 15:38:33.758968 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e7ae899-fa79-4024-a512-6a7648d7fd6a-scripts\") pod \"2e7ae899-fa79-4024-a512-6a7648d7fd6a\" 
(UID: \"2e7ae899-fa79-4024-a512-6a7648d7fd6a\") " Oct 09 15:38:33 crc kubenswrapper[4719]: I1009 15:38:33.759175 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lss7f\" (UniqueName: \"kubernetes.io/projected/2e7ae899-fa79-4024-a512-6a7648d7fd6a-kube-api-access-lss7f\") pod \"2e7ae899-fa79-4024-a512-6a7648d7fd6a\" (UID: \"2e7ae899-fa79-4024-a512-6a7648d7fd6a\") " Oct 09 15:38:33 crc kubenswrapper[4719]: I1009 15:38:33.766166 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e7ae899-fa79-4024-a512-6a7648d7fd6a-kube-api-access-lss7f" (OuterVolumeSpecName: "kube-api-access-lss7f") pod "2e7ae899-fa79-4024-a512-6a7648d7fd6a" (UID: "2e7ae899-fa79-4024-a512-6a7648d7fd6a"). InnerVolumeSpecName "kube-api-access-lss7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:38:33 crc kubenswrapper[4719]: I1009 15:38:33.768435 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e7ae899-fa79-4024-a512-6a7648d7fd6a-scripts" (OuterVolumeSpecName: "scripts") pod "2e7ae899-fa79-4024-a512-6a7648d7fd6a" (UID: "2e7ae899-fa79-4024-a512-6a7648d7fd6a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:38:33 crc kubenswrapper[4719]: I1009 15:38:33.802652 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e7ae899-fa79-4024-a512-6a7648d7fd6a-config-data" (OuterVolumeSpecName: "config-data") pod "2e7ae899-fa79-4024-a512-6a7648d7fd6a" (UID: "2e7ae899-fa79-4024-a512-6a7648d7fd6a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:38:33 crc kubenswrapper[4719]: I1009 15:38:33.828828 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e7ae899-fa79-4024-a512-6a7648d7fd6a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e7ae899-fa79-4024-a512-6a7648d7fd6a" (UID: "2e7ae899-fa79-4024-a512-6a7648d7fd6a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:38:33 crc kubenswrapper[4719]: I1009 15:38:33.862113 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e7ae899-fa79-4024-a512-6a7648d7fd6a-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:33 crc kubenswrapper[4719]: I1009 15:38:33.862317 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e7ae899-fa79-4024-a512-6a7648d7fd6a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:33 crc kubenswrapper[4719]: I1009 15:38:33.862333 4719 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e7ae899-fa79-4024-a512-6a7648d7fd6a-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:33 crc kubenswrapper[4719]: I1009 15:38:33.862361 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lss7f\" (UniqueName: \"kubernetes.io/projected/2e7ae899-fa79-4024-a512-6a7648d7fd6a-kube-api-access-lss7f\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:33 crc kubenswrapper[4719]: I1009 15:38:33.957075 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.064654 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qv75\" (UniqueName: \"kubernetes.io/projected/77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0-kube-api-access-8qv75\") pod \"77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0\" (UID: \"77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0\") " Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.064848 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0-nova-metadata-tls-certs\") pod \"77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0\" (UID: \"77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0\") " Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.065001 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0-combined-ca-bundle\") pod \"77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0\" (UID: \"77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0\") " Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.065094 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0-config-data\") pod \"77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0\" (UID: \"77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0\") " Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.065147 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0-logs\") pod \"77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0\" (UID: \"77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0\") " Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.065625 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0-logs" (OuterVolumeSpecName: "logs") pod "77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0" (UID: "77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.069822 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0-kube-api-access-8qv75" (OuterVolumeSpecName: "kube-api-access-8qv75") pod "77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0" (UID: "77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0"). InnerVolumeSpecName "kube-api-access-8qv75". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.095683 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0" (UID: "77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.100532 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0-config-data" (OuterVolumeSpecName: "config-data") pod "77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0" (UID: "77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.116582 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bfqtw" event={"ID":"2e7ae899-fa79-4024-a512-6a7648d7fd6a","Type":"ContainerDied","Data":"73936239a3e6201f63dfe553b4abab9e05a110219312981d2931645e7b66fc05"} Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.116631 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73936239a3e6201f63dfe553b4abab9e05a110219312981d2931645e7b66fc05" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.116690 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bfqtw" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.120929 4719 generic.go:334] "Generic (PLEG): container finished" podID="45da18fe-c307-40ff-8a91-c86ba32c516d" containerID="53a0cd3d06859214603d7e8e6f0377e80471d431dc3e8a7ff2dd4ca28a46a880" exitCode=143 Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.120988 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"45da18fe-c307-40ff-8a91-c86ba32c516d","Type":"ContainerDied","Data":"53a0cd3d06859214603d7e8e6f0377e80471d431dc3e8a7ff2dd4ca28a46a880"} Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.130896 4719 generic.go:334] "Generic (PLEG): container finished" podID="77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0" containerID="98c0cb9ce5a63804e16eb0d5836f16910e27ea620086bbedaad8f96eec006f7b" exitCode=0 Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.130939 4719 generic.go:334] "Generic (PLEG): container finished" podID="77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0" containerID="aa213c549cc003ac52e3c26bab0c3ba0f9d52c6af9cdbcae00bf6f9e6a11647c" exitCode=143 Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.130963 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0","Type":"ContainerDied","Data":"98c0cb9ce5a63804e16eb0d5836f16910e27ea620086bbedaad8f96eec006f7b"} Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.130997 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0","Type":"ContainerDied","Data":"aa213c549cc003ac52e3c26bab0c3ba0f9d52c6af9cdbcae00bf6f9e6a11647c"} Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.131013 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0","Type":"ContainerDied","Data":"3050d018a0cfb0fe999eddba0f3210a8d5ac9008e2acdd89e59270f9bf206019"} Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.131031 4719 scope.go:117] "RemoveContainer" containerID="98c0cb9ce5a63804e16eb0d5836f16910e27ea620086bbedaad8f96eec006f7b" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.131181 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.143570 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0" (UID: "77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.167849 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.167885 4719 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0-logs\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.167899 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qv75\" (UniqueName: \"kubernetes.io/projected/77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0-kube-api-access-8qv75\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.167911 4719 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.167924 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.185221 4719 scope.go:117] "RemoveContainer" containerID="aa213c549cc003ac52e3c26bab0c3ba0f9d52c6af9cdbcae00bf6f9e6a11647c" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.232263 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 09 15:38:34 crc kubenswrapper[4719]: E1009 15:38:34.232783 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04e99f7a-bb5e-41c0-a55a-02b671a69ad8" containerName="nova-manage" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.232799 
4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="04e99f7a-bb5e-41c0-a55a-02b671a69ad8" containerName="nova-manage" Oct 09 15:38:34 crc kubenswrapper[4719]: E1009 15:38:34.232811 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0" containerName="nova-metadata-log" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.232821 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0" containerName="nova-metadata-log" Oct 09 15:38:34 crc kubenswrapper[4719]: E1009 15:38:34.232845 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7540714-0deb-4ba6-8709-846457e19966" containerName="init" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.232853 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7540714-0deb-4ba6-8709-846457e19966" containerName="init" Oct 09 15:38:34 crc kubenswrapper[4719]: E1009 15:38:34.232873 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7540714-0deb-4ba6-8709-846457e19966" containerName="dnsmasq-dns" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.232881 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7540714-0deb-4ba6-8709-846457e19966" containerName="dnsmasq-dns" Oct 09 15:38:34 crc kubenswrapper[4719]: E1009 15:38:34.232892 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e7ae899-fa79-4024-a512-6a7648d7fd6a" containerName="nova-cell1-conductor-db-sync" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.232899 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e7ae899-fa79-4024-a512-6a7648d7fd6a" containerName="nova-cell1-conductor-db-sync" Oct 09 15:38:34 crc kubenswrapper[4719]: E1009 15:38:34.232917 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0" containerName="nova-metadata-metadata" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.232923 4719 
state_mem.go:107] "Deleted CPUSet assignment" podUID="77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0" containerName="nova-metadata-metadata" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.233157 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0" containerName="nova-metadata-metadata" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.233170 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0" containerName="nova-metadata-log" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.233183 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="04e99f7a-bb5e-41c0-a55a-02b671a69ad8" containerName="nova-manage" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.233194 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7540714-0deb-4ba6-8709-846457e19966" containerName="dnsmasq-dns" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.233207 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e7ae899-fa79-4024-a512-6a7648d7fd6a" containerName="nova-cell1-conductor-db-sync" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.233912 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.242017 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.249626 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.270123 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcfrp\" (UniqueName: \"kubernetes.io/projected/d0cbbab9-8de8-43f9-bf34-b235d2fb4400-kube-api-access-hcfrp\") pod \"nova-cell1-conductor-0\" (UID: \"d0cbbab9-8de8-43f9-bf34-b235d2fb4400\") " pod="openstack/nova-cell1-conductor-0" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.270191 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0cbbab9-8de8-43f9-bf34-b235d2fb4400-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"d0cbbab9-8de8-43f9-bf34-b235d2fb4400\") " pod="openstack/nova-cell1-conductor-0" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.270319 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0cbbab9-8de8-43f9-bf34-b235d2fb4400-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"d0cbbab9-8de8-43f9-bf34-b235d2fb4400\") " pod="openstack/nova-cell1-conductor-0" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.298792 4719 scope.go:117] "RemoveContainer" containerID="98c0cb9ce5a63804e16eb0d5836f16910e27ea620086bbedaad8f96eec006f7b" Oct 09 15:38:34 crc kubenswrapper[4719]: E1009 15:38:34.304016 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"98c0cb9ce5a63804e16eb0d5836f16910e27ea620086bbedaad8f96eec006f7b\": container with ID starting with 98c0cb9ce5a63804e16eb0d5836f16910e27ea620086bbedaad8f96eec006f7b not found: ID does not exist" containerID="98c0cb9ce5a63804e16eb0d5836f16910e27ea620086bbedaad8f96eec006f7b" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.304065 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98c0cb9ce5a63804e16eb0d5836f16910e27ea620086bbedaad8f96eec006f7b"} err="failed to get container status \"98c0cb9ce5a63804e16eb0d5836f16910e27ea620086bbedaad8f96eec006f7b\": rpc error: code = NotFound desc = could not find container \"98c0cb9ce5a63804e16eb0d5836f16910e27ea620086bbedaad8f96eec006f7b\": container with ID starting with 98c0cb9ce5a63804e16eb0d5836f16910e27ea620086bbedaad8f96eec006f7b not found: ID does not exist" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.304093 4719 scope.go:117] "RemoveContainer" containerID="aa213c549cc003ac52e3c26bab0c3ba0f9d52c6af9cdbcae00bf6f9e6a11647c" Oct 09 15:38:34 crc kubenswrapper[4719]: E1009 15:38:34.304793 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa213c549cc003ac52e3c26bab0c3ba0f9d52c6af9cdbcae00bf6f9e6a11647c\": container with ID starting with aa213c549cc003ac52e3c26bab0c3ba0f9d52c6af9cdbcae00bf6f9e6a11647c not found: ID does not exist" containerID="aa213c549cc003ac52e3c26bab0c3ba0f9d52c6af9cdbcae00bf6f9e6a11647c" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.305397 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa213c549cc003ac52e3c26bab0c3ba0f9d52c6af9cdbcae00bf6f9e6a11647c"} err="failed to get container status \"aa213c549cc003ac52e3c26bab0c3ba0f9d52c6af9cdbcae00bf6f9e6a11647c\": rpc error: code = NotFound desc = could not find container \"aa213c549cc003ac52e3c26bab0c3ba0f9d52c6af9cdbcae00bf6f9e6a11647c\": container with ID 
starting with aa213c549cc003ac52e3c26bab0c3ba0f9d52c6af9cdbcae00bf6f9e6a11647c not found: ID does not exist" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.305418 4719 scope.go:117] "RemoveContainer" containerID="98c0cb9ce5a63804e16eb0d5836f16910e27ea620086bbedaad8f96eec006f7b" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.305893 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98c0cb9ce5a63804e16eb0d5836f16910e27ea620086bbedaad8f96eec006f7b"} err="failed to get container status \"98c0cb9ce5a63804e16eb0d5836f16910e27ea620086bbedaad8f96eec006f7b\": rpc error: code = NotFound desc = could not find container \"98c0cb9ce5a63804e16eb0d5836f16910e27ea620086bbedaad8f96eec006f7b\": container with ID starting with 98c0cb9ce5a63804e16eb0d5836f16910e27ea620086bbedaad8f96eec006f7b not found: ID does not exist" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.305980 4719 scope.go:117] "RemoveContainer" containerID="aa213c549cc003ac52e3c26bab0c3ba0f9d52c6af9cdbcae00bf6f9e6a11647c" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.306338 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa213c549cc003ac52e3c26bab0c3ba0f9d52c6af9cdbcae00bf6f9e6a11647c"} err="failed to get container status \"aa213c549cc003ac52e3c26bab0c3ba0f9d52c6af9cdbcae00bf6f9e6a11647c\": rpc error: code = NotFound desc = could not find container \"aa213c549cc003ac52e3c26bab0c3ba0f9d52c6af9cdbcae00bf6f9e6a11647c\": container with ID starting with aa213c549cc003ac52e3c26bab0c3ba0f9d52c6af9cdbcae00bf6f9e6a11647c not found: ID does not exist" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.374681 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcfrp\" (UniqueName: \"kubernetes.io/projected/d0cbbab9-8de8-43f9-bf34-b235d2fb4400-kube-api-access-hcfrp\") pod \"nova-cell1-conductor-0\" (UID: 
\"d0cbbab9-8de8-43f9-bf34-b235d2fb4400\") " pod="openstack/nova-cell1-conductor-0" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.374745 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0cbbab9-8de8-43f9-bf34-b235d2fb4400-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"d0cbbab9-8de8-43f9-bf34-b235d2fb4400\") " pod="openstack/nova-cell1-conductor-0" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.374855 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0cbbab9-8de8-43f9-bf34-b235d2fb4400-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"d0cbbab9-8de8-43f9-bf34-b235d2fb4400\") " pod="openstack/nova-cell1-conductor-0" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.381227 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0cbbab9-8de8-43f9-bf34-b235d2fb4400-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"d0cbbab9-8de8-43f9-bf34-b235d2fb4400\") " pod="openstack/nova-cell1-conductor-0" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.382032 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0cbbab9-8de8-43f9-bf34-b235d2fb4400-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"d0cbbab9-8de8-43f9-bf34-b235d2fb4400\") " pod="openstack/nova-cell1-conductor-0" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.398403 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcfrp\" (UniqueName: \"kubernetes.io/projected/d0cbbab9-8de8-43f9-bf34-b235d2fb4400-kube-api-access-hcfrp\") pod \"nova-cell1-conductor-0\" (UID: \"d0cbbab9-8de8-43f9-bf34-b235d2fb4400\") " pod="openstack/nova-cell1-conductor-0" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 
15:38:34.480401 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.491992 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.504378 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.507276 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.509316 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.510087 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.553398 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.586569 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.680594 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/659a2cfd-faec-4ff3-9a23-7733a138493e-config-data\") pod \"nova-metadata-0\" (UID: \"659a2cfd-faec-4ff3-9a23-7733a138493e\") " pod="openstack/nova-metadata-0" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.681037 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/659a2cfd-faec-4ff3-9a23-7733a138493e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"659a2cfd-faec-4ff3-9a23-7733a138493e\") " pod="openstack/nova-metadata-0" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.681069 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/659a2cfd-faec-4ff3-9a23-7733a138493e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"659a2cfd-faec-4ff3-9a23-7733a138493e\") " pod="openstack/nova-metadata-0" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.681129 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/659a2cfd-faec-4ff3-9a23-7733a138493e-logs\") pod \"nova-metadata-0\" (UID: \"659a2cfd-faec-4ff3-9a23-7733a138493e\") " pod="openstack/nova-metadata-0" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.681247 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm4lx\" (UniqueName: \"kubernetes.io/projected/659a2cfd-faec-4ff3-9a23-7733a138493e-kube-api-access-cm4lx\") pod \"nova-metadata-0\" (UID: \"659a2cfd-faec-4ff3-9a23-7733a138493e\") " pod="openstack/nova-metadata-0" Oct 09 15:38:34 crc 
kubenswrapper[4719]: I1009 15:38:34.782494 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/659a2cfd-faec-4ff3-9a23-7733a138493e-config-data\") pod \"nova-metadata-0\" (UID: \"659a2cfd-faec-4ff3-9a23-7733a138493e\") " pod="openstack/nova-metadata-0" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.782569 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/659a2cfd-faec-4ff3-9a23-7733a138493e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"659a2cfd-faec-4ff3-9a23-7733a138493e\") " pod="openstack/nova-metadata-0" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.782593 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/659a2cfd-faec-4ff3-9a23-7733a138493e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"659a2cfd-faec-4ff3-9a23-7733a138493e\") " pod="openstack/nova-metadata-0" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.782653 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/659a2cfd-faec-4ff3-9a23-7733a138493e-logs\") pod \"nova-metadata-0\" (UID: \"659a2cfd-faec-4ff3-9a23-7733a138493e\") " pod="openstack/nova-metadata-0" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.782767 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm4lx\" (UniqueName: \"kubernetes.io/projected/659a2cfd-faec-4ff3-9a23-7733a138493e-kube-api-access-cm4lx\") pod \"nova-metadata-0\" (UID: \"659a2cfd-faec-4ff3-9a23-7733a138493e\") " pod="openstack/nova-metadata-0" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.791041 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/659a2cfd-faec-4ff3-9a23-7733a138493e-logs\") pod \"nova-metadata-0\" (UID: \"659a2cfd-faec-4ff3-9a23-7733a138493e\") " pod="openstack/nova-metadata-0" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.798118 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/659a2cfd-faec-4ff3-9a23-7733a138493e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"659a2cfd-faec-4ff3-9a23-7733a138493e\") " pod="openstack/nova-metadata-0" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.798181 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/659a2cfd-faec-4ff3-9a23-7733a138493e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"659a2cfd-faec-4ff3-9a23-7733a138493e\") " pod="openstack/nova-metadata-0" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.798257 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/659a2cfd-faec-4ff3-9a23-7733a138493e-config-data\") pod \"nova-metadata-0\" (UID: \"659a2cfd-faec-4ff3-9a23-7733a138493e\") " pod="openstack/nova-metadata-0" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.807400 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm4lx\" (UniqueName: \"kubernetes.io/projected/659a2cfd-faec-4ff3-9a23-7733a138493e-kube-api-access-cm4lx\") pod \"nova-metadata-0\" (UID: \"659a2cfd-faec-4ff3-9a23-7733a138493e\") " pod="openstack/nova-metadata-0" Oct 09 15:38:34 crc kubenswrapper[4719]: I1009 15:38:34.875191 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 15:38:35 crc kubenswrapper[4719]: I1009 15:38:35.095570 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 09 15:38:35 crc kubenswrapper[4719]: I1009 15:38:35.143016 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"d0cbbab9-8de8-43f9-bf34-b235d2fb4400","Type":"ContainerStarted","Data":"b153b25b1a803c1c3167602f16a16f2a009b8fd0d65b6783d80b2a5ced3be15f"} Oct 09 15:38:35 crc kubenswrapper[4719]: I1009 15:38:35.173237 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0" path="/var/lib/kubelet/pods/77ca4c71-59a6-49b9-b2eb-bbe3531dd5d0/volumes" Oct 09 15:38:35 crc kubenswrapper[4719]: I1009 15:38:35.294787 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 15:38:35 crc kubenswrapper[4719]: W1009 15:38:35.300750 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod659a2cfd_faec_4ff3_9a23_7733a138493e.slice/crio-63edd1bec258e02f9625c5172cda1c3ff57b09a0f893da809d86c4ee51e27f5f WatchSource:0}: Error finding container 63edd1bec258e02f9625c5172cda1c3ff57b09a0f893da809d86c4ee51e27f5f: Status 404 returned error can't find the container with id 63edd1bec258e02f9625c5172cda1c3ff57b09a0f893da809d86c4ee51e27f5f Oct 09 15:38:35 crc kubenswrapper[4719]: I1009 15:38:35.668116 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 09 15:38:35 crc kubenswrapper[4719]: I1009 15:38:35.806771 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45da18fe-c307-40ff-8a91-c86ba32c516d-combined-ca-bundle\") pod \"45da18fe-c307-40ff-8a91-c86ba32c516d\" (UID: \"45da18fe-c307-40ff-8a91-c86ba32c516d\") " Oct 09 15:38:35 crc kubenswrapper[4719]: I1009 15:38:35.807228 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45da18fe-c307-40ff-8a91-c86ba32c516d-logs\") pod \"45da18fe-c307-40ff-8a91-c86ba32c516d\" (UID: \"45da18fe-c307-40ff-8a91-c86ba32c516d\") " Oct 09 15:38:35 crc kubenswrapper[4719]: I1009 15:38:35.807688 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45da18fe-c307-40ff-8a91-c86ba32c516d-logs" (OuterVolumeSpecName: "logs") pod "45da18fe-c307-40ff-8a91-c86ba32c516d" (UID: "45da18fe-c307-40ff-8a91-c86ba32c516d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:38:35 crc kubenswrapper[4719]: I1009 15:38:35.807720 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwrsr\" (UniqueName: \"kubernetes.io/projected/45da18fe-c307-40ff-8a91-c86ba32c516d-kube-api-access-kwrsr\") pod \"45da18fe-c307-40ff-8a91-c86ba32c516d\" (UID: \"45da18fe-c307-40ff-8a91-c86ba32c516d\") " Oct 09 15:38:35 crc kubenswrapper[4719]: I1009 15:38:35.807865 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45da18fe-c307-40ff-8a91-c86ba32c516d-config-data\") pod \"45da18fe-c307-40ff-8a91-c86ba32c516d\" (UID: \"45da18fe-c307-40ff-8a91-c86ba32c516d\") " Oct 09 15:38:35 crc kubenswrapper[4719]: I1009 15:38:35.808695 4719 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45da18fe-c307-40ff-8a91-c86ba32c516d-logs\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:35 crc kubenswrapper[4719]: I1009 15:38:35.810828 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45da18fe-c307-40ff-8a91-c86ba32c516d-kube-api-access-kwrsr" (OuterVolumeSpecName: "kube-api-access-kwrsr") pod "45da18fe-c307-40ff-8a91-c86ba32c516d" (UID: "45da18fe-c307-40ff-8a91-c86ba32c516d"). InnerVolumeSpecName "kube-api-access-kwrsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:38:35 crc kubenswrapper[4719]: I1009 15:38:35.838689 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45da18fe-c307-40ff-8a91-c86ba32c516d-config-data" (OuterVolumeSpecName: "config-data") pod "45da18fe-c307-40ff-8a91-c86ba32c516d" (UID: "45da18fe-c307-40ff-8a91-c86ba32c516d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:38:35 crc kubenswrapper[4719]: I1009 15:38:35.838820 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45da18fe-c307-40ff-8a91-c86ba32c516d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45da18fe-c307-40ff-8a91-c86ba32c516d" (UID: "45da18fe-c307-40ff-8a91-c86ba32c516d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:38:35 crc kubenswrapper[4719]: I1009 15:38:35.910411 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45da18fe-c307-40ff-8a91-c86ba32c516d-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:35 crc kubenswrapper[4719]: I1009 15:38:35.910439 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45da18fe-c307-40ff-8a91-c86ba32c516d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:35 crc kubenswrapper[4719]: I1009 15:38:35.910449 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwrsr\" (UniqueName: \"kubernetes.io/projected/45da18fe-c307-40ff-8a91-c86ba32c516d-kube-api-access-kwrsr\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.152494 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"d0cbbab9-8de8-43f9-bf34-b235d2fb4400","Type":"ContainerStarted","Data":"21e3b4a8dbb98ea3b28ffa11cc3b2bea010d1d30de4be4a1ea5ae90a74db3713"} Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.152562 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.158673 4719 generic.go:334] "Generic (PLEG): container finished" podID="45da18fe-c307-40ff-8a91-c86ba32c516d" 
containerID="1940216b7a6b1e443f7eddc56eca44a07a08dad184661ad4a7ebc644e5318ffd" exitCode=0 Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.158799 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.158823 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"45da18fe-c307-40ff-8a91-c86ba32c516d","Type":"ContainerDied","Data":"1940216b7a6b1e443f7eddc56eca44a07a08dad184661ad4a7ebc644e5318ffd"} Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.158877 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"45da18fe-c307-40ff-8a91-c86ba32c516d","Type":"ContainerDied","Data":"13c844d2ad4e758817c0fe19ff41898a4586b9b2a8e12c8ebca6a4629ad693aa"} Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.158923 4719 scope.go:117] "RemoveContainer" containerID="1940216b7a6b1e443f7eddc56eca44a07a08dad184661ad4a7ebc644e5318ffd" Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.164082 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"659a2cfd-faec-4ff3-9a23-7733a138493e","Type":"ContainerStarted","Data":"9ca0d5390efd60e078c476f987e29cd72f157ed67e6f8a5cc2f5aaa89d9cb60b"} Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.164163 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"659a2cfd-faec-4ff3-9a23-7733a138493e","Type":"ContainerStarted","Data":"966f250211675f8690fa96c8f29689409f69e49fceaf7fabbbb79a060e17a791"} Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.164179 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"659a2cfd-faec-4ff3-9a23-7733a138493e","Type":"ContainerStarted","Data":"63edd1bec258e02f9625c5172cda1c3ff57b09a0f893da809d86c4ee51e27f5f"} Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.197419 
4719 scope.go:117] "RemoveContainer" containerID="53a0cd3d06859214603d7e8e6f0377e80471d431dc3e8a7ff2dd4ca28a46a880" Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.206226 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.206204188 podStartE2EDuration="2.206204188s" podCreationTimestamp="2025-10-09 15:38:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:38:36.176571331 +0000 UTC m=+1221.686282626" watchObservedRunningTime="2025-10-09 15:38:36.206204188 +0000 UTC m=+1221.715915463" Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.208099 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.208074837 podStartE2EDuration="2.208074837s" podCreationTimestamp="2025-10-09 15:38:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:38:36.197999665 +0000 UTC m=+1221.707710960" watchObservedRunningTime="2025-10-09 15:38:36.208074837 +0000 UTC m=+1221.717786132" Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.229096 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.235254 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.242757 4719 scope.go:117] "RemoveContainer" containerID="1940216b7a6b1e443f7eddc56eca44a07a08dad184661ad4a7ebc644e5318ffd" Oct 09 15:38:36 crc kubenswrapper[4719]: E1009 15:38:36.243445 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1940216b7a6b1e443f7eddc56eca44a07a08dad184661ad4a7ebc644e5318ffd\": container with ID starting with 
1940216b7a6b1e443f7eddc56eca44a07a08dad184661ad4a7ebc644e5318ffd not found: ID does not exist" containerID="1940216b7a6b1e443f7eddc56eca44a07a08dad184661ad4a7ebc644e5318ffd" Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.243484 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1940216b7a6b1e443f7eddc56eca44a07a08dad184661ad4a7ebc644e5318ffd"} err="failed to get container status \"1940216b7a6b1e443f7eddc56eca44a07a08dad184661ad4a7ebc644e5318ffd\": rpc error: code = NotFound desc = could not find container \"1940216b7a6b1e443f7eddc56eca44a07a08dad184661ad4a7ebc644e5318ffd\": container with ID starting with 1940216b7a6b1e443f7eddc56eca44a07a08dad184661ad4a7ebc644e5318ffd not found: ID does not exist" Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.243507 4719 scope.go:117] "RemoveContainer" containerID="53a0cd3d06859214603d7e8e6f0377e80471d431dc3e8a7ff2dd4ca28a46a880" Oct 09 15:38:36 crc kubenswrapper[4719]: E1009 15:38:36.244017 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53a0cd3d06859214603d7e8e6f0377e80471d431dc3e8a7ff2dd4ca28a46a880\": container with ID starting with 53a0cd3d06859214603d7e8e6f0377e80471d431dc3e8a7ff2dd4ca28a46a880 not found: ID does not exist" containerID="53a0cd3d06859214603d7e8e6f0377e80471d431dc3e8a7ff2dd4ca28a46a880" Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.244168 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53a0cd3d06859214603d7e8e6f0377e80471d431dc3e8a7ff2dd4ca28a46a880"} err="failed to get container status \"53a0cd3d06859214603d7e8e6f0377e80471d431dc3e8a7ff2dd4ca28a46a880\": rpc error: code = NotFound desc = could not find container \"53a0cd3d06859214603d7e8e6f0377e80471d431dc3e8a7ff2dd4ca28a46a880\": container with ID starting with 53a0cd3d06859214603d7e8e6f0377e80471d431dc3e8a7ff2dd4ca28a46a880 not found: ID does not 
exist" Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.250420 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 09 15:38:36 crc kubenswrapper[4719]: E1009 15:38:36.250835 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45da18fe-c307-40ff-8a91-c86ba32c516d" containerName="nova-api-log" Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.250851 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="45da18fe-c307-40ff-8a91-c86ba32c516d" containerName="nova-api-log" Oct 09 15:38:36 crc kubenswrapper[4719]: E1009 15:38:36.250861 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45da18fe-c307-40ff-8a91-c86ba32c516d" containerName="nova-api-api" Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.250867 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="45da18fe-c307-40ff-8a91-c86ba32c516d" containerName="nova-api-api" Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.251079 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="45da18fe-c307-40ff-8a91-c86ba32c516d" containerName="nova-api-api" Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.251097 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="45da18fe-c307-40ff-8a91-c86ba32c516d" containerName="nova-api-log" Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.252454 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.258728 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.280023 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.319768 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w66cf\" (UniqueName: \"kubernetes.io/projected/1adfbd0c-70c5-41c6-b400-7ef501bcd042-kube-api-access-w66cf\") pod \"nova-api-0\" (UID: \"1adfbd0c-70c5-41c6-b400-7ef501bcd042\") " pod="openstack/nova-api-0" Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.319825 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1adfbd0c-70c5-41c6-b400-7ef501bcd042-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1adfbd0c-70c5-41c6-b400-7ef501bcd042\") " pod="openstack/nova-api-0" Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.319993 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1adfbd0c-70c5-41c6-b400-7ef501bcd042-logs\") pod \"nova-api-0\" (UID: \"1adfbd0c-70c5-41c6-b400-7ef501bcd042\") " pod="openstack/nova-api-0" Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.320193 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1adfbd0c-70c5-41c6-b400-7ef501bcd042-config-data\") pod \"nova-api-0\" (UID: \"1adfbd0c-70c5-41c6-b400-7ef501bcd042\") " pod="openstack/nova-api-0" Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.421952 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-w66cf\" (UniqueName: \"kubernetes.io/projected/1adfbd0c-70c5-41c6-b400-7ef501bcd042-kube-api-access-w66cf\") pod \"nova-api-0\" (UID: \"1adfbd0c-70c5-41c6-b400-7ef501bcd042\") " pod="openstack/nova-api-0" Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.422305 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1adfbd0c-70c5-41c6-b400-7ef501bcd042-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1adfbd0c-70c5-41c6-b400-7ef501bcd042\") " pod="openstack/nova-api-0" Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.423154 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1adfbd0c-70c5-41c6-b400-7ef501bcd042-logs\") pod \"nova-api-0\" (UID: \"1adfbd0c-70c5-41c6-b400-7ef501bcd042\") " pod="openstack/nova-api-0" Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.423214 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1adfbd0c-70c5-41c6-b400-7ef501bcd042-logs\") pod \"nova-api-0\" (UID: \"1adfbd0c-70c5-41c6-b400-7ef501bcd042\") " pod="openstack/nova-api-0" Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.423297 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1adfbd0c-70c5-41c6-b400-7ef501bcd042-config-data\") pod \"nova-api-0\" (UID: \"1adfbd0c-70c5-41c6-b400-7ef501bcd042\") " pod="openstack/nova-api-0" Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.427596 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1adfbd0c-70c5-41c6-b400-7ef501bcd042-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1adfbd0c-70c5-41c6-b400-7ef501bcd042\") " pod="openstack/nova-api-0" Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.428067 
4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1adfbd0c-70c5-41c6-b400-7ef501bcd042-config-data\") pod \"nova-api-0\" (UID: \"1adfbd0c-70c5-41c6-b400-7ef501bcd042\") " pod="openstack/nova-api-0" Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.442144 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w66cf\" (UniqueName: \"kubernetes.io/projected/1adfbd0c-70c5-41c6-b400-7ef501bcd042-kube-api-access-w66cf\") pod \"nova-api-0\" (UID: \"1adfbd0c-70c5-41c6-b400-7ef501bcd042\") " pod="openstack/nova-api-0" Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.592502 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.976983 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 15:38:36 crc kubenswrapper[4719]: I1009 15:38:36.977304 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 15:38:37 crc kubenswrapper[4719]: I1009 15:38:37.073409 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 09 15:38:37 crc kubenswrapper[4719]: I1009 15:38:37.098912 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 15:38:37 crc kubenswrapper[4719]: I1009 15:38:37.189253 4719 generic.go:334] "Generic (PLEG): container finished" podID="fee8c430-e8cc-47a8-8ec3-4594e0500a4f" containerID="1538201b0a110728260c6210d70d6a16c704d9a2341a2279330f4860bb0caabf" exitCode=0 Oct 09 15:38:37 crc kubenswrapper[4719]: I1009 15:38:37.189303 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45da18fe-c307-40ff-8a91-c86ba32c516d" path="/var/lib/kubelet/pods/45da18fe-c307-40ff-8a91-c86ba32c516d/volumes" Oct 09 15:38:37 crc kubenswrapper[4719]: I1009 15:38:37.193258 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fee8c430-e8cc-47a8-8ec3-4594e0500a4f","Type":"ContainerDied","Data":"1538201b0a110728260c6210d70d6a16c704d9a2341a2279330f4860bb0caabf"} Oct 09 15:38:37 crc kubenswrapper[4719]: I1009 15:38:37.193296 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fee8c430-e8cc-47a8-8ec3-4594e0500a4f","Type":"ContainerDied","Data":"0ca752f031ea6c3a75bc90752442b9d61eff4aa50fa837891ab74762dedb8395"} Oct 09 15:38:37 crc kubenswrapper[4719]: I1009 15:38:37.193310 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1adfbd0c-70c5-41c6-b400-7ef501bcd042","Type":"ContainerStarted","Data":"36716262a04acac1623801da02f3cc6354851bc1916bee3452c39c51a5ef8c60"} Oct 09 15:38:37 crc kubenswrapper[4719]: I1009 15:38:37.193328 4719 scope.go:117] "RemoveContainer" containerID="1538201b0a110728260c6210d70d6a16c704d9a2341a2279330f4860bb0caabf" Oct 09 15:38:37 crc kubenswrapper[4719]: I1009 15:38:37.196557 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 15:38:37 crc kubenswrapper[4719]: I1009 15:38:37.213133 4719 scope.go:117] "RemoveContainer" containerID="1538201b0a110728260c6210d70d6a16c704d9a2341a2279330f4860bb0caabf" Oct 09 15:38:37 crc kubenswrapper[4719]: E1009 15:38:37.215743 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1538201b0a110728260c6210d70d6a16c704d9a2341a2279330f4860bb0caabf\": container with ID starting with 1538201b0a110728260c6210d70d6a16c704d9a2341a2279330f4860bb0caabf not found: ID does not exist" containerID="1538201b0a110728260c6210d70d6a16c704d9a2341a2279330f4860bb0caabf" Oct 09 15:38:37 crc kubenswrapper[4719]: I1009 15:38:37.215796 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1538201b0a110728260c6210d70d6a16c704d9a2341a2279330f4860bb0caabf"} err="failed to get container status \"1538201b0a110728260c6210d70d6a16c704d9a2341a2279330f4860bb0caabf\": rpc error: code = NotFound desc = could not find container \"1538201b0a110728260c6210d70d6a16c704d9a2341a2279330f4860bb0caabf\": container with ID starting with 1538201b0a110728260c6210d70d6a16c704d9a2341a2279330f4860bb0caabf not found: ID does not exist" Oct 09 15:38:37 crc kubenswrapper[4719]: I1009 15:38:37.240202 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fee8c430-e8cc-47a8-8ec3-4594e0500a4f-config-data\") pod \"fee8c430-e8cc-47a8-8ec3-4594e0500a4f\" (UID: \"fee8c430-e8cc-47a8-8ec3-4594e0500a4f\") " Oct 09 15:38:37 crc kubenswrapper[4719]: I1009 15:38:37.240269 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee8c430-e8cc-47a8-8ec3-4594e0500a4f-combined-ca-bundle\") pod \"fee8c430-e8cc-47a8-8ec3-4594e0500a4f\" (UID: \"fee8c430-e8cc-47a8-8ec3-4594e0500a4f\") " 
Oct 09 15:38:37 crc kubenswrapper[4719]: I1009 15:38:37.240302 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hc447\" (UniqueName: \"kubernetes.io/projected/fee8c430-e8cc-47a8-8ec3-4594e0500a4f-kube-api-access-hc447\") pod \"fee8c430-e8cc-47a8-8ec3-4594e0500a4f\" (UID: \"fee8c430-e8cc-47a8-8ec3-4594e0500a4f\") " Oct 09 15:38:37 crc kubenswrapper[4719]: I1009 15:38:37.247577 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fee8c430-e8cc-47a8-8ec3-4594e0500a4f-kube-api-access-hc447" (OuterVolumeSpecName: "kube-api-access-hc447") pod "fee8c430-e8cc-47a8-8ec3-4594e0500a4f" (UID: "fee8c430-e8cc-47a8-8ec3-4594e0500a4f"). InnerVolumeSpecName "kube-api-access-hc447". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:38:37 crc kubenswrapper[4719]: I1009 15:38:37.277630 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fee8c430-e8cc-47a8-8ec3-4594e0500a4f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fee8c430-e8cc-47a8-8ec3-4594e0500a4f" (UID: "fee8c430-e8cc-47a8-8ec3-4594e0500a4f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:38:37 crc kubenswrapper[4719]: I1009 15:38:37.280893 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fee8c430-e8cc-47a8-8ec3-4594e0500a4f-config-data" (OuterVolumeSpecName: "config-data") pod "fee8c430-e8cc-47a8-8ec3-4594e0500a4f" (UID: "fee8c430-e8cc-47a8-8ec3-4594e0500a4f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:38:37 crc kubenswrapper[4719]: I1009 15:38:37.342746 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fee8c430-e8cc-47a8-8ec3-4594e0500a4f-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:37 crc kubenswrapper[4719]: I1009 15:38:37.342778 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee8c430-e8cc-47a8-8ec3-4594e0500a4f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:37 crc kubenswrapper[4719]: I1009 15:38:37.342788 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hc447\" (UniqueName: \"kubernetes.io/projected/fee8c430-e8cc-47a8-8ec3-4594e0500a4f-kube-api-access-hc447\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:37 crc kubenswrapper[4719]: I1009 15:38:37.551460 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 15:38:37 crc kubenswrapper[4719]: I1009 15:38:37.560936 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 15:38:37 crc kubenswrapper[4719]: I1009 15:38:37.585684 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 15:38:37 crc kubenswrapper[4719]: E1009 15:38:37.586227 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fee8c430-e8cc-47a8-8ec3-4594e0500a4f" containerName="nova-scheduler-scheduler" Oct 09 15:38:37 crc kubenswrapper[4719]: I1009 15:38:37.586249 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="fee8c430-e8cc-47a8-8ec3-4594e0500a4f" containerName="nova-scheduler-scheduler" Oct 09 15:38:37 crc kubenswrapper[4719]: I1009 15:38:37.586493 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="fee8c430-e8cc-47a8-8ec3-4594e0500a4f" containerName="nova-scheduler-scheduler" Oct 09 15:38:37 crc kubenswrapper[4719]: I1009 
15:38:37.587383 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 15:38:37 crc kubenswrapper[4719]: I1009 15:38:37.589082 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 09 15:38:37 crc kubenswrapper[4719]: I1009 15:38:37.595937 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 15:38:37 crc kubenswrapper[4719]: I1009 15:38:37.647306 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbf9c16f-67c5-40df-8a0d-9993341b5f6e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bbf9c16f-67c5-40df-8a0d-9993341b5f6e\") " pod="openstack/nova-scheduler-0" Oct 09 15:38:37 crc kubenswrapper[4719]: I1009 15:38:37.647566 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbf9c16f-67c5-40df-8a0d-9993341b5f6e-config-data\") pod \"nova-scheduler-0\" (UID: \"bbf9c16f-67c5-40df-8a0d-9993341b5f6e\") " pod="openstack/nova-scheduler-0" Oct 09 15:38:37 crc kubenswrapper[4719]: I1009 15:38:37.647733 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw5vc\" (UniqueName: \"kubernetes.io/projected/bbf9c16f-67c5-40df-8a0d-9993341b5f6e-kube-api-access-pw5vc\") pod \"nova-scheduler-0\" (UID: \"bbf9c16f-67c5-40df-8a0d-9993341b5f6e\") " pod="openstack/nova-scheduler-0" Oct 09 15:38:37 crc kubenswrapper[4719]: I1009 15:38:37.749562 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw5vc\" (UniqueName: \"kubernetes.io/projected/bbf9c16f-67c5-40df-8a0d-9993341b5f6e-kube-api-access-pw5vc\") pod \"nova-scheduler-0\" (UID: \"bbf9c16f-67c5-40df-8a0d-9993341b5f6e\") " pod="openstack/nova-scheduler-0" Oct 09 
15:38:37 crc kubenswrapper[4719]: I1009 15:38:37.749723 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbf9c16f-67c5-40df-8a0d-9993341b5f6e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bbf9c16f-67c5-40df-8a0d-9993341b5f6e\") " pod="openstack/nova-scheduler-0" Oct 09 15:38:37 crc kubenswrapper[4719]: I1009 15:38:37.749817 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbf9c16f-67c5-40df-8a0d-9993341b5f6e-config-data\") pod \"nova-scheduler-0\" (UID: \"bbf9c16f-67c5-40df-8a0d-9993341b5f6e\") " pod="openstack/nova-scheduler-0" Oct 09 15:38:37 crc kubenswrapper[4719]: I1009 15:38:37.757097 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbf9c16f-67c5-40df-8a0d-9993341b5f6e-config-data\") pod \"nova-scheduler-0\" (UID: \"bbf9c16f-67c5-40df-8a0d-9993341b5f6e\") " pod="openstack/nova-scheduler-0" Oct 09 15:38:37 crc kubenswrapper[4719]: I1009 15:38:37.757187 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbf9c16f-67c5-40df-8a0d-9993341b5f6e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bbf9c16f-67c5-40df-8a0d-9993341b5f6e\") " pod="openstack/nova-scheduler-0" Oct 09 15:38:37 crc kubenswrapper[4719]: I1009 15:38:37.764520 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw5vc\" (UniqueName: \"kubernetes.io/projected/bbf9c16f-67c5-40df-8a0d-9993341b5f6e-kube-api-access-pw5vc\") pod \"nova-scheduler-0\" (UID: \"bbf9c16f-67c5-40df-8a0d-9993341b5f6e\") " pod="openstack/nova-scheduler-0" Oct 09 15:38:37 crc kubenswrapper[4719]: I1009 15:38:37.903430 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 15:38:38 crc kubenswrapper[4719]: I1009 15:38:38.245601 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1adfbd0c-70c5-41c6-b400-7ef501bcd042","Type":"ContainerStarted","Data":"bde976d8d7298c597ed2591752bb5c5b31dd4c52cf3165d99be7ce2c48d999a2"} Oct 09 15:38:38 crc kubenswrapper[4719]: I1009 15:38:38.245678 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1adfbd0c-70c5-41c6-b400-7ef501bcd042","Type":"ContainerStarted","Data":"ba527df7ae0358360bde7c2cf1f92ba95bb2669735179bd8458ac4af5a2aa4a9"} Oct 09 15:38:38 crc kubenswrapper[4719]: I1009 15:38:38.277976 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.2779595329999998 podStartE2EDuration="2.277959533s" podCreationTimestamp="2025-10-09 15:38:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:38:38.273239181 +0000 UTC m=+1223.782950476" watchObservedRunningTime="2025-10-09 15:38:38.277959533 +0000 UTC m=+1223.787670808" Oct 09 15:38:38 crc kubenswrapper[4719]: I1009 15:38:38.384445 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 15:38:38 crc kubenswrapper[4719]: W1009 15:38:38.404031 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbf9c16f_67c5_40df_8a0d_9993341b5f6e.slice/crio-281554f8ece5e15cbda19db9a7c3d02453a91453796e0bc88e259c5582e21e1f WatchSource:0}: Error finding container 281554f8ece5e15cbda19db9a7c3d02453a91453796e0bc88e259c5582e21e1f: Status 404 returned error can't find the container with id 281554f8ece5e15cbda19db9a7c3d02453a91453796e0bc88e259c5582e21e1f Oct 09 15:38:39 crc kubenswrapper[4719]: I1009 15:38:39.180413 4719 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="fee8c430-e8cc-47a8-8ec3-4594e0500a4f" path="/var/lib/kubelet/pods/fee8c430-e8cc-47a8-8ec3-4594e0500a4f/volumes" Oct 09 15:38:39 crc kubenswrapper[4719]: I1009 15:38:39.257309 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bbf9c16f-67c5-40df-8a0d-9993341b5f6e","Type":"ContainerStarted","Data":"4b2d1444fd94ba0dc2b092803cdf0e9f3380acaff7b82ceaf5076e613b95c91f"} Oct 09 15:38:39 crc kubenswrapper[4719]: I1009 15:38:39.257391 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bbf9c16f-67c5-40df-8a0d-9993341b5f6e","Type":"ContainerStarted","Data":"281554f8ece5e15cbda19db9a7c3d02453a91453796e0bc88e259c5582e21e1f"} Oct 09 15:38:39 crc kubenswrapper[4719]: I1009 15:38:39.279701 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.279684619 podStartE2EDuration="2.279684619s" podCreationTimestamp="2025-10-09 15:38:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:38:39.273800951 +0000 UTC m=+1224.783512246" watchObservedRunningTime="2025-10-09 15:38:39.279684619 +0000 UTC m=+1224.789395904" Oct 09 15:38:39 crc kubenswrapper[4719]: I1009 15:38:39.875558 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 09 15:38:39 crc kubenswrapper[4719]: I1009 15:38:39.875871 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 09 15:38:42 crc kubenswrapper[4719]: I1009 15:38:42.904316 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 09 15:38:44 crc kubenswrapper[4719]: I1009 15:38:44.616953 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 09 
15:38:44 crc kubenswrapper[4719]: I1009 15:38:44.876404 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 09 15:38:44 crc kubenswrapper[4719]: I1009 15:38:44.876453 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 09 15:38:45 crc kubenswrapper[4719]: I1009 15:38:45.891707 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="659a2cfd-faec-4ff3-9a23-7733a138493e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.218:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 09 15:38:45 crc kubenswrapper[4719]: I1009 15:38:45.891716 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="659a2cfd-faec-4ff3-9a23-7733a138493e" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.218:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 09 15:38:46 crc kubenswrapper[4719]: I1009 15:38:46.593127 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 09 15:38:46 crc kubenswrapper[4719]: I1009 15:38:46.593191 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 09 15:38:47 crc kubenswrapper[4719]: I1009 15:38:47.675624 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1adfbd0c-70c5-41c6-b400-7ef501bcd042" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.219:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 09 15:38:47 crc kubenswrapper[4719]: I1009 15:38:47.675616 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1adfbd0c-70c5-41c6-b400-7ef501bcd042" containerName="nova-api-api" 
probeResult="failure" output="Get \"http://10.217.0.219:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 09 15:38:47 crc kubenswrapper[4719]: I1009 15:38:47.904595 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 09 15:38:47 crc kubenswrapper[4719]: I1009 15:38:47.932043 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 09 15:38:48 crc kubenswrapper[4719]: I1009 15:38:48.063074 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 09 15:38:48 crc kubenswrapper[4719]: I1009 15:38:48.390816 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 09 15:38:55 crc kubenswrapper[4719]: I1009 15:38:55.035793 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 09 15:38:55 crc kubenswrapper[4719]: I1009 15:38:55.037689 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 09 15:38:55 crc kubenswrapper[4719]: I1009 15:38:55.042972 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 09 15:38:55 crc kubenswrapper[4719]: I1009 15:38:55.429855 4719 generic.go:334] "Generic (PLEG): container finished" podID="eb00a64b-377d-450c-81df-408b0790b21f" containerID="9f4263077898c99175651f39fa6fa5bf6e0de796ee696a530c40b8c9b7f37380" exitCode=137 Oct 09 15:38:55 crc kubenswrapper[4719]: I1009 15:38:55.429943 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"eb00a64b-377d-450c-81df-408b0790b21f","Type":"ContainerDied","Data":"9f4263077898c99175651f39fa6fa5bf6e0de796ee696a530c40b8c9b7f37380"} Oct 09 15:38:55 crc kubenswrapper[4719]: I1009 15:38:55.430851 4719 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"eb00a64b-377d-450c-81df-408b0790b21f","Type":"ContainerDied","Data":"da767d8723596c6f0975a6248ab155721be50db2f13ee34758a1f0399b909010"} Oct 09 15:38:55 crc kubenswrapper[4719]: I1009 15:38:55.430982 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da767d8723596c6f0975a6248ab155721be50db2f13ee34758a1f0399b909010" Oct 09 15:38:55 crc kubenswrapper[4719]: I1009 15:38:55.435267 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 09 15:38:55 crc kubenswrapper[4719]: I1009 15:38:55.499773 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 09 15:38:55 crc kubenswrapper[4719]: I1009 15:38:55.627497 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cf4lb\" (UniqueName: \"kubernetes.io/projected/eb00a64b-377d-450c-81df-408b0790b21f-kube-api-access-cf4lb\") pod \"eb00a64b-377d-450c-81df-408b0790b21f\" (UID: \"eb00a64b-377d-450c-81df-408b0790b21f\") " Oct 09 15:38:55 crc kubenswrapper[4719]: I1009 15:38:55.627853 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb00a64b-377d-450c-81df-408b0790b21f-combined-ca-bundle\") pod \"eb00a64b-377d-450c-81df-408b0790b21f\" (UID: \"eb00a64b-377d-450c-81df-408b0790b21f\") " Oct 09 15:38:55 crc kubenswrapper[4719]: I1009 15:38:55.627971 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb00a64b-377d-450c-81df-408b0790b21f-config-data\") pod \"eb00a64b-377d-450c-81df-408b0790b21f\" (UID: \"eb00a64b-377d-450c-81df-408b0790b21f\") " Oct 09 15:38:55 crc kubenswrapper[4719]: I1009 15:38:55.633204 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/eb00a64b-377d-450c-81df-408b0790b21f-kube-api-access-cf4lb" (OuterVolumeSpecName: "kube-api-access-cf4lb") pod "eb00a64b-377d-450c-81df-408b0790b21f" (UID: "eb00a64b-377d-450c-81df-408b0790b21f"). InnerVolumeSpecName "kube-api-access-cf4lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:38:55 crc kubenswrapper[4719]: I1009 15:38:55.657828 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb00a64b-377d-450c-81df-408b0790b21f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb00a64b-377d-450c-81df-408b0790b21f" (UID: "eb00a64b-377d-450c-81df-408b0790b21f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:38:55 crc kubenswrapper[4719]: I1009 15:38:55.660567 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb00a64b-377d-450c-81df-408b0790b21f-config-data" (OuterVolumeSpecName: "config-data") pod "eb00a64b-377d-450c-81df-408b0790b21f" (UID: "eb00a64b-377d-450c-81df-408b0790b21f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:38:55 crc kubenswrapper[4719]: I1009 15:38:55.730043 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb00a64b-377d-450c-81df-408b0790b21f-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:55 crc kubenswrapper[4719]: I1009 15:38:55.730078 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cf4lb\" (UniqueName: \"kubernetes.io/projected/eb00a64b-377d-450c-81df-408b0790b21f-kube-api-access-cf4lb\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:55 crc kubenswrapper[4719]: I1009 15:38:55.730091 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb00a64b-377d-450c-81df-408b0790b21f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:38:56 crc kubenswrapper[4719]: I1009 15:38:56.438902 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 09 15:38:56 crc kubenswrapper[4719]: I1009 15:38:56.487054 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 09 15:38:56 crc kubenswrapper[4719]: I1009 15:38:56.498713 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 09 15:38:56 crc kubenswrapper[4719]: I1009 15:38:56.507743 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 09 15:38:56 crc kubenswrapper[4719]: E1009 15:38:56.508299 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb00a64b-377d-450c-81df-408b0790b21f" containerName="nova-cell1-novncproxy-novncproxy" Oct 09 15:38:56 crc kubenswrapper[4719]: I1009 15:38:56.508325 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb00a64b-377d-450c-81df-408b0790b21f" containerName="nova-cell1-novncproxy-novncproxy" Oct 09 15:38:56 crc kubenswrapper[4719]: 
I1009 15:38:56.508592 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb00a64b-377d-450c-81df-408b0790b21f" containerName="nova-cell1-novncproxy-novncproxy"
Oct 09 15:38:56 crc kubenswrapper[4719]: I1009 15:38:56.509522 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 09 15:38:56 crc kubenswrapper[4719]: I1009 15:38:56.517523 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 09 15:38:56 crc kubenswrapper[4719]: I1009 15:38:56.520033 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Oct 09 15:38:56 crc kubenswrapper[4719]: I1009 15:38:56.520221 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Oct 09 15:38:56 crc kubenswrapper[4719]: I1009 15:38:56.523287 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Oct 09 15:38:56 crc kubenswrapper[4719]: I1009 15:38:56.600276 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Oct 09 15:38:56 crc kubenswrapper[4719]: I1009 15:38:56.600686 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Oct 09 15:38:56 crc kubenswrapper[4719]: I1009 15:38:56.608604 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Oct 09 15:38:56 crc kubenswrapper[4719]: I1009 15:38:56.608964 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Oct 09 15:38:56 crc kubenswrapper[4719]: I1009 15:38:56.648620 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ff2cdae-bf76-4452-9d8d-26560a89a2da-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0ff2cdae-bf76-4452-9d8d-26560a89a2da\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 09 15:38:56 crc kubenswrapper[4719]: I1009 15:38:56.649628 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ff2cdae-bf76-4452-9d8d-26560a89a2da-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0ff2cdae-bf76-4452-9d8d-26560a89a2da\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 09 15:38:56 crc kubenswrapper[4719]: I1009 15:38:56.649733 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ff2cdae-bf76-4452-9d8d-26560a89a2da-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0ff2cdae-bf76-4452-9d8d-26560a89a2da\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 09 15:38:56 crc kubenswrapper[4719]: I1009 15:38:56.649874 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ff2cdae-bf76-4452-9d8d-26560a89a2da-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0ff2cdae-bf76-4452-9d8d-26560a89a2da\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 09 15:38:56 crc kubenswrapper[4719]: I1009 15:38:56.650044 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tcp5\" (UniqueName: \"kubernetes.io/projected/0ff2cdae-bf76-4452-9d8d-26560a89a2da-kube-api-access-4tcp5\") pod \"nova-cell1-novncproxy-0\" (UID: \"0ff2cdae-bf76-4452-9d8d-26560a89a2da\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 09 15:38:56 crc kubenswrapper[4719]: I1009 15:38:56.757626 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tcp5\" (UniqueName: \"kubernetes.io/projected/0ff2cdae-bf76-4452-9d8d-26560a89a2da-kube-api-access-4tcp5\") pod \"nova-cell1-novncproxy-0\" (UID: \"0ff2cdae-bf76-4452-9d8d-26560a89a2da\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 09 15:38:56 crc kubenswrapper[4719]: I1009 15:38:56.757683 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ff2cdae-bf76-4452-9d8d-26560a89a2da-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0ff2cdae-bf76-4452-9d8d-26560a89a2da\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 09 15:38:56 crc kubenswrapper[4719]: I1009 15:38:56.757750 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ff2cdae-bf76-4452-9d8d-26560a89a2da-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0ff2cdae-bf76-4452-9d8d-26560a89a2da\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 09 15:38:56 crc kubenswrapper[4719]: I1009 15:38:56.757782 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ff2cdae-bf76-4452-9d8d-26560a89a2da-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0ff2cdae-bf76-4452-9d8d-26560a89a2da\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 09 15:38:56 crc kubenswrapper[4719]: I1009 15:38:56.757828 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ff2cdae-bf76-4452-9d8d-26560a89a2da-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0ff2cdae-bf76-4452-9d8d-26560a89a2da\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 09 15:38:56 crc kubenswrapper[4719]: I1009 15:38:56.768099 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ff2cdae-bf76-4452-9d8d-26560a89a2da-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0ff2cdae-bf76-4452-9d8d-26560a89a2da\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 09 15:38:56 crc kubenswrapper[4719]: I1009 15:38:56.768166 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ff2cdae-bf76-4452-9d8d-26560a89a2da-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0ff2cdae-bf76-4452-9d8d-26560a89a2da\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 09 15:38:56 crc kubenswrapper[4719]: I1009 15:38:56.768301 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ff2cdae-bf76-4452-9d8d-26560a89a2da-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0ff2cdae-bf76-4452-9d8d-26560a89a2da\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 09 15:38:56 crc kubenswrapper[4719]: I1009 15:38:56.775126 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ff2cdae-bf76-4452-9d8d-26560a89a2da-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0ff2cdae-bf76-4452-9d8d-26560a89a2da\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 09 15:38:56 crc kubenswrapper[4719]: I1009 15:38:56.777982 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tcp5\" (UniqueName: \"kubernetes.io/projected/0ff2cdae-bf76-4452-9d8d-26560a89a2da-kube-api-access-4tcp5\") pod \"nova-cell1-novncproxy-0\" (UID: \"0ff2cdae-bf76-4452-9d8d-26560a89a2da\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 09 15:38:56 crc kubenswrapper[4719]: I1009 15:38:56.838150 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 09 15:38:57 crc kubenswrapper[4719]: I1009 15:38:57.175042 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb00a64b-377d-450c-81df-408b0790b21f" path="/var/lib/kubelet/pods/eb00a64b-377d-450c-81df-408b0790b21f/volumes"
Oct 09 15:38:57 crc kubenswrapper[4719]: I1009 15:38:57.322888 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 09 15:38:57 crc kubenswrapper[4719]: W1009 15:38:57.329479 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ff2cdae_bf76_4452_9d8d_26560a89a2da.slice/crio-a073b68f53130a310cb5ac49904ce2f997c8fb623bdc24f1a11ee61450eda081 WatchSource:0}: Error finding container a073b68f53130a310cb5ac49904ce2f997c8fb623bdc24f1a11ee61450eda081: Status 404 returned error can't find the container with id a073b68f53130a310cb5ac49904ce2f997c8fb623bdc24f1a11ee61450eda081
Oct 09 15:38:57 crc kubenswrapper[4719]: I1009 15:38:57.474448 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0ff2cdae-bf76-4452-9d8d-26560a89a2da","Type":"ContainerStarted","Data":"a073b68f53130a310cb5ac49904ce2f997c8fb623bdc24f1a11ee61450eda081"}
Oct 09 15:38:57 crc kubenswrapper[4719]: I1009 15:38:57.474845 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Oct 09 15:38:57 crc kubenswrapper[4719]: I1009 15:38:57.484108 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Oct 09 15:38:57 crc kubenswrapper[4719]: I1009 15:38:57.679191 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66f8586597-6gv4h"]
Oct 09 15:38:57 crc kubenswrapper[4719]: I1009 15:38:57.682166 4719 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-66f8586597-6gv4h"
Oct 09 15:38:57 crc kubenswrapper[4719]: I1009 15:38:57.691881 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66f8586597-6gv4h"]
Oct 09 15:38:57 crc kubenswrapper[4719]: I1009 15:38:57.780089 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgb6m\" (UniqueName: \"kubernetes.io/projected/8c6a2f16-2424-4091-97d8-6e5dc05d37a6-kube-api-access-hgb6m\") pod \"dnsmasq-dns-66f8586597-6gv4h\" (UID: \"8c6a2f16-2424-4091-97d8-6e5dc05d37a6\") " pod="openstack/dnsmasq-dns-66f8586597-6gv4h"
Oct 09 15:38:57 crc kubenswrapper[4719]: I1009 15:38:57.780149 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8c6a2f16-2424-4091-97d8-6e5dc05d37a6-dns-swift-storage-0\") pod \"dnsmasq-dns-66f8586597-6gv4h\" (UID: \"8c6a2f16-2424-4091-97d8-6e5dc05d37a6\") " pod="openstack/dnsmasq-dns-66f8586597-6gv4h"
Oct 09 15:38:57 crc kubenswrapper[4719]: I1009 15:38:57.780171 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c6a2f16-2424-4091-97d8-6e5dc05d37a6-ovsdbserver-nb\") pod \"dnsmasq-dns-66f8586597-6gv4h\" (UID: \"8c6a2f16-2424-4091-97d8-6e5dc05d37a6\") " pod="openstack/dnsmasq-dns-66f8586597-6gv4h"
Oct 09 15:38:57 crc kubenswrapper[4719]: I1009 15:38:57.780401 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c6a2f16-2424-4091-97d8-6e5dc05d37a6-ovsdbserver-sb\") pod \"dnsmasq-dns-66f8586597-6gv4h\" (UID: \"8c6a2f16-2424-4091-97d8-6e5dc05d37a6\") " pod="openstack/dnsmasq-dns-66f8586597-6gv4h"
Oct 09 15:38:57 crc kubenswrapper[4719]: I1009 15:38:57.780455 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c6a2f16-2424-4091-97d8-6e5dc05d37a6-config\") pod \"dnsmasq-dns-66f8586597-6gv4h\" (UID: \"8c6a2f16-2424-4091-97d8-6e5dc05d37a6\") " pod="openstack/dnsmasq-dns-66f8586597-6gv4h"
Oct 09 15:38:57 crc kubenswrapper[4719]: I1009 15:38:57.780589 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c6a2f16-2424-4091-97d8-6e5dc05d37a6-dns-svc\") pod \"dnsmasq-dns-66f8586597-6gv4h\" (UID: \"8c6a2f16-2424-4091-97d8-6e5dc05d37a6\") " pod="openstack/dnsmasq-dns-66f8586597-6gv4h"
Oct 09 15:38:57 crc kubenswrapper[4719]: I1009 15:38:57.882819 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgb6m\" (UniqueName: \"kubernetes.io/projected/8c6a2f16-2424-4091-97d8-6e5dc05d37a6-kube-api-access-hgb6m\") pod \"dnsmasq-dns-66f8586597-6gv4h\" (UID: \"8c6a2f16-2424-4091-97d8-6e5dc05d37a6\") " pod="openstack/dnsmasq-dns-66f8586597-6gv4h"
Oct 09 15:38:57 crc kubenswrapper[4719]: I1009 15:38:57.882876 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8c6a2f16-2424-4091-97d8-6e5dc05d37a6-dns-swift-storage-0\") pod \"dnsmasq-dns-66f8586597-6gv4h\" (UID: \"8c6a2f16-2424-4091-97d8-6e5dc05d37a6\") " pod="openstack/dnsmasq-dns-66f8586597-6gv4h"
Oct 09 15:38:57 crc kubenswrapper[4719]: I1009 15:38:57.882899 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c6a2f16-2424-4091-97d8-6e5dc05d37a6-ovsdbserver-nb\") pod \"dnsmasq-dns-66f8586597-6gv4h\" (UID: \"8c6a2f16-2424-4091-97d8-6e5dc05d37a6\") " pod="openstack/dnsmasq-dns-66f8586597-6gv4h"
Oct 09 15:38:57 crc kubenswrapper[4719]: I1009 15:38:57.882961 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c6a2f16-2424-4091-97d8-6e5dc05d37a6-ovsdbserver-sb\") pod \"dnsmasq-dns-66f8586597-6gv4h\" (UID: \"8c6a2f16-2424-4091-97d8-6e5dc05d37a6\") " pod="openstack/dnsmasq-dns-66f8586597-6gv4h"
Oct 09 15:38:57 crc kubenswrapper[4719]: I1009 15:38:57.882986 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c6a2f16-2424-4091-97d8-6e5dc05d37a6-config\") pod \"dnsmasq-dns-66f8586597-6gv4h\" (UID: \"8c6a2f16-2424-4091-97d8-6e5dc05d37a6\") " pod="openstack/dnsmasq-dns-66f8586597-6gv4h"
Oct 09 15:38:57 crc kubenswrapper[4719]: I1009 15:38:57.883013 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c6a2f16-2424-4091-97d8-6e5dc05d37a6-dns-svc\") pod \"dnsmasq-dns-66f8586597-6gv4h\" (UID: \"8c6a2f16-2424-4091-97d8-6e5dc05d37a6\") " pod="openstack/dnsmasq-dns-66f8586597-6gv4h"
Oct 09 15:38:57 crc kubenswrapper[4719]: I1009 15:38:57.886070 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8c6a2f16-2424-4091-97d8-6e5dc05d37a6-dns-swift-storage-0\") pod \"dnsmasq-dns-66f8586597-6gv4h\" (UID: \"8c6a2f16-2424-4091-97d8-6e5dc05d37a6\") " pod="openstack/dnsmasq-dns-66f8586597-6gv4h"
Oct 09 15:38:57 crc kubenswrapper[4719]: I1009 15:38:57.886177 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c6a2f16-2424-4091-97d8-6e5dc05d37a6-dns-svc\") pod \"dnsmasq-dns-66f8586597-6gv4h\" (UID: \"8c6a2f16-2424-4091-97d8-6e5dc05d37a6\") " pod="openstack/dnsmasq-dns-66f8586597-6gv4h"
Oct 09 15:38:57 crc kubenswrapper[4719]: I1009 15:38:57.886225 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c6a2f16-2424-4091-97d8-6e5dc05d37a6-ovsdbserver-sb\") pod \"dnsmasq-dns-66f8586597-6gv4h\" (UID: \"8c6a2f16-2424-4091-97d8-6e5dc05d37a6\") " pod="openstack/dnsmasq-dns-66f8586597-6gv4h"
Oct 09 15:38:57 crc kubenswrapper[4719]: I1009 15:38:57.887417 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c6a2f16-2424-4091-97d8-6e5dc05d37a6-ovsdbserver-nb\") pod \"dnsmasq-dns-66f8586597-6gv4h\" (UID: \"8c6a2f16-2424-4091-97d8-6e5dc05d37a6\") " pod="openstack/dnsmasq-dns-66f8586597-6gv4h"
Oct 09 15:38:57 crc kubenswrapper[4719]: I1009 15:38:57.889826 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c6a2f16-2424-4091-97d8-6e5dc05d37a6-config\") pod \"dnsmasq-dns-66f8586597-6gv4h\" (UID: \"8c6a2f16-2424-4091-97d8-6e5dc05d37a6\") " pod="openstack/dnsmasq-dns-66f8586597-6gv4h"
Oct 09 15:38:57 crc kubenswrapper[4719]: I1009 15:38:57.902156 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgb6m\" (UniqueName: \"kubernetes.io/projected/8c6a2f16-2424-4091-97d8-6e5dc05d37a6-kube-api-access-hgb6m\") pod \"dnsmasq-dns-66f8586597-6gv4h\" (UID: \"8c6a2f16-2424-4091-97d8-6e5dc05d37a6\") " pod="openstack/dnsmasq-dns-66f8586597-6gv4h"
Oct 09 15:38:57 crc kubenswrapper[4719]: I1009 15:38:57.999834 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66f8586597-6gv4h"
Oct 09 15:38:58 crc kubenswrapper[4719]: I1009 15:38:58.464320 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66f8586597-6gv4h"]
Oct 09 15:38:58 crc kubenswrapper[4719]: I1009 15:38:58.496784 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66f8586597-6gv4h" event={"ID":"8c6a2f16-2424-4091-97d8-6e5dc05d37a6","Type":"ContainerStarted","Data":"468395e7beedf62a40cc721d19e5bfcb5a004418e47c7883874297cdb840e7fb"}
Oct 09 15:38:58 crc kubenswrapper[4719]: I1009 15:38:58.500027 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0ff2cdae-bf76-4452-9d8d-26560a89a2da","Type":"ContainerStarted","Data":"742c846d380a139c2d52ef2c2ce58d08a030c8d1624b07af9f8b1442449feaf5"}
Oct 09 15:38:59 crc kubenswrapper[4719]: I1009 15:38:59.512411 4719 generic.go:334] "Generic (PLEG): container finished" podID="8c6a2f16-2424-4091-97d8-6e5dc05d37a6" containerID="b63a8cff8817e9d33c85cb502d6125eef9a53a513adf6c4918b89de7ddfb4296" exitCode=0
Oct 09 15:38:59 crc kubenswrapper[4719]: I1009 15:38:59.514319 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66f8586597-6gv4h" event={"ID":"8c6a2f16-2424-4091-97d8-6e5dc05d37a6","Type":"ContainerDied","Data":"b63a8cff8817e9d33c85cb502d6125eef9a53a513adf6c4918b89de7ddfb4296"}
Oct 09 15:38:59 crc kubenswrapper[4719]: I1009 15:38:59.549579 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.549561007 podStartE2EDuration="3.549561007s" podCreationTimestamp="2025-10-09 15:38:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:38:58.542960025 +0000 UTC m=+1244.052671310" watchObservedRunningTime="2025-10-09 15:38:59.549561007 +0000 UTC m=+1245.059272292"
Oct 09
15:39:00 crc kubenswrapper[4719]: I1009 15:39:00.105288 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 09 15:39:00 crc kubenswrapper[4719]: I1009 15:39:00.336237 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 09 15:39:00 crc kubenswrapper[4719]: I1009 15:39:00.336843 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="43123f9f-e7f7-4d3a-8856-18d324f55254" containerName="ceilometer-central-agent" containerID="cri-o://1a77d63c6cc6dce02e2f31d05cf6a08df242847bdb12cb57cb976fc66a6b931b" gracePeriod=30
Oct 09 15:39:00 crc kubenswrapper[4719]: I1009 15:39:00.336918 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="43123f9f-e7f7-4d3a-8856-18d324f55254" containerName="proxy-httpd" containerID="cri-o://e74d2203f8861cc51352ee49abbd21fbe35685c4741856ce50c7ade9ba91c969" gracePeriod=30
Oct 09 15:39:00 crc kubenswrapper[4719]: I1009 15:39:00.336962 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="43123f9f-e7f7-4d3a-8856-18d324f55254" containerName="ceilometer-notification-agent" containerID="cri-o://ccd3ef7533995ff84e77db8707b05438e1cd30d5f79a8ec98eeeda690097c829" gracePeriod=30
Oct 09 15:39:00 crc kubenswrapper[4719]: I1009 15:39:00.336953 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="43123f9f-e7f7-4d3a-8856-18d324f55254" containerName="sg-core" containerID="cri-o://71eccb3d761539feb14aee5d5a1a2bb43c9087261b73c09d77e0970f1eab8b52" gracePeriod=30
Oct 09 15:39:00 crc kubenswrapper[4719]: I1009 15:39:00.539372 4719 generic.go:334] "Generic (PLEG): container finished" podID="43123f9f-e7f7-4d3a-8856-18d324f55254" containerID="e74d2203f8861cc51352ee49abbd21fbe35685c4741856ce50c7ade9ba91c969" exitCode=0
Oct 09 15:39:00 crc kubenswrapper[4719]: I1009 15:39:00.539403 4719 generic.go:334] "Generic (PLEG): container finished" podID="43123f9f-e7f7-4d3a-8856-18d324f55254" containerID="71eccb3d761539feb14aee5d5a1a2bb43c9087261b73c09d77e0970f1eab8b52" exitCode=2
Oct 09 15:39:00 crc kubenswrapper[4719]: I1009 15:39:00.539447 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43123f9f-e7f7-4d3a-8856-18d324f55254","Type":"ContainerDied","Data":"e74d2203f8861cc51352ee49abbd21fbe35685c4741856ce50c7ade9ba91c969"}
Oct 09 15:39:00 crc kubenswrapper[4719]: I1009 15:39:00.539473 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43123f9f-e7f7-4d3a-8856-18d324f55254","Type":"ContainerDied","Data":"71eccb3d761539feb14aee5d5a1a2bb43c9087261b73c09d77e0970f1eab8b52"}
Oct 09 15:39:00 crc kubenswrapper[4719]: I1009 15:39:00.543572 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1adfbd0c-70c5-41c6-b400-7ef501bcd042" containerName="nova-api-log" containerID="cri-o://ba527df7ae0358360bde7c2cf1f92ba95bb2669735179bd8458ac4af5a2aa4a9" gracePeriod=30
Oct 09 15:39:00 crc kubenswrapper[4719]: I1009 15:39:00.543678 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66f8586597-6gv4h" event={"ID":"8c6a2f16-2424-4091-97d8-6e5dc05d37a6","Type":"ContainerStarted","Data":"2ac08dbed9e2e6db28246046df1340751706e82529cc7966a834efd45ee905ef"}
Oct 09 15:39:00 crc kubenswrapper[4719]: I1009 15:39:00.543767 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1adfbd0c-70c5-41c6-b400-7ef501bcd042" containerName="nova-api-api" containerID="cri-o://bde976d8d7298c597ed2591752bb5c5b31dd4c52cf3165d99be7ce2c48d999a2" gracePeriod=30
Oct 09 15:39:00 crc kubenswrapper[4719]: I1009 15:39:00.544718 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-66f8586597-6gv4h"
Oct 09 15:39:01 crc kubenswrapper[4719]: I1009 15:39:01.555268 4719 generic.go:334] "Generic (PLEG): container finished" podID="1adfbd0c-70c5-41c6-b400-7ef501bcd042" containerID="ba527df7ae0358360bde7c2cf1f92ba95bb2669735179bd8458ac4af5a2aa4a9" exitCode=143
Oct 09 15:39:01 crc kubenswrapper[4719]: I1009 15:39:01.555390 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1adfbd0c-70c5-41c6-b400-7ef501bcd042","Type":"ContainerDied","Data":"ba527df7ae0358360bde7c2cf1f92ba95bb2669735179bd8458ac4af5a2aa4a9"}
Oct 09 15:39:01 crc kubenswrapper[4719]: I1009 15:39:01.558557 4719 generic.go:334] "Generic (PLEG): container finished" podID="43123f9f-e7f7-4d3a-8856-18d324f55254" containerID="1a77d63c6cc6dce02e2f31d05cf6a08df242847bdb12cb57cb976fc66a6b931b" exitCode=0
Oct 09 15:39:01 crc kubenswrapper[4719]: I1009 15:39:01.558942 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43123f9f-e7f7-4d3a-8856-18d324f55254","Type":"ContainerDied","Data":"1a77d63c6cc6dce02e2f31d05cf6a08df242847bdb12cb57cb976fc66a6b931b"}
Oct 09 15:39:01 crc kubenswrapper[4719]: I1009 15:39:01.839002 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Oct 09 15:39:02 crc kubenswrapper[4719]: I1009 15:39:02.569410 4719 generic.go:334] "Generic (PLEG): container finished" podID="1adfbd0c-70c5-41c6-b400-7ef501bcd042" containerID="bde976d8d7298c597ed2591752bb5c5b31dd4c52cf3165d99be7ce2c48d999a2" exitCode=0
Oct 09 15:39:02 crc kubenswrapper[4719]: I1009 15:39:02.569602 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1adfbd0c-70c5-41c6-b400-7ef501bcd042","Type":"ContainerDied","Data":"bde976d8d7298c597ed2591752bb5c5b31dd4c52cf3165d99be7ce2c48d999a2"}
Oct 09 15:39:02 crc kubenswrapper[4719]: I1009 15:39:02.569781 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1adfbd0c-70c5-41c6-b400-7ef501bcd042","Type":"ContainerDied","Data":"36716262a04acac1623801da02f3cc6354851bc1916bee3452c39c51a5ef8c60"}
Oct 09 15:39:02 crc kubenswrapper[4719]: I1009 15:39:02.569801 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36716262a04acac1623801da02f3cc6354851bc1916bee3452c39c51a5ef8c60"
Oct 09 15:39:02 crc kubenswrapper[4719]: I1009 15:39:02.592705 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 09 15:39:02 crc kubenswrapper[4719]: I1009 15:39:02.615774 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-66f8586597-6gv4h" podStartSLOduration=5.615753826 podStartE2EDuration="5.615753826s" podCreationTimestamp="2025-10-09 15:38:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:39:00.566998626 +0000 UTC m=+1246.076709911" watchObservedRunningTime="2025-10-09 15:39:02.615753826 +0000 UTC m=+1248.125465111"
Oct 09 15:39:02 crc kubenswrapper[4719]: I1009 15:39:02.682754 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1adfbd0c-70c5-41c6-b400-7ef501bcd042-logs\") pod \"1adfbd0c-70c5-41c6-b400-7ef501bcd042\" (UID: \"1adfbd0c-70c5-41c6-b400-7ef501bcd042\") "
Oct 09 15:39:02 crc kubenswrapper[4719]: I1009 15:39:02.682847 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1adfbd0c-70c5-41c6-b400-7ef501bcd042-combined-ca-bundle\") pod \"1adfbd0c-70c5-41c6-b400-7ef501bcd042\" (UID: \"1adfbd0c-70c5-41c6-b400-7ef501bcd042\") "
Oct 09 15:39:02 crc kubenswrapper[4719]: I1009 15:39:02.682900 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w66cf\" (UniqueName: \"kubernetes.io/projected/1adfbd0c-70c5-41c6-b400-7ef501bcd042-kube-api-access-w66cf\") pod \"1adfbd0c-70c5-41c6-b400-7ef501bcd042\" (UID: \"1adfbd0c-70c5-41c6-b400-7ef501bcd042\") "
Oct 09 15:39:02 crc kubenswrapper[4719]: I1009 15:39:02.682921 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1adfbd0c-70c5-41c6-b400-7ef501bcd042-config-data\") pod \"1adfbd0c-70c5-41c6-b400-7ef501bcd042\" (UID: \"1adfbd0c-70c5-41c6-b400-7ef501bcd042\") "
Oct 09 15:39:02 crc kubenswrapper[4719]: I1009 15:39:02.683335 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1adfbd0c-70c5-41c6-b400-7ef501bcd042-logs" (OuterVolumeSpecName: "logs") pod "1adfbd0c-70c5-41c6-b400-7ef501bcd042" (UID: "1adfbd0c-70c5-41c6-b400-7ef501bcd042"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 09 15:39:02 crc kubenswrapper[4719]: I1009 15:39:02.683728 4719 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1adfbd0c-70c5-41c6-b400-7ef501bcd042-logs\") on node \"crc\" DevicePath \"\""
Oct 09 15:39:02 crc kubenswrapper[4719]: I1009 15:39:02.690601 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1adfbd0c-70c5-41c6-b400-7ef501bcd042-kube-api-access-w66cf" (OuterVolumeSpecName: "kube-api-access-w66cf") pod "1adfbd0c-70c5-41c6-b400-7ef501bcd042" (UID: "1adfbd0c-70c5-41c6-b400-7ef501bcd042"). InnerVolumeSpecName "kube-api-access-w66cf".
PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 15:39:02 crc kubenswrapper[4719]: I1009 15:39:02.719791 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1adfbd0c-70c5-41c6-b400-7ef501bcd042-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1adfbd0c-70c5-41c6-b400-7ef501bcd042" (UID: "1adfbd0c-70c5-41c6-b400-7ef501bcd042"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 15:39:02 crc kubenswrapper[4719]: I1009 15:39:02.723249 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1adfbd0c-70c5-41c6-b400-7ef501bcd042-config-data" (OuterVolumeSpecName: "config-data") pod "1adfbd0c-70c5-41c6-b400-7ef501bcd042" (UID: "1adfbd0c-70c5-41c6-b400-7ef501bcd042"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 15:39:02 crc kubenswrapper[4719]: I1009 15:39:02.785478 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1adfbd0c-70c5-41c6-b400-7ef501bcd042-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 09 15:39:02 crc kubenswrapper[4719]: I1009 15:39:02.785536 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w66cf\" (UniqueName: \"kubernetes.io/projected/1adfbd0c-70c5-41c6-b400-7ef501bcd042-kube-api-access-w66cf\") on node \"crc\" DevicePath \"\""
Oct 09 15:39:02 crc kubenswrapper[4719]: I1009 15:39:02.785551 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1adfbd0c-70c5-41c6-b400-7ef501bcd042-config-data\") on node \"crc\" DevicePath \"\""
Oct 09 15:39:03 crc kubenswrapper[4719]: I1009 15:39:03.586382 4719 generic.go:334] "Generic (PLEG): container finished" podID="43123f9f-e7f7-4d3a-8856-18d324f55254" containerID="ccd3ef7533995ff84e77db8707b05438e1cd30d5f79a8ec98eeeda690097c829" exitCode=0
Oct 09 15:39:03 crc kubenswrapper[4719]: I1009 15:39:03.586814 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 09 15:39:03 crc kubenswrapper[4719]: I1009 15:39:03.588126 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43123f9f-e7f7-4d3a-8856-18d324f55254","Type":"ContainerDied","Data":"ccd3ef7533995ff84e77db8707b05438e1cd30d5f79a8ec98eeeda690097c829"}
Oct 09 15:39:03 crc kubenswrapper[4719]: I1009 15:39:03.616473 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 09 15:39:03 crc kubenswrapper[4719]: I1009 15:39:03.633642 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Oct 09 15:39:03 crc kubenswrapper[4719]: I1009 15:39:03.646481 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Oct 09 15:39:03 crc kubenswrapper[4719]: E1009 15:39:03.646956 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1adfbd0c-70c5-41c6-b400-7ef501bcd042" containerName="nova-api-log"
Oct 09 15:39:03 crc kubenswrapper[4719]: I1009 15:39:03.646973 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="1adfbd0c-70c5-41c6-b400-7ef501bcd042" containerName="nova-api-log"
Oct 09 15:39:03 crc kubenswrapper[4719]: E1009 15:39:03.646997 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1adfbd0c-70c5-41c6-b400-7ef501bcd042" containerName="nova-api-api"
Oct 09 15:39:03 crc kubenswrapper[4719]: I1009 15:39:03.647004 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="1adfbd0c-70c5-41c6-b400-7ef501bcd042" containerName="nova-api-api"
Oct 09 15:39:03 crc kubenswrapper[4719]: I1009 15:39:03.647196 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="1adfbd0c-70c5-41c6-b400-7ef501bcd042" containerName="nova-api-log"
Oct 09 15:39:03 crc kubenswrapper[4719]: I1009 15:39:03.647219 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="1adfbd0c-70c5-41c6-b400-7ef501bcd042" containerName="nova-api-api"
Oct 09 15:39:03 crc kubenswrapper[4719]: I1009 15:39:03.648632 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 09 15:39:03 crc kubenswrapper[4719]: I1009 15:39:03.656393 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Oct 09 15:39:03 crc kubenswrapper[4719]: I1009 15:39:03.657192 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Oct 09 15:39:03 crc kubenswrapper[4719]: I1009 15:39:03.657491 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Oct 09 15:39:03 crc kubenswrapper[4719]: I1009 15:39:03.658133 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 09 15:39:03 crc kubenswrapper[4719]: I1009 15:39:03.700733 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49188dc1-6a08-454e-8a32-a2d252d091b3-config-data\") pod \"nova-api-0\" (UID: \"49188dc1-6a08-454e-8a32-a2d252d091b3\") " pod="openstack/nova-api-0"
Oct 09 15:39:03 crc kubenswrapper[4719]: I1009 15:39:03.700851 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/49188dc1-6a08-454e-8a32-a2d252d091b3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"49188dc1-6a08-454e-8a32-a2d252d091b3\") " pod="openstack/nova-api-0"
Oct 09 15:39:03 crc kubenswrapper[4719]: I1009 15:39:03.700888 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49188dc1-6a08-454e-8a32-a2d252d091b3-public-tls-certs\") pod \"nova-api-0\" (UID: \"49188dc1-6a08-454e-8a32-a2d252d091b3\") " pod="openstack/nova-api-0"
Oct 09 15:39:03 crc kubenswrapper[4719]: I1009 15:39:03.700911 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49188dc1-6a08-454e-8a32-a2d252d091b3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"49188dc1-6a08-454e-8a32-a2d252d091b3\") " pod="openstack/nova-api-0"
Oct 09 15:39:03 crc kubenswrapper[4719]: I1009 15:39:03.700938 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvjg8\" (UniqueName: \"kubernetes.io/projected/49188dc1-6a08-454e-8a32-a2d252d091b3-kube-api-access-gvjg8\") pod \"nova-api-0\" (UID: \"49188dc1-6a08-454e-8a32-a2d252d091b3\") " pod="openstack/nova-api-0"
Oct 09 15:39:03 crc kubenswrapper[4719]: I1009 15:39:03.701223 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49188dc1-6a08-454e-8a32-a2d252d091b3-logs\") pod \"nova-api-0\" (UID: \"49188dc1-6a08-454e-8a32-a2d252d091b3\") " pod="openstack/nova-api-0"
Oct 09 15:39:03 crc kubenswrapper[4719]: I1009 15:39:03.803136 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49188dc1-6a08-454e-8a32-a2d252d091b3-config-data\") pod \"nova-api-0\" (UID: \"49188dc1-6a08-454e-8a32-a2d252d091b3\") " pod="openstack/nova-api-0"
Oct 09 15:39:03 crc kubenswrapper[4719]: I1009 15:39:03.803271 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/49188dc1-6a08-454e-8a32-a2d252d091b3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"49188dc1-6a08-454e-8a32-a2d252d091b3\") " pod="openstack/nova-api-0"
Oct 09 15:39:03 crc kubenswrapper[4719]: I1009 15:39:03.803295 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49188dc1-6a08-454e-8a32-a2d252d091b3-public-tls-certs\") pod \"nova-api-0\" (UID: \"49188dc1-6a08-454e-8a32-a2d252d091b3\") " pod="openstack/nova-api-0"
Oct 09 15:39:03 crc kubenswrapper[4719]: I1009 15:39:03.803338 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49188dc1-6a08-454e-8a32-a2d252d091b3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"49188dc1-6a08-454e-8a32-a2d252d091b3\") " pod="openstack/nova-api-0"
Oct 09 15:39:03 crc kubenswrapper[4719]: I1009 15:39:03.803397 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvjg8\" (UniqueName: \"kubernetes.io/projected/49188dc1-6a08-454e-8a32-a2d252d091b3-kube-api-access-gvjg8\") pod \"nova-api-0\" (UID: \"49188dc1-6a08-454e-8a32-a2d252d091b3\") " pod="openstack/nova-api-0"
Oct 09 15:39:03 crc kubenswrapper[4719]: I1009 15:39:03.808674 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49188dc1-6a08-454e-8a32-a2d252d091b3-logs\") pod \"nova-api-0\" (UID: \"49188dc1-6a08-454e-8a32-a2d252d091b3\") " pod="openstack/nova-api-0"
Oct 09 15:39:03 crc kubenswrapper[4719]: I1009 15:39:03.809149 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49188dc1-6a08-454e-8a32-a2d252d091b3-logs\") pod \"nova-api-0\" (UID: \"49188dc1-6a08-454e-8a32-a2d252d091b3\") " pod="openstack/nova-api-0"
Oct 09 15:39:03 crc kubenswrapper[4719]: I1009 15:39:03.809184 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/49188dc1-6a08-454e-8a32-a2d252d091b3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"49188dc1-6a08-454e-8a32-a2d252d091b3\") " pod="openstack/nova-api-0"
Oct 09
15:39:03 crc kubenswrapper[4719]: I1009 15:39:03.809776 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49188dc1-6a08-454e-8a32-a2d252d091b3-public-tls-certs\") pod \"nova-api-0\" (UID: \"49188dc1-6a08-454e-8a32-a2d252d091b3\") " pod="openstack/nova-api-0" Oct 09 15:39:03 crc kubenswrapper[4719]: I1009 15:39:03.817129 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49188dc1-6a08-454e-8a32-a2d252d091b3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"49188dc1-6a08-454e-8a32-a2d252d091b3\") " pod="openstack/nova-api-0" Oct 09 15:39:03 crc kubenswrapper[4719]: I1009 15:39:03.817895 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49188dc1-6a08-454e-8a32-a2d252d091b3-config-data\") pod \"nova-api-0\" (UID: \"49188dc1-6a08-454e-8a32-a2d252d091b3\") " pod="openstack/nova-api-0" Oct 09 15:39:03 crc kubenswrapper[4719]: I1009 15:39:03.829840 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvjg8\" (UniqueName: \"kubernetes.io/projected/49188dc1-6a08-454e-8a32-a2d252d091b3-kube-api-access-gvjg8\") pod \"nova-api-0\" (UID: \"49188dc1-6a08-454e-8a32-a2d252d091b3\") " pod="openstack/nova-api-0" Oct 09 15:39:03 crc kubenswrapper[4719]: I1009 15:39:03.964677 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.066362 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.113381 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/43123f9f-e7f7-4d3a-8856-18d324f55254-sg-core-conf-yaml\") pod \"43123f9f-e7f7-4d3a-8856-18d324f55254\" (UID: \"43123f9f-e7f7-4d3a-8856-18d324f55254\") " Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.113437 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjj8t\" (UniqueName: \"kubernetes.io/projected/43123f9f-e7f7-4d3a-8856-18d324f55254-kube-api-access-kjj8t\") pod \"43123f9f-e7f7-4d3a-8856-18d324f55254\" (UID: \"43123f9f-e7f7-4d3a-8856-18d324f55254\") " Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.113474 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43123f9f-e7f7-4d3a-8856-18d324f55254-combined-ca-bundle\") pod \"43123f9f-e7f7-4d3a-8856-18d324f55254\" (UID: \"43123f9f-e7f7-4d3a-8856-18d324f55254\") " Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.113582 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43123f9f-e7f7-4d3a-8856-18d324f55254-config-data\") pod \"43123f9f-e7f7-4d3a-8856-18d324f55254\" (UID: \"43123f9f-e7f7-4d3a-8856-18d324f55254\") " Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.113610 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43123f9f-e7f7-4d3a-8856-18d324f55254-scripts\") pod \"43123f9f-e7f7-4d3a-8856-18d324f55254\" (UID: \"43123f9f-e7f7-4d3a-8856-18d324f55254\") " Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.113646 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/43123f9f-e7f7-4d3a-8856-18d324f55254-run-httpd\") pod \"43123f9f-e7f7-4d3a-8856-18d324f55254\" (UID: \"43123f9f-e7f7-4d3a-8856-18d324f55254\") " Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.113757 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43123f9f-e7f7-4d3a-8856-18d324f55254-log-httpd\") pod \"43123f9f-e7f7-4d3a-8856-18d324f55254\" (UID: \"43123f9f-e7f7-4d3a-8856-18d324f55254\") " Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.113784 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/43123f9f-e7f7-4d3a-8856-18d324f55254-ceilometer-tls-certs\") pod \"43123f9f-e7f7-4d3a-8856-18d324f55254\" (UID: \"43123f9f-e7f7-4d3a-8856-18d324f55254\") " Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.115708 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43123f9f-e7f7-4d3a-8856-18d324f55254-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "43123f9f-e7f7-4d3a-8856-18d324f55254" (UID: "43123f9f-e7f7-4d3a-8856-18d324f55254"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.115839 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43123f9f-e7f7-4d3a-8856-18d324f55254-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "43123f9f-e7f7-4d3a-8856-18d324f55254" (UID: "43123f9f-e7f7-4d3a-8856-18d324f55254"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.126536 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43123f9f-e7f7-4d3a-8856-18d324f55254-scripts" (OuterVolumeSpecName: "scripts") pod "43123f9f-e7f7-4d3a-8856-18d324f55254" (UID: "43123f9f-e7f7-4d3a-8856-18d324f55254"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.177685 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43123f9f-e7f7-4d3a-8856-18d324f55254-kube-api-access-kjj8t" (OuterVolumeSpecName: "kube-api-access-kjj8t") pod "43123f9f-e7f7-4d3a-8856-18d324f55254" (UID: "43123f9f-e7f7-4d3a-8856-18d324f55254"). InnerVolumeSpecName "kube-api-access-kjj8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.223156 4719 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43123f9f-e7f7-4d3a-8856-18d324f55254-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.223190 4719 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43123f9f-e7f7-4d3a-8856-18d324f55254-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.223204 4719 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43123f9f-e7f7-4d3a-8856-18d324f55254-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.223214 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjj8t\" (UniqueName: \"kubernetes.io/projected/43123f9f-e7f7-4d3a-8856-18d324f55254-kube-api-access-kjj8t\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:04 
crc kubenswrapper[4719]: I1009 15:39:04.245884 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43123f9f-e7f7-4d3a-8856-18d324f55254-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "43123f9f-e7f7-4d3a-8856-18d324f55254" (UID: "43123f9f-e7f7-4d3a-8856-18d324f55254"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.284058 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43123f9f-e7f7-4d3a-8856-18d324f55254-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "43123f9f-e7f7-4d3a-8856-18d324f55254" (UID: "43123f9f-e7f7-4d3a-8856-18d324f55254"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.284761 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43123f9f-e7f7-4d3a-8856-18d324f55254-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43123f9f-e7f7-4d3a-8856-18d324f55254" (UID: "43123f9f-e7f7-4d3a-8856-18d324f55254"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.324736 4719 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/43123f9f-e7f7-4d3a-8856-18d324f55254-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.324762 4719 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/43123f9f-e7f7-4d3a-8856-18d324f55254-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.324771 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43123f9f-e7f7-4d3a-8856-18d324f55254-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.336753 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43123f9f-e7f7-4d3a-8856-18d324f55254-config-data" (OuterVolumeSpecName: "config-data") pod "43123f9f-e7f7-4d3a-8856-18d324f55254" (UID: "43123f9f-e7f7-4d3a-8856-18d324f55254"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.426973 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43123f9f-e7f7-4d3a-8856-18d324f55254-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.604833 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43123f9f-e7f7-4d3a-8856-18d324f55254","Type":"ContainerDied","Data":"13683196bbc089128ffba8f7c3a522318bfacba03a462aa20a60501f2af95675"} Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.604881 4719 scope.go:117] "RemoveContainer" containerID="e74d2203f8861cc51352ee49abbd21fbe35685c4741856ce50c7ade9ba91c969" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.605018 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 15:39:04 crc kubenswrapper[4719]: W1009 15:39:04.631585 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49188dc1_6a08_454e_8a32_a2d252d091b3.slice/crio-296a66a9466482feb48f8f45722dfda99793fc873bdb93bacb2df204900fea5f WatchSource:0}: Error finding container 296a66a9466482feb48f8f45722dfda99793fc873bdb93bacb2df204900fea5f: Status 404 returned error can't find the container with id 296a66a9466482feb48f8f45722dfda99793fc873bdb93bacb2df204900fea5f Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.638859 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.644778 4719 scope.go:117] "RemoveContainer" containerID="71eccb3d761539feb14aee5d5a1a2bb43c9087261b73c09d77e0970f1eab8b52" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.703321 4719 scope.go:117] "RemoveContainer" 
containerID="ccd3ef7533995ff84e77db8707b05438e1cd30d5f79a8ec98eeeda690097c829" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.704748 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.721147 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.733460 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 09 15:39:04 crc kubenswrapper[4719]: E1009 15:39:04.733871 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43123f9f-e7f7-4d3a-8856-18d324f55254" containerName="sg-core" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.733887 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="43123f9f-e7f7-4d3a-8856-18d324f55254" containerName="sg-core" Oct 09 15:39:04 crc kubenswrapper[4719]: E1009 15:39:04.733907 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43123f9f-e7f7-4d3a-8856-18d324f55254" containerName="ceilometer-central-agent" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.733913 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="43123f9f-e7f7-4d3a-8856-18d324f55254" containerName="ceilometer-central-agent" Oct 09 15:39:04 crc kubenswrapper[4719]: E1009 15:39:04.733938 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43123f9f-e7f7-4d3a-8856-18d324f55254" containerName="proxy-httpd" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.733944 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="43123f9f-e7f7-4d3a-8856-18d324f55254" containerName="proxy-httpd" Oct 09 15:39:04 crc kubenswrapper[4719]: E1009 15:39:04.733954 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43123f9f-e7f7-4d3a-8856-18d324f55254" containerName="ceilometer-notification-agent" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.733960 4719 
state_mem.go:107] "Deleted CPUSet assignment" podUID="43123f9f-e7f7-4d3a-8856-18d324f55254" containerName="ceilometer-notification-agent" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.734151 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="43123f9f-e7f7-4d3a-8856-18d324f55254" containerName="proxy-httpd" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.734164 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="43123f9f-e7f7-4d3a-8856-18d324f55254" containerName="ceilometer-central-agent" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.734174 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="43123f9f-e7f7-4d3a-8856-18d324f55254" containerName="ceilometer-notification-agent" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.734190 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="43123f9f-e7f7-4d3a-8856-18d324f55254" containerName="sg-core" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.735985 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.738793 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.738793 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.738949 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.741192 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.801431 4719 scope.go:117] "RemoveContainer" containerID="1a77d63c6cc6dce02e2f31d05cf6a08df242847bdb12cb57cb976fc66a6b931b" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.836020 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2a64def-d060-46e7-8792-835bb734a809-config-data\") pod \"ceilometer-0\" (UID: \"c2a64def-d060-46e7-8792-835bb734a809\") " pod="openstack/ceilometer-0" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.836070 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2a64def-d060-46e7-8792-835bb734a809-scripts\") pod \"ceilometer-0\" (UID: \"c2a64def-d060-46e7-8792-835bb734a809\") " pod="openstack/ceilometer-0" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.836093 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2a64def-d060-46e7-8792-835bb734a809-run-httpd\") pod \"ceilometer-0\" (UID: \"c2a64def-d060-46e7-8792-835bb734a809\") " pod="openstack/ceilometer-0" Oct 09 15:39:04 crc 
kubenswrapper[4719]: I1009 15:39:04.836109 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2a64def-d060-46e7-8792-835bb734a809-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c2a64def-d060-46e7-8792-835bb734a809\") " pod="openstack/ceilometer-0" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.836247 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c2a64def-d060-46e7-8792-835bb734a809-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c2a64def-d060-46e7-8792-835bb734a809\") " pod="openstack/ceilometer-0" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.836316 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c8j7\" (UniqueName: \"kubernetes.io/projected/c2a64def-d060-46e7-8792-835bb734a809-kube-api-access-9c8j7\") pod \"ceilometer-0\" (UID: \"c2a64def-d060-46e7-8792-835bb734a809\") " pod="openstack/ceilometer-0" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.836431 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2a64def-d060-46e7-8792-835bb734a809-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c2a64def-d060-46e7-8792-835bb734a809\") " pod="openstack/ceilometer-0" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.836530 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2a64def-d060-46e7-8792-835bb734a809-log-httpd\") pod \"ceilometer-0\" (UID: \"c2a64def-d060-46e7-8792-835bb734a809\") " pod="openstack/ceilometer-0" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.939029 4719 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2a64def-d060-46e7-8792-835bb734a809-config-data\") pod \"ceilometer-0\" (UID: \"c2a64def-d060-46e7-8792-835bb734a809\") " pod="openstack/ceilometer-0" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.939082 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2a64def-d060-46e7-8792-835bb734a809-scripts\") pod \"ceilometer-0\" (UID: \"c2a64def-d060-46e7-8792-835bb734a809\") " pod="openstack/ceilometer-0" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.939103 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2a64def-d060-46e7-8792-835bb734a809-run-httpd\") pod \"ceilometer-0\" (UID: \"c2a64def-d060-46e7-8792-835bb734a809\") " pod="openstack/ceilometer-0" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.939117 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2a64def-d060-46e7-8792-835bb734a809-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c2a64def-d060-46e7-8792-835bb734a809\") " pod="openstack/ceilometer-0" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.939141 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c2a64def-d060-46e7-8792-835bb734a809-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c2a64def-d060-46e7-8792-835bb734a809\") " pod="openstack/ceilometer-0" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.939168 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9c8j7\" (UniqueName: \"kubernetes.io/projected/c2a64def-d060-46e7-8792-835bb734a809-kube-api-access-9c8j7\") pod \"ceilometer-0\" (UID: 
\"c2a64def-d060-46e7-8792-835bb734a809\") " pod="openstack/ceilometer-0" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.939200 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2a64def-d060-46e7-8792-835bb734a809-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c2a64def-d060-46e7-8792-835bb734a809\") " pod="openstack/ceilometer-0" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.939239 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2a64def-d060-46e7-8792-835bb734a809-log-httpd\") pod \"ceilometer-0\" (UID: \"c2a64def-d060-46e7-8792-835bb734a809\") " pod="openstack/ceilometer-0" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.939984 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2a64def-d060-46e7-8792-835bb734a809-log-httpd\") pod \"ceilometer-0\" (UID: \"c2a64def-d060-46e7-8792-835bb734a809\") " pod="openstack/ceilometer-0" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.940488 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c2a64def-d060-46e7-8792-835bb734a809-run-httpd\") pod \"ceilometer-0\" (UID: \"c2a64def-d060-46e7-8792-835bb734a809\") " pod="openstack/ceilometer-0" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.946129 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c2a64def-d060-46e7-8792-835bb734a809-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c2a64def-d060-46e7-8792-835bb734a809\") " pod="openstack/ceilometer-0" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.946235 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c2a64def-d060-46e7-8792-835bb734a809-scripts\") pod \"ceilometer-0\" (UID: \"c2a64def-d060-46e7-8792-835bb734a809\") " pod="openstack/ceilometer-0" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.946239 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2a64def-d060-46e7-8792-835bb734a809-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c2a64def-d060-46e7-8792-835bb734a809\") " pod="openstack/ceilometer-0" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.947046 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2a64def-d060-46e7-8792-835bb734a809-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c2a64def-d060-46e7-8792-835bb734a809\") " pod="openstack/ceilometer-0" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.947821 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2a64def-d060-46e7-8792-835bb734a809-config-data\") pod \"ceilometer-0\" (UID: \"c2a64def-d060-46e7-8792-835bb734a809\") " pod="openstack/ceilometer-0" Oct 09 15:39:04 crc kubenswrapper[4719]: I1009 15:39:04.956988 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c8j7\" (UniqueName: \"kubernetes.io/projected/c2a64def-d060-46e7-8792-835bb734a809-kube-api-access-9c8j7\") pod \"ceilometer-0\" (UID: \"c2a64def-d060-46e7-8792-835bb734a809\") " pod="openstack/ceilometer-0" Oct 09 15:39:05 crc kubenswrapper[4719]: I1009 15:39:05.119475 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 15:39:05 crc kubenswrapper[4719]: I1009 15:39:05.177876 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1adfbd0c-70c5-41c6-b400-7ef501bcd042" path="/var/lib/kubelet/pods/1adfbd0c-70c5-41c6-b400-7ef501bcd042/volumes" Oct 09 15:39:05 crc kubenswrapper[4719]: I1009 15:39:05.178520 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43123f9f-e7f7-4d3a-8856-18d324f55254" path="/var/lib/kubelet/pods/43123f9f-e7f7-4d3a-8856-18d324f55254/volumes" Oct 09 15:39:06 crc kubenswrapper[4719]: I1009 15:39:05.579177 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 15:39:06 crc kubenswrapper[4719]: W1009 15:39:05.588101 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2a64def_d060_46e7_8792_835bb734a809.slice/crio-717266a1c4d94afcdf42019fcc6b124db45167877cd61b70788b4ef8d609a64c WatchSource:0}: Error finding container 717266a1c4d94afcdf42019fcc6b124db45167877cd61b70788b4ef8d609a64c: Status 404 returned error can't find the container with id 717266a1c4d94afcdf42019fcc6b124db45167877cd61b70788b4ef8d609a64c Oct 09 15:39:06 crc kubenswrapper[4719]: I1009 15:39:05.618317 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"49188dc1-6a08-454e-8a32-a2d252d091b3","Type":"ContainerStarted","Data":"c4d683f6292ad40ff086bbfa4aa6994834f0291f82e2412f0d1d6e375b8d4db7"} Oct 09 15:39:06 crc kubenswrapper[4719]: I1009 15:39:05.618368 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"49188dc1-6a08-454e-8a32-a2d252d091b3","Type":"ContainerStarted","Data":"526e227ba2c04dab1e155386c4f4ca41809d5d65e90839701d141bf9042dd6b4"} Oct 09 15:39:06 crc kubenswrapper[4719]: I1009 15:39:05.618382 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"49188dc1-6a08-454e-8a32-a2d252d091b3","Type":"ContainerStarted","Data":"296a66a9466482feb48f8f45722dfda99793fc873bdb93bacb2df204900fea5f"} Oct 09 15:39:06 crc kubenswrapper[4719]: I1009 15:39:05.619330 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c2a64def-d060-46e7-8792-835bb734a809","Type":"ContainerStarted","Data":"717266a1c4d94afcdf42019fcc6b124db45167877cd61b70788b4ef8d609a64c"} Oct 09 15:39:06 crc kubenswrapper[4719]: I1009 15:39:05.645838 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.64581524 podStartE2EDuration="2.64581524s" podCreationTimestamp="2025-10-09 15:39:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:39:05.63610178 +0000 UTC m=+1251.145813075" watchObservedRunningTime="2025-10-09 15:39:05.64581524 +0000 UTC m=+1251.155526525" Oct 09 15:39:06 crc kubenswrapper[4719]: I1009 15:39:06.641337 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c2a64def-d060-46e7-8792-835bb734a809","Type":"ContainerStarted","Data":"efbeb8d4aa4b14e0084dce3f93a29432c6651c25f3ed009d24330c3d25d21b78"} Oct 09 15:39:06 crc kubenswrapper[4719]: I1009 15:39:06.642133 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c2a64def-d060-46e7-8792-835bb734a809","Type":"ContainerStarted","Data":"1c00dc01002de6c3160ef4823fd4ec4c6433b71435245bc40b94b6e096331789"} Oct 09 15:39:06 crc kubenswrapper[4719]: I1009 15:39:06.839096 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 09 15:39:06 crc kubenswrapper[4719]: I1009 15:39:06.857277 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 09 15:39:06 crc kubenswrapper[4719]: I1009 
15:39:06.978960 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 15:39:06 crc kubenswrapper[4719]: I1009 15:39:06.979449 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 15:39:06 crc kubenswrapper[4719]: I1009 15:39:06.979591 4719 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" Oct 09 15:39:06 crc kubenswrapper[4719]: I1009 15:39:06.980295 4719 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3f9a20c39c2315beb69542aebd5bd73add66f7a319edb9051e38f9e594c365d9"} pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 15:39:06 crc kubenswrapper[4719]: I1009 15:39:06.980449 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" containerID="cri-o://3f9a20c39c2315beb69542aebd5bd73add66f7a319edb9051e38f9e594c365d9" gracePeriod=600 Oct 09 15:39:07 crc kubenswrapper[4719]: I1009 15:39:07.650322 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"c2a64def-d060-46e7-8792-835bb734a809","Type":"ContainerStarted","Data":"f404af0c2d6d06841fcacbed9d4e0cff9a070d076c35046268387c1d3bde5d3c"} Oct 09 15:39:07 crc kubenswrapper[4719]: I1009 15:39:07.653509 4719 generic.go:334] "Generic (PLEG): container finished" podID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerID="3f9a20c39c2315beb69542aebd5bd73add66f7a319edb9051e38f9e594c365d9" exitCode=0 Oct 09 15:39:07 crc kubenswrapper[4719]: I1009 15:39:07.654745 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerDied","Data":"3f9a20c39c2315beb69542aebd5bd73add66f7a319edb9051e38f9e594c365d9"} Oct 09 15:39:07 crc kubenswrapper[4719]: I1009 15:39:07.654803 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerStarted","Data":"a69453ebf4e1aaf18164eaf7feb2c37cbe0797331962fcb2782850eab34c0940"} Oct 09 15:39:07 crc kubenswrapper[4719]: I1009 15:39:07.654828 4719 scope.go:117] "RemoveContainer" containerID="b3d6b70762c7cbd23b776b68a38ad2cc0ec2c06605dcc7efb9d87a0020d07dde" Oct 09 15:39:07 crc kubenswrapper[4719]: I1009 15:39:07.678723 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 09 15:39:07 crc kubenswrapper[4719]: I1009 15:39:07.923798 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-6wvgx"] Oct 09 15:39:07 crc kubenswrapper[4719]: I1009 15:39:07.925854 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6wvgx" Oct 09 15:39:07 crc kubenswrapper[4719]: I1009 15:39:07.929227 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 09 15:39:07 crc kubenswrapper[4719]: I1009 15:39:07.929298 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 09 15:39:07 crc kubenswrapper[4719]: I1009 15:39:07.959265 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-6wvgx"] Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:07.999972 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d393af38-b13f-4f11-bff8-c10fd25fe75d-config-data\") pod \"nova-cell1-cell-mapping-6wvgx\" (UID: \"d393af38-b13f-4f11-bff8-c10fd25fe75d\") " pod="openstack/nova-cell1-cell-mapping-6wvgx" Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.000022 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p298c\" (UniqueName: \"kubernetes.io/projected/d393af38-b13f-4f11-bff8-c10fd25fe75d-kube-api-access-p298c\") pod \"nova-cell1-cell-mapping-6wvgx\" (UID: \"d393af38-b13f-4f11-bff8-c10fd25fe75d\") " pod="openstack/nova-cell1-cell-mapping-6wvgx" Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.000059 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d393af38-b13f-4f11-bff8-c10fd25fe75d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6wvgx\" (UID: \"d393af38-b13f-4f11-bff8-c10fd25fe75d\") " pod="openstack/nova-cell1-cell-mapping-6wvgx" Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.000416 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/d393af38-b13f-4f11-bff8-c10fd25fe75d-scripts\") pod \"nova-cell1-cell-mapping-6wvgx\" (UID: \"d393af38-b13f-4f11-bff8-c10fd25fe75d\") " pod="openstack/nova-cell1-cell-mapping-6wvgx" Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.002527 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-66f8586597-6gv4h" Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.081880 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58b454788c-qkz7q"] Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.082162 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58b454788c-qkz7q" podUID="aa17763b-47cf-4305-a42e-4f43fa08e189" containerName="dnsmasq-dns" containerID="cri-o://71045742e21a3be979a2a83e281e587df446cbee999bbdead78cb9ad7112f296" gracePeriod=10 Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.103640 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d393af38-b13f-4f11-bff8-c10fd25fe75d-scripts\") pod \"nova-cell1-cell-mapping-6wvgx\" (UID: \"d393af38-b13f-4f11-bff8-c10fd25fe75d\") " pod="openstack/nova-cell1-cell-mapping-6wvgx" Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.104047 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d393af38-b13f-4f11-bff8-c10fd25fe75d-config-data\") pod \"nova-cell1-cell-mapping-6wvgx\" (UID: \"d393af38-b13f-4f11-bff8-c10fd25fe75d\") " pod="openstack/nova-cell1-cell-mapping-6wvgx" Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.104066 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p298c\" (UniqueName: \"kubernetes.io/projected/d393af38-b13f-4f11-bff8-c10fd25fe75d-kube-api-access-p298c\") pod \"nova-cell1-cell-mapping-6wvgx\" (UID: 
\"d393af38-b13f-4f11-bff8-c10fd25fe75d\") " pod="openstack/nova-cell1-cell-mapping-6wvgx" Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.104105 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d393af38-b13f-4f11-bff8-c10fd25fe75d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6wvgx\" (UID: \"d393af38-b13f-4f11-bff8-c10fd25fe75d\") " pod="openstack/nova-cell1-cell-mapping-6wvgx" Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.111319 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d393af38-b13f-4f11-bff8-c10fd25fe75d-scripts\") pod \"nova-cell1-cell-mapping-6wvgx\" (UID: \"d393af38-b13f-4f11-bff8-c10fd25fe75d\") " pod="openstack/nova-cell1-cell-mapping-6wvgx" Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.112913 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d393af38-b13f-4f11-bff8-c10fd25fe75d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6wvgx\" (UID: \"d393af38-b13f-4f11-bff8-c10fd25fe75d\") " pod="openstack/nova-cell1-cell-mapping-6wvgx" Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.113033 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d393af38-b13f-4f11-bff8-c10fd25fe75d-config-data\") pod \"nova-cell1-cell-mapping-6wvgx\" (UID: \"d393af38-b13f-4f11-bff8-c10fd25fe75d\") " pod="openstack/nova-cell1-cell-mapping-6wvgx" Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.124047 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p298c\" (UniqueName: \"kubernetes.io/projected/d393af38-b13f-4f11-bff8-c10fd25fe75d-kube-api-access-p298c\") pod \"nova-cell1-cell-mapping-6wvgx\" (UID: \"d393af38-b13f-4f11-bff8-c10fd25fe75d\") " 
pod="openstack/nova-cell1-cell-mapping-6wvgx" Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.256251 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6wvgx" Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.657012 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58b454788c-qkz7q" Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.678286 4719 generic.go:334] "Generic (PLEG): container finished" podID="aa17763b-47cf-4305-a42e-4f43fa08e189" containerID="71045742e21a3be979a2a83e281e587df446cbee999bbdead78cb9ad7112f296" exitCode=0 Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.678647 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58b454788c-qkz7q" Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.678647 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b454788c-qkz7q" event={"ID":"aa17763b-47cf-4305-a42e-4f43fa08e189","Type":"ContainerDied","Data":"71045742e21a3be979a2a83e281e587df446cbee999bbdead78cb9ad7112f296"} Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.678767 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b454788c-qkz7q" event={"ID":"aa17763b-47cf-4305-a42e-4f43fa08e189","Type":"ContainerDied","Data":"435dc4f0c0b7855343d5257551ce797d9fae620a87a79f9c0709ece61a6b4abf"} Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.678790 4719 scope.go:117] "RemoveContainer" containerID="71045742e21a3be979a2a83e281e587df446cbee999bbdead78cb9ad7112f296" Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.718689 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa17763b-47cf-4305-a42e-4f43fa08e189-ovsdbserver-nb\") pod \"aa17763b-47cf-4305-a42e-4f43fa08e189\" (UID: 
\"aa17763b-47cf-4305-a42e-4f43fa08e189\") " Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.718821 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa17763b-47cf-4305-a42e-4f43fa08e189-ovsdbserver-sb\") pod \"aa17763b-47cf-4305-a42e-4f43fa08e189\" (UID: \"aa17763b-47cf-4305-a42e-4f43fa08e189\") " Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.718865 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa17763b-47cf-4305-a42e-4f43fa08e189-dns-swift-storage-0\") pod \"aa17763b-47cf-4305-a42e-4f43fa08e189\" (UID: \"aa17763b-47cf-4305-a42e-4f43fa08e189\") " Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.718891 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa17763b-47cf-4305-a42e-4f43fa08e189-config\") pod \"aa17763b-47cf-4305-a42e-4f43fa08e189\" (UID: \"aa17763b-47cf-4305-a42e-4f43fa08e189\") " Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.718953 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkkmc\" (UniqueName: \"kubernetes.io/projected/aa17763b-47cf-4305-a42e-4f43fa08e189-kube-api-access-pkkmc\") pod \"aa17763b-47cf-4305-a42e-4f43fa08e189\" (UID: \"aa17763b-47cf-4305-a42e-4f43fa08e189\") " Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.719109 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa17763b-47cf-4305-a42e-4f43fa08e189-dns-svc\") pod \"aa17763b-47cf-4305-a42e-4f43fa08e189\" (UID: \"aa17763b-47cf-4305-a42e-4f43fa08e189\") " Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.726477 4719 scope.go:117] "RemoveContainer" containerID="4f30eeb640cf2e01c51d40c18b2e0cae635f32af4dedfa28679ca4200968a0b1" Oct 09 
15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.726692 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa17763b-47cf-4305-a42e-4f43fa08e189-kube-api-access-pkkmc" (OuterVolumeSpecName: "kube-api-access-pkkmc") pod "aa17763b-47cf-4305-a42e-4f43fa08e189" (UID: "aa17763b-47cf-4305-a42e-4f43fa08e189"). InnerVolumeSpecName "kube-api-access-pkkmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.766526 4719 scope.go:117] "RemoveContainer" containerID="71045742e21a3be979a2a83e281e587df446cbee999bbdead78cb9ad7112f296" Oct 09 15:39:08 crc kubenswrapper[4719]: E1009 15:39:08.766947 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71045742e21a3be979a2a83e281e587df446cbee999bbdead78cb9ad7112f296\": container with ID starting with 71045742e21a3be979a2a83e281e587df446cbee999bbdead78cb9ad7112f296 not found: ID does not exist" containerID="71045742e21a3be979a2a83e281e587df446cbee999bbdead78cb9ad7112f296" Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.766985 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71045742e21a3be979a2a83e281e587df446cbee999bbdead78cb9ad7112f296"} err="failed to get container status \"71045742e21a3be979a2a83e281e587df446cbee999bbdead78cb9ad7112f296\": rpc error: code = NotFound desc = could not find container \"71045742e21a3be979a2a83e281e587df446cbee999bbdead78cb9ad7112f296\": container with ID starting with 71045742e21a3be979a2a83e281e587df446cbee999bbdead78cb9ad7112f296 not found: ID does not exist" Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.767040 4719 scope.go:117] "RemoveContainer" containerID="4f30eeb640cf2e01c51d40c18b2e0cae635f32af4dedfa28679ca4200968a0b1" Oct 09 15:39:08 crc kubenswrapper[4719]: E1009 15:39:08.767631 4719 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"4f30eeb640cf2e01c51d40c18b2e0cae635f32af4dedfa28679ca4200968a0b1\": container with ID starting with 4f30eeb640cf2e01c51d40c18b2e0cae635f32af4dedfa28679ca4200968a0b1 not found: ID does not exist" containerID="4f30eeb640cf2e01c51d40c18b2e0cae635f32af4dedfa28679ca4200968a0b1" Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.767672 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f30eeb640cf2e01c51d40c18b2e0cae635f32af4dedfa28679ca4200968a0b1"} err="failed to get container status \"4f30eeb640cf2e01c51d40c18b2e0cae635f32af4dedfa28679ca4200968a0b1\": rpc error: code = NotFound desc = could not find container \"4f30eeb640cf2e01c51d40c18b2e0cae635f32af4dedfa28679ca4200968a0b1\": container with ID starting with 4f30eeb640cf2e01c51d40c18b2e0cae635f32af4dedfa28679ca4200968a0b1 not found: ID does not exist" Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.822215 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkkmc\" (UniqueName: \"kubernetes.io/projected/aa17763b-47cf-4305-a42e-4f43fa08e189-kube-api-access-pkkmc\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.855615 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-6wvgx"] Oct 09 15:39:08 crc kubenswrapper[4719]: W1009 15:39:08.858969 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd393af38_b13f_4f11_bff8_c10fd25fe75d.slice/crio-21cf6786aca5dca2a8c20b162f38b58ee0c90ef16bc504c24c8715b8fe7e6394 WatchSource:0}: Error finding container 21cf6786aca5dca2a8c20b162f38b58ee0c90ef16bc504c24c8715b8fe7e6394: Status 404 returned error can't find the container with id 21cf6786aca5dca2a8c20b162f38b58ee0c90ef16bc504c24c8715b8fe7e6394 Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.894827 4719 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa17763b-47cf-4305-a42e-4f43fa08e189-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aa17763b-47cf-4305-a42e-4f43fa08e189" (UID: "aa17763b-47cf-4305-a42e-4f43fa08e189"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.901416 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa17763b-47cf-4305-a42e-4f43fa08e189-config" (OuterVolumeSpecName: "config") pod "aa17763b-47cf-4305-a42e-4f43fa08e189" (UID: "aa17763b-47cf-4305-a42e-4f43fa08e189"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.905543 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa17763b-47cf-4305-a42e-4f43fa08e189-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "aa17763b-47cf-4305-a42e-4f43fa08e189" (UID: "aa17763b-47cf-4305-a42e-4f43fa08e189"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.913789 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa17763b-47cf-4305-a42e-4f43fa08e189-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aa17763b-47cf-4305-a42e-4f43fa08e189" (UID: "aa17763b-47cf-4305-a42e-4f43fa08e189"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.918280 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa17763b-47cf-4305-a42e-4f43fa08e189-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aa17763b-47cf-4305-a42e-4f43fa08e189" (UID: "aa17763b-47cf-4305-a42e-4f43fa08e189"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.923916 4719 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa17763b-47cf-4305-a42e-4f43fa08e189-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.923938 4719 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa17763b-47cf-4305-a42e-4f43fa08e189-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.923950 4719 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa17763b-47cf-4305-a42e-4f43fa08e189-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.923962 4719 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa17763b-47cf-4305-a42e-4f43fa08e189-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:08 crc kubenswrapper[4719]: I1009 15:39:08.923970 4719 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa17763b-47cf-4305-a42e-4f43fa08e189-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:09 crc kubenswrapper[4719]: I1009 15:39:09.025794 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58b454788c-qkz7q"] Oct 09 15:39:09 
crc kubenswrapper[4719]: I1009 15:39:09.038159 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58b454788c-qkz7q"] Oct 09 15:39:09 crc kubenswrapper[4719]: I1009 15:39:09.179515 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa17763b-47cf-4305-a42e-4f43fa08e189" path="/var/lib/kubelet/pods/aa17763b-47cf-4305-a42e-4f43fa08e189/volumes" Oct 09 15:39:09 crc kubenswrapper[4719]: I1009 15:39:09.692962 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6wvgx" event={"ID":"d393af38-b13f-4f11-bff8-c10fd25fe75d","Type":"ContainerStarted","Data":"e5560106bcb3aed30021975f8311fe1f4b36ab807bfa40286e680a4cef3b8000"} Oct 09 15:39:09 crc kubenswrapper[4719]: I1009 15:39:09.693269 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6wvgx" event={"ID":"d393af38-b13f-4f11-bff8-c10fd25fe75d","Type":"ContainerStarted","Data":"21cf6786aca5dca2a8c20b162f38b58ee0c90ef16bc504c24c8715b8fe7e6394"} Oct 09 15:39:09 crc kubenswrapper[4719]: I1009 15:39:09.708120 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c2a64def-d060-46e7-8792-835bb734a809","Type":"ContainerStarted","Data":"ca5af0d425498763b57fa4dd31b51aa437ff563ee6fe404e4bf236741b82492b"} Oct 09 15:39:09 crc kubenswrapper[4719]: I1009 15:39:09.708277 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 09 15:39:09 crc kubenswrapper[4719]: I1009 15:39:09.711270 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-6wvgx" podStartSLOduration=2.711253027 podStartE2EDuration="2.711253027s" podCreationTimestamp="2025-10-09 15:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:39:09.709520161 +0000 UTC m=+1255.219231456" 
watchObservedRunningTime="2025-10-09 15:39:09.711253027 +0000 UTC m=+1255.220964332" Oct 09 15:39:09 crc kubenswrapper[4719]: I1009 15:39:09.733115 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.785503284 podStartE2EDuration="5.733096914s" podCreationTimestamp="2025-10-09 15:39:04 +0000 UTC" firstStartedPulling="2025-10-09 15:39:05.590309538 +0000 UTC m=+1251.100020823" lastFinishedPulling="2025-10-09 15:39:08.537903168 +0000 UTC m=+1254.047614453" observedRunningTime="2025-10-09 15:39:09.729655154 +0000 UTC m=+1255.239366449" watchObservedRunningTime="2025-10-09 15:39:09.733096914 +0000 UTC m=+1255.242808199" Oct 09 15:39:13 crc kubenswrapper[4719]: I1009 15:39:13.599753 4719 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-58b454788c-qkz7q" podUID="aa17763b-47cf-4305-a42e-4f43fa08e189" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.214:5353: i/o timeout" Oct 09 15:39:13 crc kubenswrapper[4719]: I1009 15:39:13.965601 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 09 15:39:13 crc kubenswrapper[4719]: I1009 15:39:13.966045 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 09 15:39:14 crc kubenswrapper[4719]: I1009 15:39:14.752423 4719 generic.go:334] "Generic (PLEG): container finished" podID="d393af38-b13f-4f11-bff8-c10fd25fe75d" containerID="e5560106bcb3aed30021975f8311fe1f4b36ab807bfa40286e680a4cef3b8000" exitCode=0 Oct 09 15:39:14 crc kubenswrapper[4719]: I1009 15:39:14.752509 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6wvgx" event={"ID":"d393af38-b13f-4f11-bff8-c10fd25fe75d","Type":"ContainerDied","Data":"e5560106bcb3aed30021975f8311fe1f4b36ab807bfa40286e680a4cef3b8000"} Oct 09 15:39:14 crc kubenswrapper[4719]: I1009 15:39:14.979475 4719 prober.go:107] "Probe 
failed" probeType="Startup" pod="openstack/nova-api-0" podUID="49188dc1-6a08-454e-8a32-a2d252d091b3" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.223:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 09 15:39:14 crc kubenswrapper[4719]: I1009 15:39:14.979475 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="49188dc1-6a08-454e-8a32-a2d252d091b3" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.223:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 09 15:39:16 crc kubenswrapper[4719]: I1009 15:39:16.143954 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6wvgx" Oct 09 15:39:16 crc kubenswrapper[4719]: I1009 15:39:16.272033 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p298c\" (UniqueName: \"kubernetes.io/projected/d393af38-b13f-4f11-bff8-c10fd25fe75d-kube-api-access-p298c\") pod \"d393af38-b13f-4f11-bff8-c10fd25fe75d\" (UID: \"d393af38-b13f-4f11-bff8-c10fd25fe75d\") " Oct 09 15:39:16 crc kubenswrapper[4719]: I1009 15:39:16.272091 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d393af38-b13f-4f11-bff8-c10fd25fe75d-combined-ca-bundle\") pod \"d393af38-b13f-4f11-bff8-c10fd25fe75d\" (UID: \"d393af38-b13f-4f11-bff8-c10fd25fe75d\") " Oct 09 15:39:16 crc kubenswrapper[4719]: I1009 15:39:16.272138 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d393af38-b13f-4f11-bff8-c10fd25fe75d-scripts\") pod \"d393af38-b13f-4f11-bff8-c10fd25fe75d\" (UID: \"d393af38-b13f-4f11-bff8-c10fd25fe75d\") " Oct 09 15:39:16 crc kubenswrapper[4719]: I1009 15:39:16.272887 4719 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d393af38-b13f-4f11-bff8-c10fd25fe75d-config-data\") pod \"d393af38-b13f-4f11-bff8-c10fd25fe75d\" (UID: \"d393af38-b13f-4f11-bff8-c10fd25fe75d\") " Oct 09 15:39:16 crc kubenswrapper[4719]: I1009 15:39:16.277611 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d393af38-b13f-4f11-bff8-c10fd25fe75d-scripts" (OuterVolumeSpecName: "scripts") pod "d393af38-b13f-4f11-bff8-c10fd25fe75d" (UID: "d393af38-b13f-4f11-bff8-c10fd25fe75d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:39:16 crc kubenswrapper[4719]: I1009 15:39:16.282965 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d393af38-b13f-4f11-bff8-c10fd25fe75d-kube-api-access-p298c" (OuterVolumeSpecName: "kube-api-access-p298c") pod "d393af38-b13f-4f11-bff8-c10fd25fe75d" (UID: "d393af38-b13f-4f11-bff8-c10fd25fe75d"). InnerVolumeSpecName "kube-api-access-p298c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:39:16 crc kubenswrapper[4719]: I1009 15:39:16.302459 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d393af38-b13f-4f11-bff8-c10fd25fe75d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d393af38-b13f-4f11-bff8-c10fd25fe75d" (UID: "d393af38-b13f-4f11-bff8-c10fd25fe75d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:39:16 crc kubenswrapper[4719]: I1009 15:39:16.302904 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d393af38-b13f-4f11-bff8-c10fd25fe75d-config-data" (OuterVolumeSpecName: "config-data") pod "d393af38-b13f-4f11-bff8-c10fd25fe75d" (UID: "d393af38-b13f-4f11-bff8-c10fd25fe75d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:39:16 crc kubenswrapper[4719]: I1009 15:39:16.375143 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p298c\" (UniqueName: \"kubernetes.io/projected/d393af38-b13f-4f11-bff8-c10fd25fe75d-kube-api-access-p298c\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:16 crc kubenswrapper[4719]: I1009 15:39:16.375179 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d393af38-b13f-4f11-bff8-c10fd25fe75d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:16 crc kubenswrapper[4719]: I1009 15:39:16.375189 4719 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d393af38-b13f-4f11-bff8-c10fd25fe75d-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:16 crc kubenswrapper[4719]: I1009 15:39:16.375199 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d393af38-b13f-4f11-bff8-c10fd25fe75d-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:16 crc kubenswrapper[4719]: I1009 15:39:16.773036 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6wvgx" event={"ID":"d393af38-b13f-4f11-bff8-c10fd25fe75d","Type":"ContainerDied","Data":"21cf6786aca5dca2a8c20b162f38b58ee0c90ef16bc504c24c8715b8fe7e6394"} Oct 09 15:39:16 crc kubenswrapper[4719]: I1009 15:39:16.773692 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21cf6786aca5dca2a8c20b162f38b58ee0c90ef16bc504c24c8715b8fe7e6394" Oct 09 15:39:16 crc kubenswrapper[4719]: I1009 15:39:16.773840 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6wvgx" Oct 09 15:39:16 crc kubenswrapper[4719]: I1009 15:39:16.948958 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 15:39:16 crc kubenswrapper[4719]: I1009 15:39:16.949472 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="bbf9c16f-67c5-40df-8a0d-9993341b5f6e" containerName="nova-scheduler-scheduler" containerID="cri-o://4b2d1444fd94ba0dc2b092803cdf0e9f3380acaff7b82ceaf5076e613b95c91f" gracePeriod=30 Oct 09 15:39:16 crc kubenswrapper[4719]: I1009 15:39:16.963834 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 09 15:39:16 crc kubenswrapper[4719]: I1009 15:39:16.964126 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="49188dc1-6a08-454e-8a32-a2d252d091b3" containerName="nova-api-log" containerID="cri-o://526e227ba2c04dab1e155386c4f4ca41809d5d65e90839701d141bf9042dd6b4" gracePeriod=30 Oct 09 15:39:16 crc kubenswrapper[4719]: I1009 15:39:16.964283 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="49188dc1-6a08-454e-8a32-a2d252d091b3" containerName="nova-api-api" containerID="cri-o://c4d683f6292ad40ff086bbfa4aa6994834f0291f82e2412f0d1d6e375b8d4db7" gracePeriod=30 Oct 09 15:39:16 crc kubenswrapper[4719]: I1009 15:39:16.980755 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 15:39:16 crc kubenswrapper[4719]: I1009 15:39:16.981026 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="659a2cfd-faec-4ff3-9a23-7733a138493e" containerName="nova-metadata-log" containerID="cri-o://966f250211675f8690fa96c8f29689409f69e49fceaf7fabbbb79a060e17a791" gracePeriod=30 Oct 09 15:39:16 crc kubenswrapper[4719]: I1009 15:39:16.981251 4719 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="659a2cfd-faec-4ff3-9a23-7733a138493e" containerName="nova-metadata-metadata" containerID="cri-o://9ca0d5390efd60e078c476f987e29cd72f157ed67e6f8a5cc2f5aaa89d9cb60b" gracePeriod=30 Oct 09 15:39:17 crc kubenswrapper[4719]: I1009 15:39:17.798220 4719 generic.go:334] "Generic (PLEG): container finished" podID="659a2cfd-faec-4ff3-9a23-7733a138493e" containerID="966f250211675f8690fa96c8f29689409f69e49fceaf7fabbbb79a060e17a791" exitCode=143 Oct 09 15:39:17 crc kubenswrapper[4719]: I1009 15:39:17.798385 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"659a2cfd-faec-4ff3-9a23-7733a138493e","Type":"ContainerDied","Data":"966f250211675f8690fa96c8f29689409f69e49fceaf7fabbbb79a060e17a791"} Oct 09 15:39:17 crc kubenswrapper[4719]: I1009 15:39:17.802997 4719 generic.go:334] "Generic (PLEG): container finished" podID="49188dc1-6a08-454e-8a32-a2d252d091b3" containerID="526e227ba2c04dab1e155386c4f4ca41809d5d65e90839701d141bf9042dd6b4" exitCode=143 Oct 09 15:39:17 crc kubenswrapper[4719]: I1009 15:39:17.803055 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"49188dc1-6a08-454e-8a32-a2d252d091b3","Type":"ContainerDied","Data":"526e227ba2c04dab1e155386c4f4ca41809d5d65e90839701d141bf9042dd6b4"} Oct 09 15:39:17 crc kubenswrapper[4719]: E1009 15:39:17.905633 4719 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4b2d1444fd94ba0dc2b092803cdf0e9f3380acaff7b82ceaf5076e613b95c91f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 09 15:39:17 crc kubenswrapper[4719]: E1009 15:39:17.907324 4719 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: 
container is stopping, stdout: , stderr: , exit code -1" containerID="4b2d1444fd94ba0dc2b092803cdf0e9f3380acaff7b82ceaf5076e613b95c91f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 09 15:39:17 crc kubenswrapper[4719]: E1009 15:39:17.908649 4719 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4b2d1444fd94ba0dc2b092803cdf0e9f3380acaff7b82ceaf5076e613b95c91f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 09 15:39:17 crc kubenswrapper[4719]: E1009 15:39:17.908732 4719 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="bbf9c16f-67c5-40df-8a0d-9993341b5f6e" containerName="nova-scheduler-scheduler" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.239783 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.332462 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49188dc1-6a08-454e-8a32-a2d252d091b3-logs\") pod \"49188dc1-6a08-454e-8a32-a2d252d091b3\" (UID: \"49188dc1-6a08-454e-8a32-a2d252d091b3\") " Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.332563 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvjg8\" (UniqueName: \"kubernetes.io/projected/49188dc1-6a08-454e-8a32-a2d252d091b3-kube-api-access-gvjg8\") pod \"49188dc1-6a08-454e-8a32-a2d252d091b3\" (UID: \"49188dc1-6a08-454e-8a32-a2d252d091b3\") " Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.332620 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49188dc1-6a08-454e-8a32-a2d252d091b3-combined-ca-bundle\") pod \"49188dc1-6a08-454e-8a32-a2d252d091b3\" (UID: \"49188dc1-6a08-454e-8a32-a2d252d091b3\") " Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.332687 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/49188dc1-6a08-454e-8a32-a2d252d091b3-internal-tls-certs\") pod \"49188dc1-6a08-454e-8a32-a2d252d091b3\" (UID: \"49188dc1-6a08-454e-8a32-a2d252d091b3\") " Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.332842 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49188dc1-6a08-454e-8a32-a2d252d091b3-public-tls-certs\") pod \"49188dc1-6a08-454e-8a32-a2d252d091b3\" (UID: \"49188dc1-6a08-454e-8a32-a2d252d091b3\") " Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.332887 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/49188dc1-6a08-454e-8a32-a2d252d091b3-config-data\") pod \"49188dc1-6a08-454e-8a32-a2d252d091b3\" (UID: \"49188dc1-6a08-454e-8a32-a2d252d091b3\") " Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.334157 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49188dc1-6a08-454e-8a32-a2d252d091b3-logs" (OuterVolumeSpecName: "logs") pod "49188dc1-6a08-454e-8a32-a2d252d091b3" (UID: "49188dc1-6a08-454e-8a32-a2d252d091b3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.340045 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49188dc1-6a08-454e-8a32-a2d252d091b3-kube-api-access-gvjg8" (OuterVolumeSpecName: "kube-api-access-gvjg8") pod "49188dc1-6a08-454e-8a32-a2d252d091b3" (UID: "49188dc1-6a08-454e-8a32-a2d252d091b3"). InnerVolumeSpecName "kube-api-access-gvjg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.372326 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49188dc1-6a08-454e-8a32-a2d252d091b3-config-data" (OuterVolumeSpecName: "config-data") pod "49188dc1-6a08-454e-8a32-a2d252d091b3" (UID: "49188dc1-6a08-454e-8a32-a2d252d091b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.372649 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49188dc1-6a08-454e-8a32-a2d252d091b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49188dc1-6a08-454e-8a32-a2d252d091b3" (UID: "49188dc1-6a08-454e-8a32-a2d252d091b3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.406233 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49188dc1-6a08-454e-8a32-a2d252d091b3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "49188dc1-6a08-454e-8a32-a2d252d091b3" (UID: "49188dc1-6a08-454e-8a32-a2d252d091b3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.421860 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49188dc1-6a08-454e-8a32-a2d252d091b3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "49188dc1-6a08-454e-8a32-a2d252d091b3" (UID: "49188dc1-6a08-454e-8a32-a2d252d091b3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.435462 4719 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49188dc1-6a08-454e-8a32-a2d252d091b3-logs\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.435558 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvjg8\" (UniqueName: \"kubernetes.io/projected/49188dc1-6a08-454e-8a32-a2d252d091b3-kube-api-access-gvjg8\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.435615 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49188dc1-6a08-454e-8a32-a2d252d091b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.435664 4719 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/49188dc1-6a08-454e-8a32-a2d252d091b3-internal-tls-certs\") on node 
\"crc\" DevicePath \"\"" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.435711 4719 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49188dc1-6a08-454e-8a32-a2d252d091b3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.435759 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49188dc1-6a08-454e-8a32-a2d252d091b3-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.496643 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.639484 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cm4lx\" (UniqueName: \"kubernetes.io/projected/659a2cfd-faec-4ff3-9a23-7733a138493e-kube-api-access-cm4lx\") pod \"659a2cfd-faec-4ff3-9a23-7733a138493e\" (UID: \"659a2cfd-faec-4ff3-9a23-7733a138493e\") " Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.639591 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/659a2cfd-faec-4ff3-9a23-7733a138493e-nova-metadata-tls-certs\") pod \"659a2cfd-faec-4ff3-9a23-7733a138493e\" (UID: \"659a2cfd-faec-4ff3-9a23-7733a138493e\") " Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.639640 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/659a2cfd-faec-4ff3-9a23-7733a138493e-combined-ca-bundle\") pod \"659a2cfd-faec-4ff3-9a23-7733a138493e\" (UID: \"659a2cfd-faec-4ff3-9a23-7733a138493e\") " Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.639718 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/659a2cfd-faec-4ff3-9a23-7733a138493e-config-data\") pod \"659a2cfd-faec-4ff3-9a23-7733a138493e\" (UID: \"659a2cfd-faec-4ff3-9a23-7733a138493e\") " Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.639766 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/659a2cfd-faec-4ff3-9a23-7733a138493e-logs\") pod \"659a2cfd-faec-4ff3-9a23-7733a138493e\" (UID: \"659a2cfd-faec-4ff3-9a23-7733a138493e\") " Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.640488 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/659a2cfd-faec-4ff3-9a23-7733a138493e-logs" (OuterVolumeSpecName: "logs") pod "659a2cfd-faec-4ff3-9a23-7733a138493e" (UID: "659a2cfd-faec-4ff3-9a23-7733a138493e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.642741 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/659a2cfd-faec-4ff3-9a23-7733a138493e-kube-api-access-cm4lx" (OuterVolumeSpecName: "kube-api-access-cm4lx") pod "659a2cfd-faec-4ff3-9a23-7733a138493e" (UID: "659a2cfd-faec-4ff3-9a23-7733a138493e"). InnerVolumeSpecName "kube-api-access-cm4lx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.664534 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/659a2cfd-faec-4ff3-9a23-7733a138493e-config-data" (OuterVolumeSpecName: "config-data") pod "659a2cfd-faec-4ff3-9a23-7733a138493e" (UID: "659a2cfd-faec-4ff3-9a23-7733a138493e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.668430 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/659a2cfd-faec-4ff3-9a23-7733a138493e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "659a2cfd-faec-4ff3-9a23-7733a138493e" (UID: "659a2cfd-faec-4ff3-9a23-7733a138493e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.715011 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/659a2cfd-faec-4ff3-9a23-7733a138493e-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "659a2cfd-faec-4ff3-9a23-7733a138493e" (UID: "659a2cfd-faec-4ff3-9a23-7733a138493e"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.742425 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cm4lx\" (UniqueName: \"kubernetes.io/projected/659a2cfd-faec-4ff3-9a23-7733a138493e-kube-api-access-cm4lx\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.742459 4719 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/659a2cfd-faec-4ff3-9a23-7733a138493e-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.742472 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/659a2cfd-faec-4ff3-9a23-7733a138493e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.742481 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/659a2cfd-faec-4ff3-9a23-7733a138493e-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.742490 4719 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/659a2cfd-faec-4ff3-9a23-7733a138493e-logs\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.813813 4719 generic.go:334] "Generic (PLEG): container finished" podID="659a2cfd-faec-4ff3-9a23-7733a138493e" containerID="9ca0d5390efd60e078c476f987e29cd72f157ed67e6f8a5cc2f5aaa89d9cb60b" exitCode=0 Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.813881 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.813898 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"659a2cfd-faec-4ff3-9a23-7733a138493e","Type":"ContainerDied","Data":"9ca0d5390efd60e078c476f987e29cd72f157ed67e6f8a5cc2f5aaa89d9cb60b"} Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.814587 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"659a2cfd-faec-4ff3-9a23-7733a138493e","Type":"ContainerDied","Data":"63edd1bec258e02f9625c5172cda1c3ff57b09a0f893da809d86c4ee51e27f5f"} Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.814607 4719 scope.go:117] "RemoveContainer" containerID="9ca0d5390efd60e078c476f987e29cd72f157ed67e6f8a5cc2f5aaa89d9cb60b" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.817884 4719 generic.go:334] "Generic (PLEG): container finished" podID="49188dc1-6a08-454e-8a32-a2d252d091b3" containerID="c4d683f6292ad40ff086bbfa4aa6994834f0291f82e2412f0d1d6e375b8d4db7" exitCode=0 Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.817924 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"49188dc1-6a08-454e-8a32-a2d252d091b3","Type":"ContainerDied","Data":"c4d683f6292ad40ff086bbfa4aa6994834f0291f82e2412f0d1d6e375b8d4db7"} Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.817938 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.817960 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"49188dc1-6a08-454e-8a32-a2d252d091b3","Type":"ContainerDied","Data":"296a66a9466482feb48f8f45722dfda99793fc873bdb93bacb2df204900fea5f"} Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.853043 4719 scope.go:117] "RemoveContainer" containerID="966f250211675f8690fa96c8f29689409f69e49fceaf7fabbbb79a060e17a791" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.859576 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.871768 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.886169 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.896595 4719 scope.go:117] "RemoveContainer" containerID="9ca0d5390efd60e078c476f987e29cd72f157ed67e6f8a5cc2f5aaa89d9cb60b" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.899210 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 15:39:18 crc kubenswrapper[4719]: E1009 15:39:18.901560 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ca0d5390efd60e078c476f987e29cd72f157ed67e6f8a5cc2f5aaa89d9cb60b\": container with ID starting with 9ca0d5390efd60e078c476f987e29cd72f157ed67e6f8a5cc2f5aaa89d9cb60b not found: ID does not exist" 
containerID="9ca0d5390efd60e078c476f987e29cd72f157ed67e6f8a5cc2f5aaa89d9cb60b" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.901612 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ca0d5390efd60e078c476f987e29cd72f157ed67e6f8a5cc2f5aaa89d9cb60b"} err="failed to get container status \"9ca0d5390efd60e078c476f987e29cd72f157ed67e6f8a5cc2f5aaa89d9cb60b\": rpc error: code = NotFound desc = could not find container \"9ca0d5390efd60e078c476f987e29cd72f157ed67e6f8a5cc2f5aaa89d9cb60b\": container with ID starting with 9ca0d5390efd60e078c476f987e29cd72f157ed67e6f8a5cc2f5aaa89d9cb60b not found: ID does not exist" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.901637 4719 scope.go:117] "RemoveContainer" containerID="966f250211675f8690fa96c8f29689409f69e49fceaf7fabbbb79a060e17a791" Oct 09 15:39:18 crc kubenswrapper[4719]: E1009 15:39:18.901993 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"966f250211675f8690fa96c8f29689409f69e49fceaf7fabbbb79a060e17a791\": container with ID starting with 966f250211675f8690fa96c8f29689409f69e49fceaf7fabbbb79a060e17a791 not found: ID does not exist" containerID="966f250211675f8690fa96c8f29689409f69e49fceaf7fabbbb79a060e17a791" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.902036 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"966f250211675f8690fa96c8f29689409f69e49fceaf7fabbbb79a060e17a791"} err="failed to get container status \"966f250211675f8690fa96c8f29689409f69e49fceaf7fabbbb79a060e17a791\": rpc error: code = NotFound desc = could not find container \"966f250211675f8690fa96c8f29689409f69e49fceaf7fabbbb79a060e17a791\": container with ID starting with 966f250211675f8690fa96c8f29689409f69e49fceaf7fabbbb79a060e17a791 not found: ID does not exist" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.902066 4719 scope.go:117] 
"RemoveContainer" containerID="c4d683f6292ad40ff086bbfa4aa6994834f0291f82e2412f0d1d6e375b8d4db7" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.911049 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 09 15:39:18 crc kubenswrapper[4719]: E1009 15:39:18.911581 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="659a2cfd-faec-4ff3-9a23-7733a138493e" containerName="nova-metadata-log" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.911625 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="659a2cfd-faec-4ff3-9a23-7733a138493e" containerName="nova-metadata-log" Oct 09 15:39:18 crc kubenswrapper[4719]: E1009 15:39:18.911653 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d393af38-b13f-4f11-bff8-c10fd25fe75d" containerName="nova-manage" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.911662 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="d393af38-b13f-4f11-bff8-c10fd25fe75d" containerName="nova-manage" Oct 09 15:39:18 crc kubenswrapper[4719]: E1009 15:39:18.911678 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49188dc1-6a08-454e-8a32-a2d252d091b3" containerName="nova-api-api" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.911703 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="49188dc1-6a08-454e-8a32-a2d252d091b3" containerName="nova-api-api" Oct 09 15:39:18 crc kubenswrapper[4719]: E1009 15:39:18.911719 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa17763b-47cf-4305-a42e-4f43fa08e189" containerName="dnsmasq-dns" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.911726 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa17763b-47cf-4305-a42e-4f43fa08e189" containerName="dnsmasq-dns" Oct 09 15:39:18 crc kubenswrapper[4719]: E1009 15:39:18.911743 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49188dc1-6a08-454e-8a32-a2d252d091b3" 
containerName="nova-api-log" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.911750 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="49188dc1-6a08-454e-8a32-a2d252d091b3" containerName="nova-api-log" Oct 09 15:39:18 crc kubenswrapper[4719]: E1009 15:39:18.911764 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="659a2cfd-faec-4ff3-9a23-7733a138493e" containerName="nova-metadata-metadata" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.911771 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="659a2cfd-faec-4ff3-9a23-7733a138493e" containerName="nova-metadata-metadata" Oct 09 15:39:18 crc kubenswrapper[4719]: E1009 15:39:18.911782 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa17763b-47cf-4305-a42e-4f43fa08e189" containerName="init" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.911789 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa17763b-47cf-4305-a42e-4f43fa08e189" containerName="init" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.912008 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="d393af38-b13f-4f11-bff8-c10fd25fe75d" containerName="nova-manage" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.912021 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="49188dc1-6a08-454e-8a32-a2d252d091b3" containerName="nova-api-api" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.912042 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="659a2cfd-faec-4ff3-9a23-7733a138493e" containerName="nova-metadata-log" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.912052 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="659a2cfd-faec-4ff3-9a23-7733a138493e" containerName="nova-metadata-metadata" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.912061 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa17763b-47cf-4305-a42e-4f43fa08e189" 
containerName="dnsmasq-dns" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.912071 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="49188dc1-6a08-454e-8a32-a2d252d091b3" containerName="nova-api-log" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.913126 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.918500 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.918679 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.918819 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.922632 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.936850 4719 scope.go:117] "RemoveContainer" containerID="526e227ba2c04dab1e155386c4f4ca41809d5d65e90839701d141bf9042dd6b4" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.940449 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.942545 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.944304 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.944789 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.948665 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.971700 4719 scope.go:117] "RemoveContainer" containerID="c4d683f6292ad40ff086bbfa4aa6994834f0291f82e2412f0d1d6e375b8d4db7" Oct 09 15:39:18 crc kubenswrapper[4719]: E1009 15:39:18.972117 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4d683f6292ad40ff086bbfa4aa6994834f0291f82e2412f0d1d6e375b8d4db7\": container with ID starting with c4d683f6292ad40ff086bbfa4aa6994834f0291f82e2412f0d1d6e375b8d4db7 not found: ID does not exist" containerID="c4d683f6292ad40ff086bbfa4aa6994834f0291f82e2412f0d1d6e375b8d4db7" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.972159 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4d683f6292ad40ff086bbfa4aa6994834f0291f82e2412f0d1d6e375b8d4db7"} err="failed to get container status \"c4d683f6292ad40ff086bbfa4aa6994834f0291f82e2412f0d1d6e375b8d4db7\": rpc error: code = NotFound desc = could not find container \"c4d683f6292ad40ff086bbfa4aa6994834f0291f82e2412f0d1d6e375b8d4db7\": container with ID starting with c4d683f6292ad40ff086bbfa4aa6994834f0291f82e2412f0d1d6e375b8d4db7 not found: ID does not exist" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.972192 4719 scope.go:117] "RemoveContainer" containerID="526e227ba2c04dab1e155386c4f4ca41809d5d65e90839701d141bf9042dd6b4" Oct 09 15:39:18 crc 
kubenswrapper[4719]: E1009 15:39:18.972503 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"526e227ba2c04dab1e155386c4f4ca41809d5d65e90839701d141bf9042dd6b4\": container with ID starting with 526e227ba2c04dab1e155386c4f4ca41809d5d65e90839701d141bf9042dd6b4 not found: ID does not exist" containerID="526e227ba2c04dab1e155386c4f4ca41809d5d65e90839701d141bf9042dd6b4" Oct 09 15:39:18 crc kubenswrapper[4719]: I1009 15:39:18.972528 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"526e227ba2c04dab1e155386c4f4ca41809d5d65e90839701d141bf9042dd6b4"} err="failed to get container status \"526e227ba2c04dab1e155386c4f4ca41809d5d65e90839701d141bf9042dd6b4\": rpc error: code = NotFound desc = could not find container \"526e227ba2c04dab1e155386c4f4ca41809d5d65e90839701d141bf9042dd6b4\": container with ID starting with 526e227ba2c04dab1e155386c4f4ca41809d5d65e90839701d141bf9042dd6b4 not found: ID does not exist" Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 15:39:19.054901 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cptzg\" (UniqueName: \"kubernetes.io/projected/c3ae86dd-d13b-46b0-8f6f-a913c783a884-kube-api-access-cptzg\") pod \"nova-api-0\" (UID: \"c3ae86dd-d13b-46b0-8f6f-a913c783a884\") " pod="openstack/nova-api-0" Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 15:39:19.054980 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhvmj\" (UniqueName: \"kubernetes.io/projected/c1cdf53c-cd57-4e4c-85b0-178a7bc15043-kube-api-access-hhvmj\") pod \"nova-metadata-0\" (UID: \"c1cdf53c-cd57-4e4c-85b0-178a7bc15043\") " pod="openstack/nova-metadata-0" Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 15:39:19.055019 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3ae86dd-d13b-46b0-8f6f-a913c783a884-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c3ae86dd-d13b-46b0-8f6f-a913c783a884\") " pod="openstack/nova-api-0" Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 15:39:19.055054 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3ae86dd-d13b-46b0-8f6f-a913c783a884-public-tls-certs\") pod \"nova-api-0\" (UID: \"c3ae86dd-d13b-46b0-8f6f-a913c783a884\") " pod="openstack/nova-api-0" Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 15:39:19.055117 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3ae86dd-d13b-46b0-8f6f-a913c783a884-logs\") pod \"nova-api-0\" (UID: \"c3ae86dd-d13b-46b0-8f6f-a913c783a884\") " pod="openstack/nova-api-0" Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 15:39:19.055132 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1cdf53c-cd57-4e4c-85b0-178a7bc15043-config-data\") pod \"nova-metadata-0\" (UID: \"c1cdf53c-cd57-4e4c-85b0-178a7bc15043\") " pod="openstack/nova-metadata-0" Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 15:39:19.055160 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1cdf53c-cd57-4e4c-85b0-178a7bc15043-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c1cdf53c-cd57-4e4c-85b0-178a7bc15043\") " pod="openstack/nova-metadata-0" Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 15:39:19.055264 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1cdf53c-cd57-4e4c-85b0-178a7bc15043-logs\") pod \"nova-metadata-0\" (UID: 
\"c1cdf53c-cd57-4e4c-85b0-178a7bc15043\") " pod="openstack/nova-metadata-0" Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 15:39:19.055290 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3ae86dd-d13b-46b0-8f6f-a913c783a884-config-data\") pod \"nova-api-0\" (UID: \"c3ae86dd-d13b-46b0-8f6f-a913c783a884\") " pod="openstack/nova-api-0" Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 15:39:19.055311 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1cdf53c-cd57-4e4c-85b0-178a7bc15043-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c1cdf53c-cd57-4e4c-85b0-178a7bc15043\") " pod="openstack/nova-metadata-0" Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 15:39:19.055380 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3ae86dd-d13b-46b0-8f6f-a913c783a884-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c3ae86dd-d13b-46b0-8f6f-a913c783a884\") " pod="openstack/nova-api-0" Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 15:39:19.157000 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3ae86dd-d13b-46b0-8f6f-a913c783a884-public-tls-certs\") pod \"nova-api-0\" (UID: \"c3ae86dd-d13b-46b0-8f6f-a913c783a884\") " pod="openstack/nova-api-0" Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 15:39:19.157139 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3ae86dd-d13b-46b0-8f6f-a913c783a884-logs\") pod \"nova-api-0\" (UID: \"c3ae86dd-d13b-46b0-8f6f-a913c783a884\") " pod="openstack/nova-api-0" Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 15:39:19.157164 4719 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1cdf53c-cd57-4e4c-85b0-178a7bc15043-config-data\") pod \"nova-metadata-0\" (UID: \"c1cdf53c-cd57-4e4c-85b0-178a7bc15043\") " pod="openstack/nova-metadata-0" Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 15:39:19.157202 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1cdf53c-cd57-4e4c-85b0-178a7bc15043-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c1cdf53c-cd57-4e4c-85b0-178a7bc15043\") " pod="openstack/nova-metadata-0" Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 15:39:19.157258 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1cdf53c-cd57-4e4c-85b0-178a7bc15043-logs\") pod \"nova-metadata-0\" (UID: \"c1cdf53c-cd57-4e4c-85b0-178a7bc15043\") " pod="openstack/nova-metadata-0" Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 15:39:19.157284 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3ae86dd-d13b-46b0-8f6f-a913c783a884-config-data\") pod \"nova-api-0\" (UID: \"c3ae86dd-d13b-46b0-8f6f-a913c783a884\") " pod="openstack/nova-api-0" Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 15:39:19.157314 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1cdf53c-cd57-4e4c-85b0-178a7bc15043-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c1cdf53c-cd57-4e4c-85b0-178a7bc15043\") " pod="openstack/nova-metadata-0" Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 15:39:19.157370 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3ae86dd-d13b-46b0-8f6f-a913c783a884-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"c3ae86dd-d13b-46b0-8f6f-a913c783a884\") " pod="openstack/nova-api-0" Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 15:39:19.157652 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1cdf53c-cd57-4e4c-85b0-178a7bc15043-logs\") pod \"nova-metadata-0\" (UID: \"c1cdf53c-cd57-4e4c-85b0-178a7bc15043\") " pod="openstack/nova-metadata-0" Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 15:39:19.158122 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cptzg\" (UniqueName: \"kubernetes.io/projected/c3ae86dd-d13b-46b0-8f6f-a913c783a884-kube-api-access-cptzg\") pod \"nova-api-0\" (UID: \"c3ae86dd-d13b-46b0-8f6f-a913c783a884\") " pod="openstack/nova-api-0" Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 15:39:19.158175 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhvmj\" (UniqueName: \"kubernetes.io/projected/c1cdf53c-cd57-4e4c-85b0-178a7bc15043-kube-api-access-hhvmj\") pod \"nova-metadata-0\" (UID: \"c1cdf53c-cd57-4e4c-85b0-178a7bc15043\") " pod="openstack/nova-metadata-0" Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 15:39:19.158255 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3ae86dd-d13b-46b0-8f6f-a913c783a884-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c3ae86dd-d13b-46b0-8f6f-a913c783a884\") " pod="openstack/nova-api-0" Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 15:39:19.158714 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3ae86dd-d13b-46b0-8f6f-a913c783a884-logs\") pod \"nova-api-0\" (UID: \"c3ae86dd-d13b-46b0-8f6f-a913c783a884\") " pod="openstack/nova-api-0" Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 15:39:19.164074 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/c3ae86dd-d13b-46b0-8f6f-a913c783a884-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c3ae86dd-d13b-46b0-8f6f-a913c783a884\") " pod="openstack/nova-api-0" Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 15:39:19.165461 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3ae86dd-d13b-46b0-8f6f-a913c783a884-config-data\") pod \"nova-api-0\" (UID: \"c3ae86dd-d13b-46b0-8f6f-a913c783a884\") " pod="openstack/nova-api-0" Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 15:39:19.165970 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3ae86dd-d13b-46b0-8f6f-a913c783a884-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c3ae86dd-d13b-46b0-8f6f-a913c783a884\") " pod="openstack/nova-api-0" Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 15:39:19.166200 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1cdf53c-cd57-4e4c-85b0-178a7bc15043-config-data\") pod \"nova-metadata-0\" (UID: \"c1cdf53c-cd57-4e4c-85b0-178a7bc15043\") " pod="openstack/nova-metadata-0" Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 15:39:19.167597 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1cdf53c-cd57-4e4c-85b0-178a7bc15043-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c1cdf53c-cd57-4e4c-85b0-178a7bc15043\") " pod="openstack/nova-metadata-0" Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 15:39:19.167714 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1cdf53c-cd57-4e4c-85b0-178a7bc15043-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c1cdf53c-cd57-4e4c-85b0-178a7bc15043\") " pod="openstack/nova-metadata-0" Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 
15:39:19.167772 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3ae86dd-d13b-46b0-8f6f-a913c783a884-public-tls-certs\") pod \"nova-api-0\" (UID: \"c3ae86dd-d13b-46b0-8f6f-a913c783a884\") " pod="openstack/nova-api-0" Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 15:39:19.179115 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cptzg\" (UniqueName: \"kubernetes.io/projected/c3ae86dd-d13b-46b0-8f6f-a913c783a884-kube-api-access-cptzg\") pod \"nova-api-0\" (UID: \"c3ae86dd-d13b-46b0-8f6f-a913c783a884\") " pod="openstack/nova-api-0" Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 15:39:19.180501 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49188dc1-6a08-454e-8a32-a2d252d091b3" path="/var/lib/kubelet/pods/49188dc1-6a08-454e-8a32-a2d252d091b3/volumes" Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 15:39:19.181318 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="659a2cfd-faec-4ff3-9a23-7733a138493e" path="/var/lib/kubelet/pods/659a2cfd-faec-4ff3-9a23-7733a138493e/volumes" Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 15:39:19.183880 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhvmj\" (UniqueName: \"kubernetes.io/projected/c1cdf53c-cd57-4e4c-85b0-178a7bc15043-kube-api-access-hhvmj\") pod \"nova-metadata-0\" (UID: \"c1cdf53c-cd57-4e4c-85b0-178a7bc15043\") " pod="openstack/nova-metadata-0" Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 15:39:19.238519 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 15:39:19.259781 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 15:39:19.712952 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 09 15:39:19 crc kubenswrapper[4719]: W1009 15:39:19.718294 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3ae86dd_d13b_46b0_8f6f_a913c783a884.slice/crio-0318dcdf6c391e3a97b690ce724042e61b27d21321c415253b6dc4880119657a WatchSource:0}: Error finding container 0318dcdf6c391e3a97b690ce724042e61b27d21321c415253b6dc4880119657a: Status 404 returned error can't find the container with id 0318dcdf6c391e3a97b690ce724042e61b27d21321c415253b6dc4880119657a Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 15:39:19.789990 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 15:39:19 crc kubenswrapper[4719]: W1009 15:39:19.792847 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1cdf53c_cd57_4e4c_85b0_178a7bc15043.slice/crio-ea453688d09ebe875fd9a3ddf39c76815f1bccc1b828075fb1aae6b692160209 WatchSource:0}: Error finding container ea453688d09ebe875fd9a3ddf39c76815f1bccc1b828075fb1aae6b692160209: Status 404 returned error can't find the container with id ea453688d09ebe875fd9a3ddf39c76815f1bccc1b828075fb1aae6b692160209 Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 15:39:19.829234 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c3ae86dd-d13b-46b0-8f6f-a913c783a884","Type":"ContainerStarted","Data":"0318dcdf6c391e3a97b690ce724042e61b27d21321c415253b6dc4880119657a"} Oct 09 15:39:19 crc kubenswrapper[4719]: I1009 15:39:19.832763 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"c1cdf53c-cd57-4e4c-85b0-178a7bc15043","Type":"ContainerStarted","Data":"ea453688d09ebe875fd9a3ddf39c76815f1bccc1b828075fb1aae6b692160209"} Oct 09 15:39:20 crc kubenswrapper[4719]: I1009 15:39:20.864365 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c3ae86dd-d13b-46b0-8f6f-a913c783a884","Type":"ContainerStarted","Data":"d261d360f315d2196b786b103d24a3da52ac007de1436bcf4023be5d5b1e3d69"} Oct 09 15:39:20 crc kubenswrapper[4719]: I1009 15:39:20.864728 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c3ae86dd-d13b-46b0-8f6f-a913c783a884","Type":"ContainerStarted","Data":"bbb70550b4a5c4660d388dd843ad6fd22902b25b5b53790c50b72aba2634973a"} Oct 09 15:39:20 crc kubenswrapper[4719]: I1009 15:39:20.866866 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c1cdf53c-cd57-4e4c-85b0-178a7bc15043","Type":"ContainerStarted","Data":"2fe419ebda187422cd52cb1741ffc6e85b29df4b12c83f78241957cf93d0253e"} Oct 09 15:39:20 crc kubenswrapper[4719]: I1009 15:39:20.866896 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c1cdf53c-cd57-4e4c-85b0-178a7bc15043","Type":"ContainerStarted","Data":"91b1752d5d8ec7289965c4e1aa282106b22b1c4b663976f15eb2b39b51c4ed88"} Oct 09 15:39:20 crc kubenswrapper[4719]: I1009 15:39:20.884048 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.884028571 podStartE2EDuration="2.884028571s" podCreationTimestamp="2025-10-09 15:39:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:39:20.880542109 +0000 UTC m=+1266.390253424" watchObservedRunningTime="2025-10-09 15:39:20.884028571 +0000 UTC m=+1266.393739866" Oct 09 15:39:20 crc kubenswrapper[4719]: I1009 15:39:20.904856 4719 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.904835775 podStartE2EDuration="2.904835775s" podCreationTimestamp="2025-10-09 15:39:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:39:20.90373701 +0000 UTC m=+1266.413448305" watchObservedRunningTime="2025-10-09 15:39:20.904835775 +0000 UTC m=+1266.414547050" Oct 09 15:39:21 crc kubenswrapper[4719]: I1009 15:39:21.886484 4719 generic.go:334] "Generic (PLEG): container finished" podID="bbf9c16f-67c5-40df-8a0d-9993341b5f6e" containerID="4b2d1444fd94ba0dc2b092803cdf0e9f3380acaff7b82ceaf5076e613b95c91f" exitCode=0 Oct 09 15:39:21 crc kubenswrapper[4719]: I1009 15:39:21.886904 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bbf9c16f-67c5-40df-8a0d-9993341b5f6e","Type":"ContainerDied","Data":"4b2d1444fd94ba0dc2b092803cdf0e9f3380acaff7b82ceaf5076e613b95c91f"} Oct 09 15:39:22 crc kubenswrapper[4719]: I1009 15:39:22.134391 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 15:39:22 crc kubenswrapper[4719]: I1009 15:39:22.224941 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw5vc\" (UniqueName: \"kubernetes.io/projected/bbf9c16f-67c5-40df-8a0d-9993341b5f6e-kube-api-access-pw5vc\") pod \"bbf9c16f-67c5-40df-8a0d-9993341b5f6e\" (UID: \"bbf9c16f-67c5-40df-8a0d-9993341b5f6e\") " Oct 09 15:39:22 crc kubenswrapper[4719]: I1009 15:39:22.225060 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbf9c16f-67c5-40df-8a0d-9993341b5f6e-config-data\") pod \"bbf9c16f-67c5-40df-8a0d-9993341b5f6e\" (UID: \"bbf9c16f-67c5-40df-8a0d-9993341b5f6e\") " Oct 09 15:39:22 crc kubenswrapper[4719]: I1009 15:39:22.225102 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbf9c16f-67c5-40df-8a0d-9993341b5f6e-combined-ca-bundle\") pod \"bbf9c16f-67c5-40df-8a0d-9993341b5f6e\" (UID: \"bbf9c16f-67c5-40df-8a0d-9993341b5f6e\") " Oct 09 15:39:22 crc kubenswrapper[4719]: I1009 15:39:22.235612 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbf9c16f-67c5-40df-8a0d-9993341b5f6e-kube-api-access-pw5vc" (OuterVolumeSpecName: "kube-api-access-pw5vc") pod "bbf9c16f-67c5-40df-8a0d-9993341b5f6e" (UID: "bbf9c16f-67c5-40df-8a0d-9993341b5f6e"). InnerVolumeSpecName "kube-api-access-pw5vc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:39:22 crc kubenswrapper[4719]: I1009 15:39:22.253951 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbf9c16f-67c5-40df-8a0d-9993341b5f6e-config-data" (OuterVolumeSpecName: "config-data") pod "bbf9c16f-67c5-40df-8a0d-9993341b5f6e" (UID: "bbf9c16f-67c5-40df-8a0d-9993341b5f6e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:39:22 crc kubenswrapper[4719]: I1009 15:39:22.256701 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbf9c16f-67c5-40df-8a0d-9993341b5f6e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bbf9c16f-67c5-40df-8a0d-9993341b5f6e" (UID: "bbf9c16f-67c5-40df-8a0d-9993341b5f6e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:39:22 crc kubenswrapper[4719]: I1009 15:39:22.327309 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pw5vc\" (UniqueName: \"kubernetes.io/projected/bbf9c16f-67c5-40df-8a0d-9993341b5f6e-kube-api-access-pw5vc\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:22 crc kubenswrapper[4719]: I1009 15:39:22.327342 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbf9c16f-67c5-40df-8a0d-9993341b5f6e-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:22 crc kubenswrapper[4719]: I1009 15:39:22.327394 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbf9c16f-67c5-40df-8a0d-9993341b5f6e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:22 crc kubenswrapper[4719]: I1009 15:39:22.901604 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bbf9c16f-67c5-40df-8a0d-9993341b5f6e","Type":"ContainerDied","Data":"281554f8ece5e15cbda19db9a7c3d02453a91453796e0bc88e259c5582e21e1f"} Oct 09 15:39:22 crc kubenswrapper[4719]: I1009 15:39:22.901932 4719 scope.go:117] "RemoveContainer" containerID="4b2d1444fd94ba0dc2b092803cdf0e9f3380acaff7b82ceaf5076e613b95c91f" Oct 09 15:39:22 crc kubenswrapper[4719]: I1009 15:39:22.901717 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 15:39:22 crc kubenswrapper[4719]: I1009 15:39:22.946444 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 15:39:22 crc kubenswrapper[4719]: I1009 15:39:22.958633 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 15:39:22 crc kubenswrapper[4719]: I1009 15:39:22.970125 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 15:39:22 crc kubenswrapper[4719]: E1009 15:39:22.970531 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbf9c16f-67c5-40df-8a0d-9993341b5f6e" containerName="nova-scheduler-scheduler" Oct 09 15:39:22 crc kubenswrapper[4719]: I1009 15:39:22.970547 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf9c16f-67c5-40df-8a0d-9993341b5f6e" containerName="nova-scheduler-scheduler" Oct 09 15:39:22 crc kubenswrapper[4719]: I1009 15:39:22.970805 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbf9c16f-67c5-40df-8a0d-9993341b5f6e" containerName="nova-scheduler-scheduler" Oct 09 15:39:22 crc kubenswrapper[4719]: I1009 15:39:22.971490 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 15:39:22 crc kubenswrapper[4719]: I1009 15:39:22.981042 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 09 15:39:22 crc kubenswrapper[4719]: I1009 15:39:22.994984 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 15:39:23 crc kubenswrapper[4719]: I1009 15:39:23.141284 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8b2b868-83d7-496b-8036-a10584724f35-config-data\") pod \"nova-scheduler-0\" (UID: \"c8b2b868-83d7-496b-8036-a10584724f35\") " pod="openstack/nova-scheduler-0" Oct 09 15:39:23 crc kubenswrapper[4719]: I1009 15:39:23.141522 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9xhq\" (UniqueName: \"kubernetes.io/projected/c8b2b868-83d7-496b-8036-a10584724f35-kube-api-access-m9xhq\") pod \"nova-scheduler-0\" (UID: \"c8b2b868-83d7-496b-8036-a10584724f35\") " pod="openstack/nova-scheduler-0" Oct 09 15:39:23 crc kubenswrapper[4719]: I1009 15:39:23.141814 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8b2b868-83d7-496b-8036-a10584724f35-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c8b2b868-83d7-496b-8036-a10584724f35\") " pod="openstack/nova-scheduler-0" Oct 09 15:39:23 crc kubenswrapper[4719]: I1009 15:39:23.174223 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbf9c16f-67c5-40df-8a0d-9993341b5f6e" path="/var/lib/kubelet/pods/bbf9c16f-67c5-40df-8a0d-9993341b5f6e/volumes" Oct 09 15:39:23 crc kubenswrapper[4719]: I1009 15:39:23.244288 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c8b2b868-83d7-496b-8036-a10584724f35-config-data\") pod \"nova-scheduler-0\" (UID: \"c8b2b868-83d7-496b-8036-a10584724f35\") " pod="openstack/nova-scheduler-0" Oct 09 15:39:23 crc kubenswrapper[4719]: I1009 15:39:23.244607 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9xhq\" (UniqueName: \"kubernetes.io/projected/c8b2b868-83d7-496b-8036-a10584724f35-kube-api-access-m9xhq\") pod \"nova-scheduler-0\" (UID: \"c8b2b868-83d7-496b-8036-a10584724f35\") " pod="openstack/nova-scheduler-0" Oct 09 15:39:23 crc kubenswrapper[4719]: I1009 15:39:23.244880 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8b2b868-83d7-496b-8036-a10584724f35-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c8b2b868-83d7-496b-8036-a10584724f35\") " pod="openstack/nova-scheduler-0" Oct 09 15:39:23 crc kubenswrapper[4719]: I1009 15:39:23.251288 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8b2b868-83d7-496b-8036-a10584724f35-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c8b2b868-83d7-496b-8036-a10584724f35\") " pod="openstack/nova-scheduler-0" Oct 09 15:39:23 crc kubenswrapper[4719]: I1009 15:39:23.267875 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8b2b868-83d7-496b-8036-a10584724f35-config-data\") pod \"nova-scheduler-0\" (UID: \"c8b2b868-83d7-496b-8036-a10584724f35\") " pod="openstack/nova-scheduler-0" Oct 09 15:39:23 crc kubenswrapper[4719]: I1009 15:39:23.289648 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9xhq\" (UniqueName: \"kubernetes.io/projected/c8b2b868-83d7-496b-8036-a10584724f35-kube-api-access-m9xhq\") pod \"nova-scheduler-0\" (UID: \"c8b2b868-83d7-496b-8036-a10584724f35\") " 
pod="openstack/nova-scheduler-0" Oct 09 15:39:23 crc kubenswrapper[4719]: I1009 15:39:23.295003 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 15:39:23 crc kubenswrapper[4719]: I1009 15:39:23.727867 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 15:39:23 crc kubenswrapper[4719]: W1009 15:39:23.729864 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8b2b868_83d7_496b_8036_a10584724f35.slice/crio-9c3d88201a9a77e9e51ba3fbd6893415010d22f01f09b003b997618262dd70a7 WatchSource:0}: Error finding container 9c3d88201a9a77e9e51ba3fbd6893415010d22f01f09b003b997618262dd70a7: Status 404 returned error can't find the container with id 9c3d88201a9a77e9e51ba3fbd6893415010d22f01f09b003b997618262dd70a7 Oct 09 15:39:23 crc kubenswrapper[4719]: I1009 15:39:23.915301 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c8b2b868-83d7-496b-8036-a10584724f35","Type":"ContainerStarted","Data":"9c3d88201a9a77e9e51ba3fbd6893415010d22f01f09b003b997618262dd70a7"} Oct 09 15:39:24 crc kubenswrapper[4719]: I1009 15:39:24.260394 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 09 15:39:24 crc kubenswrapper[4719]: I1009 15:39:24.260444 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 09 15:39:24 crc kubenswrapper[4719]: I1009 15:39:24.925107 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c8b2b868-83d7-496b-8036-a10584724f35","Type":"ContainerStarted","Data":"86f8ac28f9b2a07d6d8b63c9800510dda45cf3fafe4f8b3d72eb57df095924f6"} Oct 09 15:39:24 crc kubenswrapper[4719]: I1009 15:39:24.942811 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-scheduler-0" podStartSLOduration=2.942797433 podStartE2EDuration="2.942797433s" podCreationTimestamp="2025-10-09 15:39:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:39:24.942020168 +0000 UTC m=+1270.451731533" watchObservedRunningTime="2025-10-09 15:39:24.942797433 +0000 UTC m=+1270.452508718" Oct 09 15:39:28 crc kubenswrapper[4719]: I1009 15:39:28.295525 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 09 15:39:29 crc kubenswrapper[4719]: I1009 15:39:29.240157 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 09 15:39:29 crc kubenswrapper[4719]: I1009 15:39:29.240529 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 09 15:39:29 crc kubenswrapper[4719]: I1009 15:39:29.260781 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 09 15:39:29 crc kubenswrapper[4719]: I1009 15:39:29.260836 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 09 15:39:30 crc kubenswrapper[4719]: I1009 15:39:30.252531 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c3ae86dd-d13b-46b0-8f6f-a913c783a884" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.226:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 09 15:39:30 crc kubenswrapper[4719]: I1009 15:39:30.252595 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c3ae86dd-d13b-46b0-8f6f-a913c783a884" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.226:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 09 
15:39:30 crc kubenswrapper[4719]: I1009 15:39:30.274595 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c1cdf53c-cd57-4e4c-85b0-178a7bc15043" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.227:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 09 15:39:30 crc kubenswrapper[4719]: I1009 15:39:30.274654 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c1cdf53c-cd57-4e4c-85b0-178a7bc15043" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.227:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 09 15:39:33 crc kubenswrapper[4719]: I1009 15:39:33.295747 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 09 15:39:33 crc kubenswrapper[4719]: I1009 15:39:33.322757 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 09 15:39:34 crc kubenswrapper[4719]: I1009 15:39:34.040842 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 09 15:39:35 crc kubenswrapper[4719]: I1009 15:39:35.128211 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 09 15:39:39 crc kubenswrapper[4719]: I1009 15:39:39.247996 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 09 15:39:39 crc kubenswrapper[4719]: I1009 15:39:39.248823 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 09 15:39:39 crc kubenswrapper[4719]: I1009 15:39:39.252162 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 09 15:39:39 crc kubenswrapper[4719]: I1009 15:39:39.258188 4719 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 09 15:39:39 crc kubenswrapper[4719]: I1009 15:39:39.265133 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 09 15:39:39 crc kubenswrapper[4719]: I1009 15:39:39.270629 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 09 15:39:39 crc kubenswrapper[4719]: I1009 15:39:39.271288 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 09 15:39:40 crc kubenswrapper[4719]: I1009 15:39:40.064680 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 09 15:39:40 crc kubenswrapper[4719]: I1009 15:39:40.070644 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 09 15:39:40 crc kubenswrapper[4719]: I1009 15:39:40.075937 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 09 15:39:48 crc kubenswrapper[4719]: I1009 15:39:48.414249 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 09 15:39:49 crc kubenswrapper[4719]: I1009 15:39:49.331824 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 15:39:51 crc kubenswrapper[4719]: I1009 15:39:51.861052 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="d3a820d9-3c13-47ec-a39e-dea4d60b7536" containerName="rabbitmq" containerID="cri-o://8611d72af232bf969bfd43aaa526aa9e349e35e4283ddc18128023360f6a6345" gracePeriod=604797 Oct 09 15:39:52 crc kubenswrapper[4719]: I1009 15:39:52.605266 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="c8f5a6f9-5554-485d-9aee-47449402e37b" containerName="rabbitmq" 
containerID="cri-o://9078d3710db1f54276251a04386eaa1cc78f42b433fab5cf22a1239ba1607a6d" gracePeriod=604797 Oct 09 15:39:53 crc kubenswrapper[4719]: I1009 15:39:53.238958 4719 generic.go:334] "Generic (PLEG): container finished" podID="d3a820d9-3c13-47ec-a39e-dea4d60b7536" containerID="8611d72af232bf969bfd43aaa526aa9e349e35e4283ddc18128023360f6a6345" exitCode=0 Oct 09 15:39:53 crc kubenswrapper[4719]: I1009 15:39:53.239004 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d3a820d9-3c13-47ec-a39e-dea4d60b7536","Type":"ContainerDied","Data":"8611d72af232bf969bfd43aaa526aa9e349e35e4283ddc18128023360f6a6345"} Oct 09 15:39:53 crc kubenswrapper[4719]: I1009 15:39:53.494915 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 09 15:39:53 crc kubenswrapper[4719]: I1009 15:39:53.642796 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d3a820d9-3c13-47ec-a39e-dea4d60b7536-rabbitmq-confd\") pod \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " Oct 09 15:39:53 crc kubenswrapper[4719]: I1009 15:39:53.642878 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d3a820d9-3c13-47ec-a39e-dea4d60b7536-config-data\") pod \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " Oct 09 15:39:53 crc kubenswrapper[4719]: I1009 15:39:53.642941 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d3a820d9-3c13-47ec-a39e-dea4d60b7536-erlang-cookie-secret\") pod \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " Oct 09 15:39:53 crc kubenswrapper[4719]: I1009 15:39:53.643030 4719 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d3a820d9-3c13-47ec-a39e-dea4d60b7536-rabbitmq-plugins\") pod \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " Oct 09 15:39:53 crc kubenswrapper[4719]: I1009 15:39:53.643059 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d3a820d9-3c13-47ec-a39e-dea4d60b7536-rabbitmq-erlang-cookie\") pod \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " Oct 09 15:39:53 crc kubenswrapper[4719]: I1009 15:39:53.643091 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d3a820d9-3c13-47ec-a39e-dea4d60b7536-server-conf\") pod \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " Oct 09 15:39:53 crc kubenswrapper[4719]: I1009 15:39:53.643137 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d3a820d9-3c13-47ec-a39e-dea4d60b7536-pod-info\") pod \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " Oct 09 15:39:53 crc kubenswrapper[4719]: I1009 15:39:53.643193 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " Oct 09 15:39:53 crc kubenswrapper[4719]: I1009 15:39:53.643245 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d3a820d9-3c13-47ec-a39e-dea4d60b7536-rabbitmq-tls\") pod \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\" (UID: 
\"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " Oct 09 15:39:53 crc kubenswrapper[4719]: I1009 15:39:53.643303 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d3a820d9-3c13-47ec-a39e-dea4d60b7536-plugins-conf\") pod \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " Oct 09 15:39:53 crc kubenswrapper[4719]: I1009 15:39:53.643381 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlpkb\" (UniqueName: \"kubernetes.io/projected/d3a820d9-3c13-47ec-a39e-dea4d60b7536-kube-api-access-rlpkb\") pod \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\" (UID: \"d3a820d9-3c13-47ec-a39e-dea4d60b7536\") " Oct 09 15:39:53 crc kubenswrapper[4719]: I1009 15:39:53.643944 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3a820d9-3c13-47ec-a39e-dea4d60b7536-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "d3a820d9-3c13-47ec-a39e-dea4d60b7536" (UID: "d3a820d9-3c13-47ec-a39e-dea4d60b7536"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:39:53 crc kubenswrapper[4719]: I1009 15:39:53.644108 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3a820d9-3c13-47ec-a39e-dea4d60b7536-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "d3a820d9-3c13-47ec-a39e-dea4d60b7536" (UID: "d3a820d9-3c13-47ec-a39e-dea4d60b7536"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:39:53 crc kubenswrapper[4719]: I1009 15:39:53.644331 4719 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d3a820d9-3c13-47ec-a39e-dea4d60b7536-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:53 crc kubenswrapper[4719]: I1009 15:39:53.644407 4719 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d3a820d9-3c13-47ec-a39e-dea4d60b7536-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:53 crc kubenswrapper[4719]: I1009 15:39:53.644516 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3a820d9-3c13-47ec-a39e-dea4d60b7536-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "d3a820d9-3c13-47ec-a39e-dea4d60b7536" (UID: "d3a820d9-3c13-47ec-a39e-dea4d60b7536"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:39:53 crc kubenswrapper[4719]: I1009 15:39:53.651338 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "d3a820d9-3c13-47ec-a39e-dea4d60b7536" (UID: "d3a820d9-3c13-47ec-a39e-dea4d60b7536"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 09 15:39:53 crc kubenswrapper[4719]: I1009 15:39:53.651518 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3a820d9-3c13-47ec-a39e-dea4d60b7536-kube-api-access-rlpkb" (OuterVolumeSpecName: "kube-api-access-rlpkb") pod "d3a820d9-3c13-47ec-a39e-dea4d60b7536" (UID: "d3a820d9-3c13-47ec-a39e-dea4d60b7536"). InnerVolumeSpecName "kube-api-access-rlpkb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:39:53 crc kubenswrapper[4719]: I1009 15:39:53.651638 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3a820d9-3c13-47ec-a39e-dea4d60b7536-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "d3a820d9-3c13-47ec-a39e-dea4d60b7536" (UID: "d3a820d9-3c13-47ec-a39e-dea4d60b7536"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:39:53 crc kubenswrapper[4719]: I1009 15:39:53.656625 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/d3a820d9-3c13-47ec-a39e-dea4d60b7536-pod-info" (OuterVolumeSpecName: "pod-info") pod "d3a820d9-3c13-47ec-a39e-dea4d60b7536" (UID: "d3a820d9-3c13-47ec-a39e-dea4d60b7536"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 09 15:39:53 crc kubenswrapper[4719]: I1009 15:39:53.656851 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3a820d9-3c13-47ec-a39e-dea4d60b7536-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "d3a820d9-3c13-47ec-a39e-dea4d60b7536" (UID: "d3a820d9-3c13-47ec-a39e-dea4d60b7536"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:39:53 crc kubenswrapper[4719]: I1009 15:39:53.681971 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3a820d9-3c13-47ec-a39e-dea4d60b7536-config-data" (OuterVolumeSpecName: "config-data") pod "d3a820d9-3c13-47ec-a39e-dea4d60b7536" (UID: "d3a820d9-3c13-47ec-a39e-dea4d60b7536"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:39:53 crc kubenswrapper[4719]: I1009 15:39:53.722301 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3a820d9-3c13-47ec-a39e-dea4d60b7536-server-conf" (OuterVolumeSpecName: "server-conf") pod "d3a820d9-3c13-47ec-a39e-dea4d60b7536" (UID: "d3a820d9-3c13-47ec-a39e-dea4d60b7536"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:39:53 crc kubenswrapper[4719]: I1009 15:39:53.746169 4719 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d3a820d9-3c13-47ec-a39e-dea4d60b7536-server-conf\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:53 crc kubenswrapper[4719]: I1009 15:39:53.746203 4719 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d3a820d9-3c13-47ec-a39e-dea4d60b7536-pod-info\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:53 crc kubenswrapper[4719]: I1009 15:39:53.746223 4719 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 09 15:39:53 crc kubenswrapper[4719]: I1009 15:39:53.746232 4719 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d3a820d9-3c13-47ec-a39e-dea4d60b7536-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:53 crc kubenswrapper[4719]: I1009 15:39:53.746242 4719 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d3a820d9-3c13-47ec-a39e-dea4d60b7536-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:53 crc kubenswrapper[4719]: I1009 15:39:53.746252 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlpkb\" (UniqueName: 
\"kubernetes.io/projected/d3a820d9-3c13-47ec-a39e-dea4d60b7536-kube-api-access-rlpkb\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:53 crc kubenswrapper[4719]: I1009 15:39:53.746262 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d3a820d9-3c13-47ec-a39e-dea4d60b7536-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:53 crc kubenswrapper[4719]: I1009 15:39:53.746271 4719 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d3a820d9-3c13-47ec-a39e-dea4d60b7536-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:53 crc kubenswrapper[4719]: I1009 15:39:53.773203 4719 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 09 15:39:53 crc kubenswrapper[4719]: I1009 15:39:53.823451 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3a820d9-3c13-47ec-a39e-dea4d60b7536-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "d3a820d9-3c13-47ec-a39e-dea4d60b7536" (UID: "d3a820d9-3c13-47ec-a39e-dea4d60b7536"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:39:53 crc kubenswrapper[4719]: I1009 15:39:53.847753 4719 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:53 crc kubenswrapper[4719]: I1009 15:39:53.847785 4719 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d3a820d9-3c13-47ec-a39e-dea4d60b7536-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.133842 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.250841 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d3a820d9-3c13-47ec-a39e-dea4d60b7536","Type":"ContainerDied","Data":"b0cf669e83b7b9b6df49c16b1c59f0a3249877edc9e4308a1a96a020ff89e25b"} Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.250857 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.250913 4719 scope.go:117] "RemoveContainer" containerID="8611d72af232bf969bfd43aaa526aa9e349e35e4283ddc18128023360f6a6345" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.254087 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c8f5a6f9-5554-485d-9aee-47449402e37b-server-conf\") pod \"c8f5a6f9-5554-485d-9aee-47449402e37b\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.254150 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c8f5a6f9-5554-485d-9aee-47449402e37b-rabbitmq-confd\") pod \"c8f5a6f9-5554-485d-9aee-47449402e37b\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.254178 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c8f5a6f9-5554-485d-9aee-47449402e37b-plugins-conf\") pod \"c8f5a6f9-5554-485d-9aee-47449402e37b\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.254204 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/c8f5a6f9-5554-485d-9aee-47449402e37b-config-data\") pod \"c8f5a6f9-5554-485d-9aee-47449402e37b\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.254278 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c8f5a6f9-5554-485d-9aee-47449402e37b-rabbitmq-tls\") pod \"c8f5a6f9-5554-485d-9aee-47449402e37b\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.254310 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c8f5a6f9-5554-485d-9aee-47449402e37b-pod-info\") pod \"c8f5a6f9-5554-485d-9aee-47449402e37b\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.254336 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"c8f5a6f9-5554-485d-9aee-47449402e37b\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.254562 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c8f5a6f9-5554-485d-9aee-47449402e37b-rabbitmq-plugins\") pod \"c8f5a6f9-5554-485d-9aee-47449402e37b\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.254608 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-565ml\" (UniqueName: \"kubernetes.io/projected/c8f5a6f9-5554-485d-9aee-47449402e37b-kube-api-access-565ml\") pod \"c8f5a6f9-5554-485d-9aee-47449402e37b\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.254665 4719 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c8f5a6f9-5554-485d-9aee-47449402e37b-erlang-cookie-secret\") pod \"c8f5a6f9-5554-485d-9aee-47449402e37b\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.254719 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c8f5a6f9-5554-485d-9aee-47449402e37b-rabbitmq-erlang-cookie\") pod \"c8f5a6f9-5554-485d-9aee-47449402e37b\" (UID: \"c8f5a6f9-5554-485d-9aee-47449402e37b\") " Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.255592 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8f5a6f9-5554-485d-9aee-47449402e37b-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "c8f5a6f9-5554-485d-9aee-47449402e37b" (UID: "c8f5a6f9-5554-485d-9aee-47449402e37b"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.256078 4719 generic.go:334] "Generic (PLEG): container finished" podID="c8f5a6f9-5554-485d-9aee-47449402e37b" containerID="9078d3710db1f54276251a04386eaa1cc78f42b433fab5cf22a1239ba1607a6d" exitCode=0 Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.256125 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c8f5a6f9-5554-485d-9aee-47449402e37b","Type":"ContainerDied","Data":"9078d3710db1f54276251a04386eaa1cc78f42b433fab5cf22a1239ba1607a6d"} Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.256153 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c8f5a6f9-5554-485d-9aee-47449402e37b","Type":"ContainerDied","Data":"09b8626743fb2352967231ddfa8b0c9daa6f27a4a6802bfa250b78eefc657800"} Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.256232 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.256818 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8f5a6f9-5554-485d-9aee-47449402e37b-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "c8f5a6f9-5554-485d-9aee-47449402e37b" (UID: "c8f5a6f9-5554-485d-9aee-47449402e37b"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.257567 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8f5a6f9-5554-485d-9aee-47449402e37b-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "c8f5a6f9-5554-485d-9aee-47449402e37b" (UID: "c8f5a6f9-5554-485d-9aee-47449402e37b"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.261257 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8f5a6f9-5554-485d-9aee-47449402e37b-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "c8f5a6f9-5554-485d-9aee-47449402e37b" (UID: "c8f5a6f9-5554-485d-9aee-47449402e37b"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.261723 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8f5a6f9-5554-485d-9aee-47449402e37b-kube-api-access-565ml" (OuterVolumeSpecName: "kube-api-access-565ml") pod "c8f5a6f9-5554-485d-9aee-47449402e37b" (UID: "c8f5a6f9-5554-485d-9aee-47449402e37b"). InnerVolumeSpecName "kube-api-access-565ml". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.263861 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/c8f5a6f9-5554-485d-9aee-47449402e37b-pod-info" (OuterVolumeSpecName: "pod-info") pod "c8f5a6f9-5554-485d-9aee-47449402e37b" (UID: "c8f5a6f9-5554-485d-9aee-47449402e37b"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.266987 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8f5a6f9-5554-485d-9aee-47449402e37b-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "c8f5a6f9-5554-485d-9aee-47449402e37b" (UID: "c8f5a6f9-5554-485d-9aee-47449402e37b"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.299572 4719 scope.go:117] "RemoveContainer" containerID="72455dcfecc11206bc40520b8d088cd8e9d4106ebbf33631ff25928e9b1dc487" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.299597 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "persistence") pod "c8f5a6f9-5554-485d-9aee-47449402e37b" (UID: "c8f5a6f9-5554-485d-9aee-47449402e37b"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.311028 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8f5a6f9-5554-485d-9aee-47449402e37b-config-data" (OuterVolumeSpecName: "config-data") pod "c8f5a6f9-5554-485d-9aee-47449402e37b" (UID: "c8f5a6f9-5554-485d-9aee-47449402e37b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.342871 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8f5a6f9-5554-485d-9aee-47449402e37b-server-conf" (OuterVolumeSpecName: "server-conf") pod "c8f5a6f9-5554-485d-9aee-47449402e37b" (UID: "c8f5a6f9-5554-485d-9aee-47449402e37b"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.356815 4719 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.356848 4719 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c8f5a6f9-5554-485d-9aee-47449402e37b-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.356858 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-565ml\" (UniqueName: \"kubernetes.io/projected/c8f5a6f9-5554-485d-9aee-47449402e37b-kube-api-access-565ml\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.356867 4719 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c8f5a6f9-5554-485d-9aee-47449402e37b-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.356876 4719 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c8f5a6f9-5554-485d-9aee-47449402e37b-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.356884 4719 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c8f5a6f9-5554-485d-9aee-47449402e37b-server-conf\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.356892 4719 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c8f5a6f9-5554-485d-9aee-47449402e37b-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 
15:39:54.356900 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c8f5a6f9-5554-485d-9aee-47449402e37b-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.356907 4719 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c8f5a6f9-5554-485d-9aee-47449402e37b-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.356914 4719 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c8f5a6f9-5554-485d-9aee-47449402e37b-pod-info\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.380575 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8f5a6f9-5554-485d-9aee-47449402e37b-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "c8f5a6f9-5554-485d-9aee-47449402e37b" (UID: "c8f5a6f9-5554-485d-9aee-47449402e37b"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.381213 4719 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.470294 4719 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.470339 4719 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c8f5a6f9-5554-485d-9aee-47449402e37b-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.473023 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.473161 4719 scope.go:117] "RemoveContainer" containerID="9078d3710db1f54276251a04386eaa1cc78f42b433fab5cf22a1239ba1607a6d" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.489709 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.506510 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 09 15:39:54 crc kubenswrapper[4719]: E1009 15:39:54.506937 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3a820d9-3c13-47ec-a39e-dea4d60b7536" containerName="rabbitmq" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.506952 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3a820d9-3c13-47ec-a39e-dea4d60b7536" containerName="rabbitmq" Oct 09 15:39:54 crc kubenswrapper[4719]: E1009 15:39:54.506971 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8f5a6f9-5554-485d-9aee-47449402e37b" 
containerName="rabbitmq" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.506977 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8f5a6f9-5554-485d-9aee-47449402e37b" containerName="rabbitmq" Oct 09 15:39:54 crc kubenswrapper[4719]: E1009 15:39:54.506989 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3a820d9-3c13-47ec-a39e-dea4d60b7536" containerName="setup-container" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.506996 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3a820d9-3c13-47ec-a39e-dea4d60b7536" containerName="setup-container" Oct 09 15:39:54 crc kubenswrapper[4719]: E1009 15:39:54.507012 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8f5a6f9-5554-485d-9aee-47449402e37b" containerName="setup-container" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.507017 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8f5a6f9-5554-485d-9aee-47449402e37b" containerName="setup-container" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.507203 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8f5a6f9-5554-485d-9aee-47449402e37b" containerName="rabbitmq" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.507222 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3a820d9-3c13-47ec-a39e-dea4d60b7536" containerName="rabbitmq" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.508275 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.516718 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.519279 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.520378 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.520787 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.520831 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.520921 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-rhngj" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.520977 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.521029 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.530517 4719 scope.go:117] "RemoveContainer" containerID="df5f9951dcb79d436b3ba8bc9106853f92f66edecb4cb1855c4a6cc3746f4ea4" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.564811 4719 scope.go:117] "RemoveContainer" containerID="9078d3710db1f54276251a04386eaa1cc78f42b433fab5cf22a1239ba1607a6d" Oct 09 15:39:54 crc kubenswrapper[4719]: E1009 15:39:54.565273 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9078d3710db1f54276251a04386eaa1cc78f42b433fab5cf22a1239ba1607a6d\": container with ID starting with 9078d3710db1f54276251a04386eaa1cc78f42b433fab5cf22a1239ba1607a6d not found: ID does not exist" containerID="9078d3710db1f54276251a04386eaa1cc78f42b433fab5cf22a1239ba1607a6d" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.565409 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9078d3710db1f54276251a04386eaa1cc78f42b433fab5cf22a1239ba1607a6d"} err="failed to get container status \"9078d3710db1f54276251a04386eaa1cc78f42b433fab5cf22a1239ba1607a6d\": rpc error: code = NotFound desc = could not find container \"9078d3710db1f54276251a04386eaa1cc78f42b433fab5cf22a1239ba1607a6d\": container with ID starting with 9078d3710db1f54276251a04386eaa1cc78f42b433fab5cf22a1239ba1607a6d not found: ID does not exist" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.565499 4719 scope.go:117] "RemoveContainer" containerID="df5f9951dcb79d436b3ba8bc9106853f92f66edecb4cb1855c4a6cc3746f4ea4" Oct 09 15:39:54 crc kubenswrapper[4719]: E1009 15:39:54.566840 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df5f9951dcb79d436b3ba8bc9106853f92f66edecb4cb1855c4a6cc3746f4ea4\": container with ID starting with df5f9951dcb79d436b3ba8bc9106853f92f66edecb4cb1855c4a6cc3746f4ea4 not found: ID does not exist" containerID="df5f9951dcb79d436b3ba8bc9106853f92f66edecb4cb1855c4a6cc3746f4ea4" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.566874 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df5f9951dcb79d436b3ba8bc9106853f92f66edecb4cb1855c4a6cc3746f4ea4"} err="failed to get container status \"df5f9951dcb79d436b3ba8bc9106853f92f66edecb4cb1855c4a6cc3746f4ea4\": rpc error: code = NotFound desc = could not find container \"df5f9951dcb79d436b3ba8bc9106853f92f66edecb4cb1855c4a6cc3746f4ea4\": container with ID 
starting with df5f9951dcb79d436b3ba8bc9106853f92f66edecb4cb1855c4a6cc3746f4ea4 not found: ID does not exist" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.572430 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cf67e8f5-acbb-4033-bcca-d9c86d2be88c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cf67e8f5-acbb-4033-bcca-d9c86d2be88c\") " pod="openstack/rabbitmq-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.572496 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cf67e8f5-acbb-4033-bcca-d9c86d2be88c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cf67e8f5-acbb-4033-bcca-d9c86d2be88c\") " pod="openstack/rabbitmq-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.572520 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cf67e8f5-acbb-4033-bcca-d9c86d2be88c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cf67e8f5-acbb-4033-bcca-d9c86d2be88c\") " pod="openstack/rabbitmq-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.572566 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cf67e8f5-acbb-4033-bcca-d9c86d2be88c-config-data\") pod \"rabbitmq-server-0\" (UID: \"cf67e8f5-acbb-4033-bcca-d9c86d2be88c\") " pod="openstack/rabbitmq-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.572591 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cf67e8f5-acbb-4033-bcca-d9c86d2be88c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: 
\"cf67e8f5-acbb-4033-bcca-d9c86d2be88c\") " pod="openstack/rabbitmq-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.572614 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"cf67e8f5-acbb-4033-bcca-d9c86d2be88c\") " pod="openstack/rabbitmq-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.572637 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cf67e8f5-acbb-4033-bcca-d9c86d2be88c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cf67e8f5-acbb-4033-bcca-d9c86d2be88c\") " pod="openstack/rabbitmq-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.572844 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cf67e8f5-acbb-4033-bcca-d9c86d2be88c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cf67e8f5-acbb-4033-bcca-d9c86d2be88c\") " pod="openstack/rabbitmq-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.572932 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtcfj\" (UniqueName: \"kubernetes.io/projected/cf67e8f5-acbb-4033-bcca-d9c86d2be88c-kube-api-access-qtcfj\") pod \"rabbitmq-server-0\" (UID: \"cf67e8f5-acbb-4033-bcca-d9c86d2be88c\") " pod="openstack/rabbitmq-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.573004 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cf67e8f5-acbb-4033-bcca-d9c86d2be88c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cf67e8f5-acbb-4033-bcca-d9c86d2be88c\") " pod="openstack/rabbitmq-server-0" Oct 09 
15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.573074 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cf67e8f5-acbb-4033-bcca-d9c86d2be88c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cf67e8f5-acbb-4033-bcca-d9c86d2be88c\") " pod="openstack/rabbitmq-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.594099 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.613532 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.626489 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.628288 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.631975 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.632402 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.632625 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.632857 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.632981 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.633903 4719 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.633902 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.636072 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-4c4vp" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.674580 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cf67e8f5-acbb-4033-bcca-d9c86d2be88c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cf67e8f5-acbb-4033-bcca-d9c86d2be88c\") " pod="openstack/rabbitmq-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.674629 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.674660 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.674679 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:39:54 
crc kubenswrapper[4719]: I1009 15:39:54.674706 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cf67e8f5-acbb-4033-bcca-d9c86d2be88c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cf67e8f5-acbb-4033-bcca-d9c86d2be88c\") " pod="openstack/rabbitmq-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.674725 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q7hk\" (UniqueName: \"kubernetes.io/projected/256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8-kube-api-access-8q7hk\") pod \"rabbitmq-cell1-server-0\" (UID: \"256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.674755 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cf67e8f5-acbb-4033-bcca-d9c86d2be88c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cf67e8f5-acbb-4033-bcca-d9c86d2be88c\") " pod="openstack/rabbitmq-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.674771 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.674806 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cf67e8f5-acbb-4033-bcca-d9c86d2be88c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cf67e8f5-acbb-4033-bcca-d9c86d2be88c\") " pod="openstack/rabbitmq-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.674827 4719 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cf67e8f5-acbb-4033-bcca-d9c86d2be88c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cf67e8f5-acbb-4033-bcca-d9c86d2be88c\") " pod="openstack/rabbitmq-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.674864 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.674902 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cf67e8f5-acbb-4033-bcca-d9c86d2be88c-config-data\") pod \"rabbitmq-server-0\" (UID: \"cf67e8f5-acbb-4033-bcca-d9c86d2be88c\") " pod="openstack/rabbitmq-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.674927 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cf67e8f5-acbb-4033-bcca-d9c86d2be88c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cf67e8f5-acbb-4033-bcca-d9c86d2be88c\") " pod="openstack/rabbitmq-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.675105 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.675134 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"cf67e8f5-acbb-4033-bcca-d9c86d2be88c\") " pod="openstack/rabbitmq-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.675155 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cf67e8f5-acbb-4033-bcca-d9c86d2be88c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cf67e8f5-acbb-4033-bcca-d9c86d2be88c\") " pod="openstack/rabbitmq-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.675201 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.675228 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cf67e8f5-acbb-4033-bcca-d9c86d2be88c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cf67e8f5-acbb-4033-bcca-d9c86d2be88c\") " pod="openstack/rabbitmq-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.675257 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.675297 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtcfj\" (UniqueName: \"kubernetes.io/projected/cf67e8f5-acbb-4033-bcca-d9c86d2be88c-kube-api-access-qtcfj\") pod \"rabbitmq-server-0\" (UID: 
\"cf67e8f5-acbb-4033-bcca-d9c86d2be88c\") " pod="openstack/rabbitmq-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.675331 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.675371 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.675936 4719 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"cf67e8f5-acbb-4033-bcca-d9c86d2be88c\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.677702 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cf67e8f5-acbb-4033-bcca-d9c86d2be88c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cf67e8f5-acbb-4033-bcca-d9c86d2be88c\") " pod="openstack/rabbitmq-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.678257 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cf67e8f5-acbb-4033-bcca-d9c86d2be88c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cf67e8f5-acbb-4033-bcca-d9c86d2be88c\") " pod="openstack/rabbitmq-server-0" Oct 09 15:39:54 crc 
kubenswrapper[4719]: I1009 15:39:54.678677 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cf67e8f5-acbb-4033-bcca-d9c86d2be88c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cf67e8f5-acbb-4033-bcca-d9c86d2be88c\") " pod="openstack/rabbitmq-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.680616 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cf67e8f5-acbb-4033-bcca-d9c86d2be88c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cf67e8f5-acbb-4033-bcca-d9c86d2be88c\") " pod="openstack/rabbitmq-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.682959 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cf67e8f5-acbb-4033-bcca-d9c86d2be88c-config-data\") pod \"rabbitmq-server-0\" (UID: \"cf67e8f5-acbb-4033-bcca-d9c86d2be88c\") " pod="openstack/rabbitmq-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.685107 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cf67e8f5-acbb-4033-bcca-d9c86d2be88c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cf67e8f5-acbb-4033-bcca-d9c86d2be88c\") " pod="openstack/rabbitmq-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.685113 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cf67e8f5-acbb-4033-bcca-d9c86d2be88c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cf67e8f5-acbb-4033-bcca-d9c86d2be88c\") " pod="openstack/rabbitmq-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.685734 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/cf67e8f5-acbb-4033-bcca-d9c86d2be88c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cf67e8f5-acbb-4033-bcca-d9c86d2be88c\") " pod="openstack/rabbitmq-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.686606 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cf67e8f5-acbb-4033-bcca-d9c86d2be88c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cf67e8f5-acbb-4033-bcca-d9c86d2be88c\") " pod="openstack/rabbitmq-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.695244 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtcfj\" (UniqueName: \"kubernetes.io/projected/cf67e8f5-acbb-4033-bcca-d9c86d2be88c-kube-api-access-qtcfj\") pod \"rabbitmq-server-0\" (UID: \"cf67e8f5-acbb-4033-bcca-d9c86d2be88c\") " pod="openstack/rabbitmq-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.731987 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"cf67e8f5-acbb-4033-bcca-d9c86d2be88c\") " pod="openstack/rabbitmq-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.777218 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.777299 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.777384 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.777422 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.777452 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.777475 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.777514 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.777538 4719 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.777877 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.778274 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.777550 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.778338 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.778404 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-8q7hk\" (UniqueName: \"kubernetes.io/projected/256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8-kube-api-access-8q7hk\") pod \"rabbitmq-cell1-server-0\" (UID: \"256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.778456 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.778651 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.778817 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.779692 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.780652 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8-pod-info\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.781404 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.781568 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.781817 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.800956 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q7hk\" (UniqueName: \"kubernetes.io/projected/256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8-kube-api-access-8q7hk\") pod \"rabbitmq-cell1-server-0\" (UID: \"256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.814080 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:39:54 crc 
kubenswrapper[4719]: I1009 15:39:54.830257 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 09 15:39:54 crc kubenswrapper[4719]: I1009 15:39:54.947808 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:39:55 crc kubenswrapper[4719]: I1009 15:39:55.185757 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8f5a6f9-5554-485d-9aee-47449402e37b" path="/var/lib/kubelet/pods/c8f5a6f9-5554-485d-9aee-47449402e37b/volumes" Oct 09 15:39:55 crc kubenswrapper[4719]: I1009 15:39:55.187483 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3a820d9-3c13-47ec-a39e-dea4d60b7536" path="/var/lib/kubelet/pods/d3a820d9-3c13-47ec-a39e-dea4d60b7536/volumes" Oct 09 15:39:55 crc kubenswrapper[4719]: I1009 15:39:55.322923 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 09 15:39:55 crc kubenswrapper[4719]: I1009 15:39:55.420940 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 15:39:55 crc kubenswrapper[4719]: W1009 15:39:55.431401 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod256b6f76_4a1f_43d5_bbe4_fcc45d0a59b8.slice/crio-f17c67911ec16871696c833b5f216e22afa3111ef0970f0974be787c07fdaa12 WatchSource:0}: Error finding container f17c67911ec16871696c833b5f216e22afa3111ef0970f0974be787c07fdaa12: Status 404 returned error can't find the container with id f17c67911ec16871696c833b5f216e22afa3111ef0970f0974be787c07fdaa12 Oct 09 15:39:56 crc kubenswrapper[4719]: I1009 15:39:56.285244 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cf67e8f5-acbb-4033-bcca-d9c86d2be88c","Type":"ContainerStarted","Data":"53c3a121c885b5418eafdc5e5505ae34553b17f8e3514344ba27ae48bccfb37a"} Oct 09 
15:39:56 crc kubenswrapper[4719]: I1009 15:39:56.287072 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8","Type":"ContainerStarted","Data":"f17c67911ec16871696c833b5f216e22afa3111ef0970f0974be787c07fdaa12"} Oct 09 15:39:57 crc kubenswrapper[4719]: I1009 15:39:57.299456 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cf67e8f5-acbb-4033-bcca-d9c86d2be88c","Type":"ContainerStarted","Data":"0c89f3a8486bf9bc031c7c588da4eebf8e66ecb63aa1f76dcefa8866571c8143"} Oct 09 15:39:57 crc kubenswrapper[4719]: I1009 15:39:57.302698 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8","Type":"ContainerStarted","Data":"f193a06c5855ca39fccf1c1939d16dfce34290c75726a3036e10220efeebe4aa"} Oct 09 15:40:03 crc kubenswrapper[4719]: I1009 15:40:03.112778 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cb846c9d9-5g7fj"] Oct 09 15:40:03 crc kubenswrapper[4719]: I1009 15:40:03.114984 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cb846c9d9-5g7fj" Oct 09 15:40:03 crc kubenswrapper[4719]: I1009 15:40:03.118077 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Oct 09 15:40:03 crc kubenswrapper[4719]: I1009 15:40:03.132668 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cb846c9d9-5g7fj"] Oct 09 15:40:03 crc kubenswrapper[4719]: I1009 15:40:03.153223 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-openstack-edpm-ipam\") pod \"dnsmasq-dns-6cb846c9d9-5g7fj\" (UID: \"a3f0a2fe-be13-41da-bb16-5d68b6c14de4\") " pod="openstack/dnsmasq-dns-6cb846c9d9-5g7fj" Oct 09 15:40:03 crc kubenswrapper[4719]: I1009 15:40:03.153309 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-dns-swift-storage-0\") pod \"dnsmasq-dns-6cb846c9d9-5g7fj\" (UID: \"a3f0a2fe-be13-41da-bb16-5d68b6c14de4\") " pod="openstack/dnsmasq-dns-6cb846c9d9-5g7fj" Oct 09 15:40:03 crc kubenswrapper[4719]: I1009 15:40:03.153341 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-config\") pod \"dnsmasq-dns-6cb846c9d9-5g7fj\" (UID: \"a3f0a2fe-be13-41da-bb16-5d68b6c14de4\") " pod="openstack/dnsmasq-dns-6cb846c9d9-5g7fj" Oct 09 15:40:03 crc kubenswrapper[4719]: I1009 15:40:03.153406 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-dns-svc\") pod \"dnsmasq-dns-6cb846c9d9-5g7fj\" (UID: \"a3f0a2fe-be13-41da-bb16-5d68b6c14de4\") " 
pod="openstack/dnsmasq-dns-6cb846c9d9-5g7fj" Oct 09 15:40:03 crc kubenswrapper[4719]: I1009 15:40:03.153496 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxtcv\" (UniqueName: \"kubernetes.io/projected/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-kube-api-access-xxtcv\") pod \"dnsmasq-dns-6cb846c9d9-5g7fj\" (UID: \"a3f0a2fe-be13-41da-bb16-5d68b6c14de4\") " pod="openstack/dnsmasq-dns-6cb846c9d9-5g7fj" Oct 09 15:40:03 crc kubenswrapper[4719]: I1009 15:40:03.153551 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-ovsdbserver-sb\") pod \"dnsmasq-dns-6cb846c9d9-5g7fj\" (UID: \"a3f0a2fe-be13-41da-bb16-5d68b6c14de4\") " pod="openstack/dnsmasq-dns-6cb846c9d9-5g7fj" Oct 09 15:40:03 crc kubenswrapper[4719]: I1009 15:40:03.153580 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-ovsdbserver-nb\") pod \"dnsmasq-dns-6cb846c9d9-5g7fj\" (UID: \"a3f0a2fe-be13-41da-bb16-5d68b6c14de4\") " pod="openstack/dnsmasq-dns-6cb846c9d9-5g7fj" Oct 09 15:40:03 crc kubenswrapper[4719]: I1009 15:40:03.255674 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-dns-swift-storage-0\") pod \"dnsmasq-dns-6cb846c9d9-5g7fj\" (UID: \"a3f0a2fe-be13-41da-bb16-5d68b6c14de4\") " pod="openstack/dnsmasq-dns-6cb846c9d9-5g7fj" Oct 09 15:40:03 crc kubenswrapper[4719]: I1009 15:40:03.255797 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-config\") pod \"dnsmasq-dns-6cb846c9d9-5g7fj\" (UID: 
\"a3f0a2fe-be13-41da-bb16-5d68b6c14de4\") " pod="openstack/dnsmasq-dns-6cb846c9d9-5g7fj" Oct 09 15:40:03 crc kubenswrapper[4719]: I1009 15:40:03.255878 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-dns-svc\") pod \"dnsmasq-dns-6cb846c9d9-5g7fj\" (UID: \"a3f0a2fe-be13-41da-bb16-5d68b6c14de4\") " pod="openstack/dnsmasq-dns-6cb846c9d9-5g7fj" Oct 09 15:40:03 crc kubenswrapper[4719]: I1009 15:40:03.256103 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxtcv\" (UniqueName: \"kubernetes.io/projected/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-kube-api-access-xxtcv\") pod \"dnsmasq-dns-6cb846c9d9-5g7fj\" (UID: \"a3f0a2fe-be13-41da-bb16-5d68b6c14de4\") " pod="openstack/dnsmasq-dns-6cb846c9d9-5g7fj" Oct 09 15:40:03 crc kubenswrapper[4719]: I1009 15:40:03.256209 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-ovsdbserver-sb\") pod \"dnsmasq-dns-6cb846c9d9-5g7fj\" (UID: \"a3f0a2fe-be13-41da-bb16-5d68b6c14de4\") " pod="openstack/dnsmasq-dns-6cb846c9d9-5g7fj" Oct 09 15:40:03 crc kubenswrapper[4719]: I1009 15:40:03.256267 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-ovsdbserver-nb\") pod \"dnsmasq-dns-6cb846c9d9-5g7fj\" (UID: \"a3f0a2fe-be13-41da-bb16-5d68b6c14de4\") " pod="openstack/dnsmasq-dns-6cb846c9d9-5g7fj" Oct 09 15:40:03 crc kubenswrapper[4719]: I1009 15:40:03.257415 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-ovsdbserver-nb\") pod \"dnsmasq-dns-6cb846c9d9-5g7fj\" (UID: \"a3f0a2fe-be13-41da-bb16-5d68b6c14de4\") " 
pod="openstack/dnsmasq-dns-6cb846c9d9-5g7fj" Oct 09 15:40:03 crc kubenswrapper[4719]: I1009 15:40:03.257437 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-openstack-edpm-ipam\") pod \"dnsmasq-dns-6cb846c9d9-5g7fj\" (UID: \"a3f0a2fe-be13-41da-bb16-5d68b6c14de4\") " pod="openstack/dnsmasq-dns-6cb846c9d9-5g7fj" Oct 09 15:40:03 crc kubenswrapper[4719]: I1009 15:40:03.257449 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-ovsdbserver-sb\") pod \"dnsmasq-dns-6cb846c9d9-5g7fj\" (UID: \"a3f0a2fe-be13-41da-bb16-5d68b6c14de4\") " pod="openstack/dnsmasq-dns-6cb846c9d9-5g7fj" Oct 09 15:40:03 crc kubenswrapper[4719]: I1009 15:40:03.257593 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-dns-swift-storage-0\") pod \"dnsmasq-dns-6cb846c9d9-5g7fj\" (UID: \"a3f0a2fe-be13-41da-bb16-5d68b6c14de4\") " pod="openstack/dnsmasq-dns-6cb846c9d9-5g7fj" Oct 09 15:40:03 crc kubenswrapper[4719]: I1009 15:40:03.257873 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-openstack-edpm-ipam\") pod \"dnsmasq-dns-6cb846c9d9-5g7fj\" (UID: \"a3f0a2fe-be13-41da-bb16-5d68b6c14de4\") " pod="openstack/dnsmasq-dns-6cb846c9d9-5g7fj" Oct 09 15:40:03 crc kubenswrapper[4719]: I1009 15:40:03.258164 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-config\") pod \"dnsmasq-dns-6cb846c9d9-5g7fj\" (UID: \"a3f0a2fe-be13-41da-bb16-5d68b6c14de4\") " pod="openstack/dnsmasq-dns-6cb846c9d9-5g7fj" Oct 09 15:40:03 crc 
kubenswrapper[4719]: I1009 15:40:03.258937 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-dns-svc\") pod \"dnsmasq-dns-6cb846c9d9-5g7fj\" (UID: \"a3f0a2fe-be13-41da-bb16-5d68b6c14de4\") " pod="openstack/dnsmasq-dns-6cb846c9d9-5g7fj" Oct 09 15:40:03 crc kubenswrapper[4719]: I1009 15:40:03.275826 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxtcv\" (UniqueName: \"kubernetes.io/projected/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-kube-api-access-xxtcv\") pod \"dnsmasq-dns-6cb846c9d9-5g7fj\" (UID: \"a3f0a2fe-be13-41da-bb16-5d68b6c14de4\") " pod="openstack/dnsmasq-dns-6cb846c9d9-5g7fj" Oct 09 15:40:03 crc kubenswrapper[4719]: I1009 15:40:03.436327 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cb846c9d9-5g7fj" Oct 09 15:40:03 crc kubenswrapper[4719]: I1009 15:40:03.874270 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cb846c9d9-5g7fj"] Oct 09 15:40:04 crc kubenswrapper[4719]: I1009 15:40:04.381641 4719 generic.go:334] "Generic (PLEG): container finished" podID="a3f0a2fe-be13-41da-bb16-5d68b6c14de4" containerID="d6a2915bfb2afc60553b83441c2b55c901203693a34c5b734c42433fa98b8859" exitCode=0 Oct 09 15:40:04 crc kubenswrapper[4719]: I1009 15:40:04.381737 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb846c9d9-5g7fj" event={"ID":"a3f0a2fe-be13-41da-bb16-5d68b6c14de4","Type":"ContainerDied","Data":"d6a2915bfb2afc60553b83441c2b55c901203693a34c5b734c42433fa98b8859"} Oct 09 15:40:04 crc kubenswrapper[4719]: I1009 15:40:04.381966 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb846c9d9-5g7fj" event={"ID":"a3f0a2fe-be13-41da-bb16-5d68b6c14de4","Type":"ContainerStarted","Data":"1659d71c3c959b0b472e70f0c2cbad1953e3c504919de24b40d09dc8c1ccb0af"} Oct 09 15:40:05 crc 
kubenswrapper[4719]: I1009 15:40:05.391787 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb846c9d9-5g7fj" event={"ID":"a3f0a2fe-be13-41da-bb16-5d68b6c14de4","Type":"ContainerStarted","Data":"3f3e45fd5d839263a7a9ccdb3c56692cd7a0ab79628512ad452c31b55488823e"} Oct 09 15:40:05 crc kubenswrapper[4719]: I1009 15:40:05.392714 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6cb846c9d9-5g7fj" Oct 09 15:40:05 crc kubenswrapper[4719]: I1009 15:40:05.409246 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6cb846c9d9-5g7fj" podStartSLOduration=2.40919441 podStartE2EDuration="2.40919441s" podCreationTimestamp="2025-10-09 15:40:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:40:05.408388385 +0000 UTC m=+1310.918099660" watchObservedRunningTime="2025-10-09 15:40:05.40919441 +0000 UTC m=+1310.918905685" Oct 09 15:40:13 crc kubenswrapper[4719]: I1009 15:40:13.438376 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6cb846c9d9-5g7fj" Oct 09 15:40:13 crc kubenswrapper[4719]: I1009 15:40:13.523437 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66f8586597-6gv4h"] Oct 09 15:40:13 crc kubenswrapper[4719]: I1009 15:40:13.523698 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-66f8586597-6gv4h" podUID="8c6a2f16-2424-4091-97d8-6e5dc05d37a6" containerName="dnsmasq-dns" containerID="cri-o://2ac08dbed9e2e6db28246046df1340751706e82529cc7966a834efd45ee905ef" gracePeriod=10 Oct 09 15:40:13 crc kubenswrapper[4719]: I1009 15:40:13.642514 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-756dcbffdc-f275x"] Oct 09 15:40:13 crc kubenswrapper[4719]: I1009 15:40:13.652983 4719 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-756dcbffdc-f275x" Oct 09 15:40:13 crc kubenswrapper[4719]: I1009 15:40:13.670945 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-756dcbffdc-f275x"] Oct 09 15:40:13 crc kubenswrapper[4719]: I1009 15:40:13.672931 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78bd028a-6324-402f-80e4-5a712e07bfb6-dns-svc\") pod \"dnsmasq-dns-756dcbffdc-f275x\" (UID: \"78bd028a-6324-402f-80e4-5a712e07bfb6\") " pod="openstack/dnsmasq-dns-756dcbffdc-f275x" Oct 09 15:40:13 crc kubenswrapper[4719]: I1009 15:40:13.673026 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/78bd028a-6324-402f-80e4-5a712e07bfb6-dns-swift-storage-0\") pod \"dnsmasq-dns-756dcbffdc-f275x\" (UID: \"78bd028a-6324-402f-80e4-5a712e07bfb6\") " pod="openstack/dnsmasq-dns-756dcbffdc-f275x" Oct 09 15:40:13 crc kubenswrapper[4719]: I1009 15:40:13.673087 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/78bd028a-6324-402f-80e4-5a712e07bfb6-openstack-edpm-ipam\") pod \"dnsmasq-dns-756dcbffdc-f275x\" (UID: \"78bd028a-6324-402f-80e4-5a712e07bfb6\") " pod="openstack/dnsmasq-dns-756dcbffdc-f275x" Oct 09 15:40:13 crc kubenswrapper[4719]: I1009 15:40:13.673157 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78bd028a-6324-402f-80e4-5a712e07bfb6-ovsdbserver-nb\") pod \"dnsmasq-dns-756dcbffdc-f275x\" (UID: \"78bd028a-6324-402f-80e4-5a712e07bfb6\") " pod="openstack/dnsmasq-dns-756dcbffdc-f275x" Oct 09 15:40:13 crc kubenswrapper[4719]: I1009 15:40:13.673601 4719 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwkf4\" (UniqueName: \"kubernetes.io/projected/78bd028a-6324-402f-80e4-5a712e07bfb6-kube-api-access-cwkf4\") pod \"dnsmasq-dns-756dcbffdc-f275x\" (UID: \"78bd028a-6324-402f-80e4-5a712e07bfb6\") " pod="openstack/dnsmasq-dns-756dcbffdc-f275x" Oct 09 15:40:13 crc kubenswrapper[4719]: I1009 15:40:13.673680 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78bd028a-6324-402f-80e4-5a712e07bfb6-config\") pod \"dnsmasq-dns-756dcbffdc-f275x\" (UID: \"78bd028a-6324-402f-80e4-5a712e07bfb6\") " pod="openstack/dnsmasq-dns-756dcbffdc-f275x" Oct 09 15:40:13 crc kubenswrapper[4719]: I1009 15:40:13.673843 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78bd028a-6324-402f-80e4-5a712e07bfb6-ovsdbserver-sb\") pod \"dnsmasq-dns-756dcbffdc-f275x\" (UID: \"78bd028a-6324-402f-80e4-5a712e07bfb6\") " pod="openstack/dnsmasq-dns-756dcbffdc-f275x" Oct 09 15:40:13 crc kubenswrapper[4719]: I1009 15:40:13.776591 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwkf4\" (UniqueName: \"kubernetes.io/projected/78bd028a-6324-402f-80e4-5a712e07bfb6-kube-api-access-cwkf4\") pod \"dnsmasq-dns-756dcbffdc-f275x\" (UID: \"78bd028a-6324-402f-80e4-5a712e07bfb6\") " pod="openstack/dnsmasq-dns-756dcbffdc-f275x" Oct 09 15:40:13 crc kubenswrapper[4719]: I1009 15:40:13.776644 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78bd028a-6324-402f-80e4-5a712e07bfb6-config\") pod \"dnsmasq-dns-756dcbffdc-f275x\" (UID: \"78bd028a-6324-402f-80e4-5a712e07bfb6\") " pod="openstack/dnsmasq-dns-756dcbffdc-f275x" Oct 09 15:40:13 crc kubenswrapper[4719]: I1009 15:40:13.776739 4719 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78bd028a-6324-402f-80e4-5a712e07bfb6-ovsdbserver-sb\") pod \"dnsmasq-dns-756dcbffdc-f275x\" (UID: \"78bd028a-6324-402f-80e4-5a712e07bfb6\") " pod="openstack/dnsmasq-dns-756dcbffdc-f275x" Oct 09 15:40:13 crc kubenswrapper[4719]: I1009 15:40:13.776769 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78bd028a-6324-402f-80e4-5a712e07bfb6-dns-svc\") pod \"dnsmasq-dns-756dcbffdc-f275x\" (UID: \"78bd028a-6324-402f-80e4-5a712e07bfb6\") " pod="openstack/dnsmasq-dns-756dcbffdc-f275x" Oct 09 15:40:13 crc kubenswrapper[4719]: I1009 15:40:13.776794 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/78bd028a-6324-402f-80e4-5a712e07bfb6-dns-swift-storage-0\") pod \"dnsmasq-dns-756dcbffdc-f275x\" (UID: \"78bd028a-6324-402f-80e4-5a712e07bfb6\") " pod="openstack/dnsmasq-dns-756dcbffdc-f275x" Oct 09 15:40:13 crc kubenswrapper[4719]: I1009 15:40:13.776821 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/78bd028a-6324-402f-80e4-5a712e07bfb6-openstack-edpm-ipam\") pod \"dnsmasq-dns-756dcbffdc-f275x\" (UID: \"78bd028a-6324-402f-80e4-5a712e07bfb6\") " pod="openstack/dnsmasq-dns-756dcbffdc-f275x" Oct 09 15:40:13 crc kubenswrapper[4719]: I1009 15:40:13.776853 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78bd028a-6324-402f-80e4-5a712e07bfb6-ovsdbserver-nb\") pod \"dnsmasq-dns-756dcbffdc-f275x\" (UID: \"78bd028a-6324-402f-80e4-5a712e07bfb6\") " pod="openstack/dnsmasq-dns-756dcbffdc-f275x" Oct 09 15:40:13 crc kubenswrapper[4719]: I1009 15:40:13.779056 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/78bd028a-6324-402f-80e4-5a712e07bfb6-config\") pod \"dnsmasq-dns-756dcbffdc-f275x\" (UID: \"78bd028a-6324-402f-80e4-5a712e07bfb6\") " pod="openstack/dnsmasq-dns-756dcbffdc-f275x" Oct 09 15:40:13 crc kubenswrapper[4719]: I1009 15:40:13.779807 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/78bd028a-6324-402f-80e4-5a712e07bfb6-dns-swift-storage-0\") pod \"dnsmasq-dns-756dcbffdc-f275x\" (UID: \"78bd028a-6324-402f-80e4-5a712e07bfb6\") " pod="openstack/dnsmasq-dns-756dcbffdc-f275x" Oct 09 15:40:13 crc kubenswrapper[4719]: I1009 15:40:13.780367 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78bd028a-6324-402f-80e4-5a712e07bfb6-ovsdbserver-sb\") pod \"dnsmasq-dns-756dcbffdc-f275x\" (UID: \"78bd028a-6324-402f-80e4-5a712e07bfb6\") " pod="openstack/dnsmasq-dns-756dcbffdc-f275x" Oct 09 15:40:13 crc kubenswrapper[4719]: I1009 15:40:13.780514 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/78bd028a-6324-402f-80e4-5a712e07bfb6-openstack-edpm-ipam\") pod \"dnsmasq-dns-756dcbffdc-f275x\" (UID: \"78bd028a-6324-402f-80e4-5a712e07bfb6\") " pod="openstack/dnsmasq-dns-756dcbffdc-f275x" Oct 09 15:40:13 crc kubenswrapper[4719]: I1009 15:40:13.781213 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78bd028a-6324-402f-80e4-5a712e07bfb6-ovsdbserver-nb\") pod \"dnsmasq-dns-756dcbffdc-f275x\" (UID: \"78bd028a-6324-402f-80e4-5a712e07bfb6\") " pod="openstack/dnsmasq-dns-756dcbffdc-f275x" Oct 09 15:40:13 crc kubenswrapper[4719]: I1009 15:40:13.781972 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/78bd028a-6324-402f-80e4-5a712e07bfb6-dns-svc\") pod \"dnsmasq-dns-756dcbffdc-f275x\" (UID: \"78bd028a-6324-402f-80e4-5a712e07bfb6\") " pod="openstack/dnsmasq-dns-756dcbffdc-f275x" Oct 09 15:40:13 crc kubenswrapper[4719]: I1009 15:40:13.798045 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwkf4\" (UniqueName: \"kubernetes.io/projected/78bd028a-6324-402f-80e4-5a712e07bfb6-kube-api-access-cwkf4\") pod \"dnsmasq-dns-756dcbffdc-f275x\" (UID: \"78bd028a-6324-402f-80e4-5a712e07bfb6\") " pod="openstack/dnsmasq-dns-756dcbffdc-f275x" Oct 09 15:40:14 crc kubenswrapper[4719]: I1009 15:40:14.007424 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-756dcbffdc-f275x" Oct 09 15:40:14 crc kubenswrapper[4719]: I1009 15:40:14.148292 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66f8586597-6gv4h" Oct 09 15:40:14 crc kubenswrapper[4719]: I1009 15:40:14.185832 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgb6m\" (UniqueName: \"kubernetes.io/projected/8c6a2f16-2424-4091-97d8-6e5dc05d37a6-kube-api-access-hgb6m\") pod \"8c6a2f16-2424-4091-97d8-6e5dc05d37a6\" (UID: \"8c6a2f16-2424-4091-97d8-6e5dc05d37a6\") " Oct 09 15:40:14 crc kubenswrapper[4719]: I1009 15:40:14.186078 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8c6a2f16-2424-4091-97d8-6e5dc05d37a6-dns-swift-storage-0\") pod \"8c6a2f16-2424-4091-97d8-6e5dc05d37a6\" (UID: \"8c6a2f16-2424-4091-97d8-6e5dc05d37a6\") " Oct 09 15:40:14 crc kubenswrapper[4719]: I1009 15:40:14.186135 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c6a2f16-2424-4091-97d8-6e5dc05d37a6-ovsdbserver-sb\") pod 
\"8c6a2f16-2424-4091-97d8-6e5dc05d37a6\" (UID: \"8c6a2f16-2424-4091-97d8-6e5dc05d37a6\") " Oct 09 15:40:14 crc kubenswrapper[4719]: I1009 15:40:14.186162 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c6a2f16-2424-4091-97d8-6e5dc05d37a6-dns-svc\") pod \"8c6a2f16-2424-4091-97d8-6e5dc05d37a6\" (UID: \"8c6a2f16-2424-4091-97d8-6e5dc05d37a6\") " Oct 09 15:40:14 crc kubenswrapper[4719]: I1009 15:40:14.186516 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c6a2f16-2424-4091-97d8-6e5dc05d37a6-config\") pod \"8c6a2f16-2424-4091-97d8-6e5dc05d37a6\" (UID: \"8c6a2f16-2424-4091-97d8-6e5dc05d37a6\") " Oct 09 15:40:14 crc kubenswrapper[4719]: I1009 15:40:14.186547 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c6a2f16-2424-4091-97d8-6e5dc05d37a6-ovsdbserver-nb\") pod \"8c6a2f16-2424-4091-97d8-6e5dc05d37a6\" (UID: \"8c6a2f16-2424-4091-97d8-6e5dc05d37a6\") " Oct 09 15:40:14 crc kubenswrapper[4719]: I1009 15:40:14.204520 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c6a2f16-2424-4091-97d8-6e5dc05d37a6-kube-api-access-hgb6m" (OuterVolumeSpecName: "kube-api-access-hgb6m") pod "8c6a2f16-2424-4091-97d8-6e5dc05d37a6" (UID: "8c6a2f16-2424-4091-97d8-6e5dc05d37a6"). InnerVolumeSpecName "kube-api-access-hgb6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:40:14 crc kubenswrapper[4719]: I1009 15:40:14.246764 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c6a2f16-2424-4091-97d8-6e5dc05d37a6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8c6a2f16-2424-4091-97d8-6e5dc05d37a6" (UID: "8c6a2f16-2424-4091-97d8-6e5dc05d37a6"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:40:14 crc kubenswrapper[4719]: I1009 15:40:14.255113 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c6a2f16-2424-4091-97d8-6e5dc05d37a6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8c6a2f16-2424-4091-97d8-6e5dc05d37a6" (UID: "8c6a2f16-2424-4091-97d8-6e5dc05d37a6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:40:14 crc kubenswrapper[4719]: I1009 15:40:14.269571 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c6a2f16-2424-4091-97d8-6e5dc05d37a6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8c6a2f16-2424-4091-97d8-6e5dc05d37a6" (UID: "8c6a2f16-2424-4091-97d8-6e5dc05d37a6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:40:14 crc kubenswrapper[4719]: I1009 15:40:14.270922 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c6a2f16-2424-4091-97d8-6e5dc05d37a6-config" (OuterVolumeSpecName: "config") pod "8c6a2f16-2424-4091-97d8-6e5dc05d37a6" (UID: "8c6a2f16-2424-4091-97d8-6e5dc05d37a6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:40:14 crc kubenswrapper[4719]: I1009 15:40:14.276690 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c6a2f16-2424-4091-97d8-6e5dc05d37a6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8c6a2f16-2424-4091-97d8-6e5dc05d37a6" (UID: "8c6a2f16-2424-4091-97d8-6e5dc05d37a6"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:40:14 crc kubenswrapper[4719]: I1009 15:40:14.291802 4719 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8c6a2f16-2424-4091-97d8-6e5dc05d37a6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 09 15:40:14 crc kubenswrapper[4719]: I1009 15:40:14.291824 4719 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c6a2f16-2424-4091-97d8-6e5dc05d37a6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 15:40:14 crc kubenswrapper[4719]: I1009 15:40:14.291870 4719 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c6a2f16-2424-4091-97d8-6e5dc05d37a6-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 15:40:14 crc kubenswrapper[4719]: I1009 15:40:14.291881 4719 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c6a2f16-2424-4091-97d8-6e5dc05d37a6-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:40:14 crc kubenswrapper[4719]: I1009 15:40:14.291890 4719 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c6a2f16-2424-4091-97d8-6e5dc05d37a6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 15:40:14 crc kubenswrapper[4719]: I1009 15:40:14.291899 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgb6m\" (UniqueName: \"kubernetes.io/projected/8c6a2f16-2424-4091-97d8-6e5dc05d37a6-kube-api-access-hgb6m\") on node \"crc\" DevicePath \"\"" Oct 09 15:40:14 crc kubenswrapper[4719]: I1009 15:40:14.473461 4719 generic.go:334] "Generic (PLEG): container finished" podID="8c6a2f16-2424-4091-97d8-6e5dc05d37a6" containerID="2ac08dbed9e2e6db28246046df1340751706e82529cc7966a834efd45ee905ef" exitCode=0 Oct 09 15:40:14 crc kubenswrapper[4719]: I1009 15:40:14.473505 4719 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66f8586597-6gv4h" event={"ID":"8c6a2f16-2424-4091-97d8-6e5dc05d37a6","Type":"ContainerDied","Data":"2ac08dbed9e2e6db28246046df1340751706e82529cc7966a834efd45ee905ef"} Oct 09 15:40:14 crc kubenswrapper[4719]: I1009 15:40:14.473532 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66f8586597-6gv4h" event={"ID":"8c6a2f16-2424-4091-97d8-6e5dc05d37a6","Type":"ContainerDied","Data":"468395e7beedf62a40cc721d19e5bfcb5a004418e47c7883874297cdb840e7fb"} Oct 09 15:40:14 crc kubenswrapper[4719]: I1009 15:40:14.473547 4719 scope.go:117] "RemoveContainer" containerID="2ac08dbed9e2e6db28246046df1340751706e82529cc7966a834efd45ee905ef" Oct 09 15:40:14 crc kubenswrapper[4719]: I1009 15:40:14.473663 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66f8586597-6gv4h" Oct 09 15:40:14 crc kubenswrapper[4719]: I1009 15:40:14.492706 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-756dcbffdc-f275x"] Oct 09 15:40:14 crc kubenswrapper[4719]: I1009 15:40:14.510991 4719 scope.go:117] "RemoveContainer" containerID="b63a8cff8817e9d33c85cb502d6125eef9a53a513adf6c4918b89de7ddfb4296" Oct 09 15:40:14 crc kubenswrapper[4719]: I1009 15:40:14.519887 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66f8586597-6gv4h"] Oct 09 15:40:14 crc kubenswrapper[4719]: I1009 15:40:14.531554 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-66f8586597-6gv4h"] Oct 09 15:40:14 crc kubenswrapper[4719]: I1009 15:40:14.545766 4719 scope.go:117] "RemoveContainer" containerID="2ac08dbed9e2e6db28246046df1340751706e82529cc7966a834efd45ee905ef" Oct 09 15:40:14 crc kubenswrapper[4719]: E1009 15:40:14.547927 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2ac08dbed9e2e6db28246046df1340751706e82529cc7966a834efd45ee905ef\": container with ID starting with 2ac08dbed9e2e6db28246046df1340751706e82529cc7966a834efd45ee905ef not found: ID does not exist" containerID="2ac08dbed9e2e6db28246046df1340751706e82529cc7966a834efd45ee905ef" Oct 09 15:40:14 crc kubenswrapper[4719]: I1009 15:40:14.547964 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ac08dbed9e2e6db28246046df1340751706e82529cc7966a834efd45ee905ef"} err="failed to get container status \"2ac08dbed9e2e6db28246046df1340751706e82529cc7966a834efd45ee905ef\": rpc error: code = NotFound desc = could not find container \"2ac08dbed9e2e6db28246046df1340751706e82529cc7966a834efd45ee905ef\": container with ID starting with 2ac08dbed9e2e6db28246046df1340751706e82529cc7966a834efd45ee905ef not found: ID does not exist" Oct 09 15:40:14 crc kubenswrapper[4719]: I1009 15:40:14.547987 4719 scope.go:117] "RemoveContainer" containerID="b63a8cff8817e9d33c85cb502d6125eef9a53a513adf6c4918b89de7ddfb4296" Oct 09 15:40:14 crc kubenswrapper[4719]: E1009 15:40:14.548302 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b63a8cff8817e9d33c85cb502d6125eef9a53a513adf6c4918b89de7ddfb4296\": container with ID starting with b63a8cff8817e9d33c85cb502d6125eef9a53a513adf6c4918b89de7ddfb4296 not found: ID does not exist" containerID="b63a8cff8817e9d33c85cb502d6125eef9a53a513adf6c4918b89de7ddfb4296" Oct 09 15:40:14 crc kubenswrapper[4719]: I1009 15:40:14.548344 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b63a8cff8817e9d33c85cb502d6125eef9a53a513adf6c4918b89de7ddfb4296"} err="failed to get container status \"b63a8cff8817e9d33c85cb502d6125eef9a53a513adf6c4918b89de7ddfb4296\": rpc error: code = NotFound desc = could not find container \"b63a8cff8817e9d33c85cb502d6125eef9a53a513adf6c4918b89de7ddfb4296\": container with ID 
starting with b63a8cff8817e9d33c85cb502d6125eef9a53a513adf6c4918b89de7ddfb4296 not found: ID does not exist" Oct 09 15:40:15 crc kubenswrapper[4719]: I1009 15:40:15.189124 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c6a2f16-2424-4091-97d8-6e5dc05d37a6" path="/var/lib/kubelet/pods/8c6a2f16-2424-4091-97d8-6e5dc05d37a6/volumes" Oct 09 15:40:15 crc kubenswrapper[4719]: I1009 15:40:15.486465 4719 generic.go:334] "Generic (PLEG): container finished" podID="78bd028a-6324-402f-80e4-5a712e07bfb6" containerID="2a2ef70dc062e6974181bd37c00704d8b9ba7828db8894421f8b3e9d30ebe6d9" exitCode=0 Oct 09 15:40:15 crc kubenswrapper[4719]: I1009 15:40:15.486566 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-756dcbffdc-f275x" event={"ID":"78bd028a-6324-402f-80e4-5a712e07bfb6","Type":"ContainerDied","Data":"2a2ef70dc062e6974181bd37c00704d8b9ba7828db8894421f8b3e9d30ebe6d9"} Oct 09 15:40:15 crc kubenswrapper[4719]: I1009 15:40:15.487537 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-756dcbffdc-f275x" event={"ID":"78bd028a-6324-402f-80e4-5a712e07bfb6","Type":"ContainerStarted","Data":"9eef615d6f4df80faa6c563f2bd5d8b527c1661e381bd19b28ccd4255e5fe43c"} Oct 09 15:40:16 crc kubenswrapper[4719]: I1009 15:40:16.501244 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-756dcbffdc-f275x" event={"ID":"78bd028a-6324-402f-80e4-5a712e07bfb6","Type":"ContainerStarted","Data":"b1d7b0490bda25af1b930329345a849a250e4c6f0b716eca375fb542fbe37869"} Oct 09 15:40:16 crc kubenswrapper[4719]: I1009 15:40:16.501855 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-756dcbffdc-f275x" Oct 09 15:40:16 crc kubenswrapper[4719]: I1009 15:40:16.522453 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-756dcbffdc-f275x" podStartSLOduration=3.522433726 podStartE2EDuration="3.522433726s" 
podCreationTimestamp="2025-10-09 15:40:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:40:16.516922491 +0000 UTC m=+1322.026633786" watchObservedRunningTime="2025-10-09 15:40:16.522433726 +0000 UTC m=+1322.032145011" Oct 09 15:40:24 crc kubenswrapper[4719]: I1009 15:40:24.009253 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-756dcbffdc-f275x" Oct 09 15:40:24 crc kubenswrapper[4719]: I1009 15:40:24.075679 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cb846c9d9-5g7fj"] Oct 09 15:40:24 crc kubenswrapper[4719]: I1009 15:40:24.075968 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6cb846c9d9-5g7fj" podUID="a3f0a2fe-be13-41da-bb16-5d68b6c14de4" containerName="dnsmasq-dns" containerID="cri-o://3f3e45fd5d839263a7a9ccdb3c56692cd7a0ab79628512ad452c31b55488823e" gracePeriod=10 Oct 09 15:40:24 crc kubenswrapper[4719]: I1009 15:40:24.579211 4719 generic.go:334] "Generic (PLEG): container finished" podID="a3f0a2fe-be13-41da-bb16-5d68b6c14de4" containerID="3f3e45fd5d839263a7a9ccdb3c56692cd7a0ab79628512ad452c31b55488823e" exitCode=0 Oct 09 15:40:24 crc kubenswrapper[4719]: I1009 15:40:24.579426 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb846c9d9-5g7fj" event={"ID":"a3f0a2fe-be13-41da-bb16-5d68b6c14de4","Type":"ContainerDied","Data":"3f3e45fd5d839263a7a9ccdb3c56692cd7a0ab79628512ad452c31b55488823e"} Oct 09 15:40:24 crc kubenswrapper[4719]: I1009 15:40:24.579804 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb846c9d9-5g7fj" event={"ID":"a3f0a2fe-be13-41da-bb16-5d68b6c14de4","Type":"ContainerDied","Data":"1659d71c3c959b0b472e70f0c2cbad1953e3c504919de24b40d09dc8c1ccb0af"} Oct 09 15:40:24 crc kubenswrapper[4719]: I1009 15:40:24.579877 4719 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1659d71c3c959b0b472e70f0c2cbad1953e3c504919de24b40d09dc8c1ccb0af" Oct 09 15:40:24 crc kubenswrapper[4719]: I1009 15:40:24.599443 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cb846c9d9-5g7fj" Oct 09 15:40:24 crc kubenswrapper[4719]: I1009 15:40:24.701572 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-ovsdbserver-sb\") pod \"a3f0a2fe-be13-41da-bb16-5d68b6c14de4\" (UID: \"a3f0a2fe-be13-41da-bb16-5d68b6c14de4\") " Oct 09 15:40:24 crc kubenswrapper[4719]: I1009 15:40:24.701616 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-dns-swift-storage-0\") pod \"a3f0a2fe-be13-41da-bb16-5d68b6c14de4\" (UID: \"a3f0a2fe-be13-41da-bb16-5d68b6c14de4\") " Oct 09 15:40:24 crc kubenswrapper[4719]: I1009 15:40:24.701693 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-dns-svc\") pod \"a3f0a2fe-be13-41da-bb16-5d68b6c14de4\" (UID: \"a3f0a2fe-be13-41da-bb16-5d68b6c14de4\") " Oct 09 15:40:24 crc kubenswrapper[4719]: I1009 15:40:24.701831 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-ovsdbserver-nb\") pod \"a3f0a2fe-be13-41da-bb16-5d68b6c14de4\" (UID: \"a3f0a2fe-be13-41da-bb16-5d68b6c14de4\") " Oct 09 15:40:24 crc kubenswrapper[4719]: I1009 15:40:24.701869 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-openstack-edpm-ipam\") pod \"a3f0a2fe-be13-41da-bb16-5d68b6c14de4\" (UID: \"a3f0a2fe-be13-41da-bb16-5d68b6c14de4\") " Oct 09 15:40:24 crc kubenswrapper[4719]: I1009 15:40:24.702158 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-config\") pod \"a3f0a2fe-be13-41da-bb16-5d68b6c14de4\" (UID: \"a3f0a2fe-be13-41da-bb16-5d68b6c14de4\") " Oct 09 15:40:24 crc kubenswrapper[4719]: I1009 15:40:24.702211 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxtcv\" (UniqueName: \"kubernetes.io/projected/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-kube-api-access-xxtcv\") pod \"a3f0a2fe-be13-41da-bb16-5d68b6c14de4\" (UID: \"a3f0a2fe-be13-41da-bb16-5d68b6c14de4\") " Oct 09 15:40:24 crc kubenswrapper[4719]: I1009 15:40:24.710260 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-kube-api-access-xxtcv" (OuterVolumeSpecName: "kube-api-access-xxtcv") pod "a3f0a2fe-be13-41da-bb16-5d68b6c14de4" (UID: "a3f0a2fe-be13-41da-bb16-5d68b6c14de4"). InnerVolumeSpecName "kube-api-access-xxtcv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:40:24 crc kubenswrapper[4719]: I1009 15:40:24.762290 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a3f0a2fe-be13-41da-bb16-5d68b6c14de4" (UID: "a3f0a2fe-be13-41da-bb16-5d68b6c14de4"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:40:24 crc kubenswrapper[4719]: I1009 15:40:24.762584 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a3f0a2fe-be13-41da-bb16-5d68b6c14de4" (UID: "a3f0a2fe-be13-41da-bb16-5d68b6c14de4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:40:24 crc kubenswrapper[4719]: I1009 15:40:24.763058 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "a3f0a2fe-be13-41da-bb16-5d68b6c14de4" (UID: "a3f0a2fe-be13-41da-bb16-5d68b6c14de4"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:40:24 crc kubenswrapper[4719]: I1009 15:40:24.772075 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a3f0a2fe-be13-41da-bb16-5d68b6c14de4" (UID: "a3f0a2fe-be13-41da-bb16-5d68b6c14de4"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:40:24 crc kubenswrapper[4719]: I1009 15:40:24.777446 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a3f0a2fe-be13-41da-bb16-5d68b6c14de4" (UID: "a3f0a2fe-be13-41da-bb16-5d68b6c14de4"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:40:24 crc kubenswrapper[4719]: I1009 15:40:24.780712 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-config" (OuterVolumeSpecName: "config") pod "a3f0a2fe-be13-41da-bb16-5d68b6c14de4" (UID: "a3f0a2fe-be13-41da-bb16-5d68b6c14de4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:40:24 crc kubenswrapper[4719]: I1009 15:40:24.804380 4719 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 15:40:24 crc kubenswrapper[4719]: I1009 15:40:24.804584 4719 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 09 15:40:24 crc kubenswrapper[4719]: I1009 15:40:24.804652 4719 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-config\") on node \"crc\" DevicePath \"\"" Oct 09 15:40:24 crc kubenswrapper[4719]: I1009 15:40:24.804711 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxtcv\" (UniqueName: \"kubernetes.io/projected/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-kube-api-access-xxtcv\") on node \"crc\" DevicePath \"\"" Oct 09 15:40:24 crc kubenswrapper[4719]: I1009 15:40:24.804770 4719 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 15:40:24 crc kubenswrapper[4719]: I1009 15:40:24.804824 4719 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 09 15:40:24 crc kubenswrapper[4719]: I1009 15:40:24.804875 4719 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3f0a2fe-be13-41da-bb16-5d68b6c14de4-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 15:40:25 crc kubenswrapper[4719]: I1009 15:40:25.587643 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cb846c9d9-5g7fj" Oct 09 15:40:25 crc kubenswrapper[4719]: I1009 15:40:25.617073 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cb846c9d9-5g7fj"] Oct 09 15:40:25 crc kubenswrapper[4719]: I1009 15:40:25.625652 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6cb846c9d9-5g7fj"] Oct 09 15:40:27 crc kubenswrapper[4719]: I1009 15:40:27.175249 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3f0a2fe-be13-41da-bb16-5d68b6c14de4" path="/var/lib/kubelet/pods/a3f0a2fe-be13-41da-bb16-5d68b6c14de4/volumes" Oct 09 15:40:29 crc kubenswrapper[4719]: I1009 15:40:29.625039 4719 generic.go:334] "Generic (PLEG): container finished" podID="cf67e8f5-acbb-4033-bcca-d9c86d2be88c" containerID="0c89f3a8486bf9bc031c7c588da4eebf8e66ecb63aa1f76dcefa8866571c8143" exitCode=0 Oct 09 15:40:29 crc kubenswrapper[4719]: I1009 15:40:29.625161 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cf67e8f5-acbb-4033-bcca-d9c86d2be88c","Type":"ContainerDied","Data":"0c89f3a8486bf9bc031c7c588da4eebf8e66ecb63aa1f76dcefa8866571c8143"} Oct 09 15:40:29 crc kubenswrapper[4719]: I1009 15:40:29.628763 4719 generic.go:334] "Generic (PLEG): container finished" podID="256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8" containerID="f193a06c5855ca39fccf1c1939d16dfce34290c75726a3036e10220efeebe4aa" exitCode=0 Oct 09 15:40:29 crc kubenswrapper[4719]: 
I1009 15:40:29.628802 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8","Type":"ContainerDied","Data":"f193a06c5855ca39fccf1c1939d16dfce34290c75726a3036e10220efeebe4aa"} Oct 09 15:40:30 crc kubenswrapper[4719]: I1009 15:40:30.639624 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cf67e8f5-acbb-4033-bcca-d9c86d2be88c","Type":"ContainerStarted","Data":"5a2a04d85623b8c10973ce36ff2d04aac9f3834d80956db0dec658a5ddf36b8d"} Oct 09 15:40:30 crc kubenswrapper[4719]: I1009 15:40:30.640170 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 09 15:40:30 crc kubenswrapper[4719]: I1009 15:40:30.641406 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8","Type":"ContainerStarted","Data":"6503b916ee38d5d354ac060e7b6c4e83467eda085c7293d9dce217d0766c8726"} Oct 09 15:40:30 crc kubenswrapper[4719]: I1009 15:40:30.641834 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:40:30 crc kubenswrapper[4719]: I1009 15:40:30.697951 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.697929259 podStartE2EDuration="36.697929259s" podCreationTimestamp="2025-10-09 15:39:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:40:30.692148216 +0000 UTC m=+1336.201859561" watchObservedRunningTime="2025-10-09 15:40:30.697929259 +0000 UTC m=+1336.207640554" Oct 09 15:40:30 crc kubenswrapper[4719]: I1009 15:40:30.699450 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.699441198 
podStartE2EDuration="36.699441198s" podCreationTimestamp="2025-10-09 15:39:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:40:30.670513778 +0000 UTC m=+1336.180225083" watchObservedRunningTime="2025-10-09 15:40:30.699441198 +0000 UTC m=+1336.209152483" Oct 09 15:40:36 crc kubenswrapper[4719]: I1009 15:40:36.654566 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-48j65"] Oct 09 15:40:36 crc kubenswrapper[4719]: E1009 15:40:36.655608 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c6a2f16-2424-4091-97d8-6e5dc05d37a6" containerName="dnsmasq-dns" Oct 09 15:40:36 crc kubenswrapper[4719]: I1009 15:40:36.655625 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c6a2f16-2424-4091-97d8-6e5dc05d37a6" containerName="dnsmasq-dns" Oct 09 15:40:36 crc kubenswrapper[4719]: E1009 15:40:36.655634 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3f0a2fe-be13-41da-bb16-5d68b6c14de4" containerName="dnsmasq-dns" Oct 09 15:40:36 crc kubenswrapper[4719]: I1009 15:40:36.655640 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3f0a2fe-be13-41da-bb16-5d68b6c14de4" containerName="dnsmasq-dns" Oct 09 15:40:36 crc kubenswrapper[4719]: E1009 15:40:36.655682 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c6a2f16-2424-4091-97d8-6e5dc05d37a6" containerName="init" Oct 09 15:40:36 crc kubenswrapper[4719]: I1009 15:40:36.655688 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c6a2f16-2424-4091-97d8-6e5dc05d37a6" containerName="init" Oct 09 15:40:36 crc kubenswrapper[4719]: E1009 15:40:36.655701 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3f0a2fe-be13-41da-bb16-5d68b6c14de4" containerName="init" Oct 09 15:40:36 crc kubenswrapper[4719]: I1009 15:40:36.655707 4719 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="a3f0a2fe-be13-41da-bb16-5d68b6c14de4" containerName="init" Oct 09 15:40:36 crc kubenswrapper[4719]: I1009 15:40:36.655893 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3f0a2fe-be13-41da-bb16-5d68b6c14de4" containerName="dnsmasq-dns" Oct 09 15:40:36 crc kubenswrapper[4719]: I1009 15:40:36.655905 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c6a2f16-2424-4091-97d8-6e5dc05d37a6" containerName="dnsmasq-dns" Oct 09 15:40:36 crc kubenswrapper[4719]: I1009 15:40:36.656576 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-48j65" Oct 09 15:40:36 crc kubenswrapper[4719]: I1009 15:40:36.660555 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 15:40:36 crc kubenswrapper[4719]: I1009 15:40:36.660617 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ssvsw" Oct 09 15:40:36 crc kubenswrapper[4719]: I1009 15:40:36.660838 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 15:40:36 crc kubenswrapper[4719]: I1009 15:40:36.661005 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 15:40:36 crc kubenswrapper[4719]: I1009 15:40:36.665665 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-48j65"] Oct 09 15:40:36 crc kubenswrapper[4719]: I1009 15:40:36.735957 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/35cce4cf-e1ff-44fb-9f62-887951a77275-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-48j65\" (UID: \"35cce4cf-e1ff-44fb-9f62-887951a77275\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-48j65" Oct 09 15:40:36 crc kubenswrapper[4719]: I1009 15:40:36.736258 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rvb8\" (UniqueName: \"kubernetes.io/projected/35cce4cf-e1ff-44fb-9f62-887951a77275-kube-api-access-5rvb8\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-48j65\" (UID: \"35cce4cf-e1ff-44fb-9f62-887951a77275\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-48j65" Oct 09 15:40:36 crc kubenswrapper[4719]: I1009 15:40:36.736378 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35cce4cf-e1ff-44fb-9f62-887951a77275-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-48j65\" (UID: \"35cce4cf-e1ff-44fb-9f62-887951a77275\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-48j65" Oct 09 15:40:36 crc kubenswrapper[4719]: I1009 15:40:36.736608 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35cce4cf-e1ff-44fb-9f62-887951a77275-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-48j65\" (UID: \"35cce4cf-e1ff-44fb-9f62-887951a77275\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-48j65" Oct 09 15:40:36 crc kubenswrapper[4719]: I1009 15:40:36.838972 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rvb8\" (UniqueName: \"kubernetes.io/projected/35cce4cf-e1ff-44fb-9f62-887951a77275-kube-api-access-5rvb8\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-48j65\" (UID: \"35cce4cf-e1ff-44fb-9f62-887951a77275\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-48j65" Oct 09 15:40:36 crc kubenswrapper[4719]: I1009 15:40:36.839047 4719 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35cce4cf-e1ff-44fb-9f62-887951a77275-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-48j65\" (UID: \"35cce4cf-e1ff-44fb-9f62-887951a77275\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-48j65" Oct 09 15:40:36 crc kubenswrapper[4719]: I1009 15:40:36.839144 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35cce4cf-e1ff-44fb-9f62-887951a77275-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-48j65\" (UID: \"35cce4cf-e1ff-44fb-9f62-887951a77275\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-48j65" Oct 09 15:40:36 crc kubenswrapper[4719]: I1009 15:40:36.839222 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/35cce4cf-e1ff-44fb-9f62-887951a77275-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-48j65\" (UID: \"35cce4cf-e1ff-44fb-9f62-887951a77275\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-48j65" Oct 09 15:40:36 crc kubenswrapper[4719]: I1009 15:40:36.845100 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/35cce4cf-e1ff-44fb-9f62-887951a77275-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-48j65\" (UID: \"35cce4cf-e1ff-44fb-9f62-887951a77275\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-48j65" Oct 09 15:40:36 crc kubenswrapper[4719]: I1009 15:40:36.845240 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35cce4cf-e1ff-44fb-9f62-887951a77275-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-48j65\" (UID: \"35cce4cf-e1ff-44fb-9f62-887951a77275\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-48j65" Oct 09 15:40:36 crc kubenswrapper[4719]: I1009 15:40:36.855377 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35cce4cf-e1ff-44fb-9f62-887951a77275-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-48j65\" (UID: \"35cce4cf-e1ff-44fb-9f62-887951a77275\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-48j65" Oct 09 15:40:36 crc kubenswrapper[4719]: I1009 15:40:36.881807 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rvb8\" (UniqueName: \"kubernetes.io/projected/35cce4cf-e1ff-44fb-9f62-887951a77275-kube-api-access-5rvb8\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-48j65\" (UID: \"35cce4cf-e1ff-44fb-9f62-887951a77275\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-48j65" Oct 09 15:40:36 crc kubenswrapper[4719]: I1009 15:40:36.974781 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-48j65" Oct 09 15:40:37 crc kubenswrapper[4719]: I1009 15:40:37.531328 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-48j65"] Oct 09 15:40:37 crc kubenswrapper[4719]: I1009 15:40:37.709263 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-48j65" event={"ID":"35cce4cf-e1ff-44fb-9f62-887951a77275","Type":"ContainerStarted","Data":"7503c95bf40340f812606d083319b26201354800cc92887c6bf7e0b877b2fe33"} Oct 09 15:40:44 crc kubenswrapper[4719]: I1009 15:40:44.834306 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 09 15:40:44 crc kubenswrapper[4719]: I1009 15:40:44.953633 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 09 15:40:47 crc kubenswrapper[4719]: E1009 15:40:47.451432 4719 log.go:32] "ImageFsInfo from image service failed" err="rpc error: code = Unknown desc = get image fs info unable to get usage for /var/lib/containers/storage/overlay-images: get disk usage for path /var/lib/containers/storage/overlay-images: lstat /var/lib/containers/storage/overlay-images/.tmp-images.json2711972416: no such file or directory" Oct 09 15:40:47 crc kubenswrapper[4719]: E1009 15:40:47.452106 4719 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get imageFs stats: missing image stats: nil" Oct 09 15:40:47 crc kubenswrapper[4719]: I1009 15:40:47.883395 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-48j65" event={"ID":"35cce4cf-e1ff-44fb-9f62-887951a77275","Type":"ContainerStarted","Data":"f1f38aa19eb9b0e0e85ca64f220cd8c25d649ae80042b1d17620a44eb71f8d82"} Oct 09 15:40:47 crc kubenswrapper[4719]: I1009 15:40:47.898058 4719 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-48j65" podStartSLOduration=1.9597242160000001 podStartE2EDuration="11.898039815s" podCreationTimestamp="2025-10-09 15:40:36 +0000 UTC" firstStartedPulling="2025-10-09 15:40:37.541110316 +0000 UTC m=+1343.050821601" lastFinishedPulling="2025-10-09 15:40:47.479425915 +0000 UTC m=+1352.989137200" observedRunningTime="2025-10-09 15:40:47.896269088 +0000 UTC m=+1353.405980363" watchObservedRunningTime="2025-10-09 15:40:47.898039815 +0000 UTC m=+1353.407751100" Oct 09 15:40:58 crc kubenswrapper[4719]: I1009 15:40:58.984935 4719 generic.go:334] "Generic (PLEG): container finished" podID="35cce4cf-e1ff-44fb-9f62-887951a77275" containerID="f1f38aa19eb9b0e0e85ca64f220cd8c25d649ae80042b1d17620a44eb71f8d82" exitCode=0 Oct 09 15:40:58 crc kubenswrapper[4719]: I1009 15:40:58.985023 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-48j65" event={"ID":"35cce4cf-e1ff-44fb-9f62-887951a77275","Type":"ContainerDied","Data":"f1f38aa19eb9b0e0e85ca64f220cd8c25d649ae80042b1d17620a44eb71f8d82"} Oct 09 15:41:00 crc kubenswrapper[4719]: I1009 15:41:00.440963 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-48j65" Oct 09 15:41:00 crc kubenswrapper[4719]: I1009 15:41:00.584634 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rvb8\" (UniqueName: \"kubernetes.io/projected/35cce4cf-e1ff-44fb-9f62-887951a77275-kube-api-access-5rvb8\") pod \"35cce4cf-e1ff-44fb-9f62-887951a77275\" (UID: \"35cce4cf-e1ff-44fb-9f62-887951a77275\") " Oct 09 15:41:00 crc kubenswrapper[4719]: I1009 15:41:00.584808 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35cce4cf-e1ff-44fb-9f62-887951a77275-repo-setup-combined-ca-bundle\") pod \"35cce4cf-e1ff-44fb-9f62-887951a77275\" (UID: \"35cce4cf-e1ff-44fb-9f62-887951a77275\") " Oct 09 15:41:00 crc kubenswrapper[4719]: I1009 15:41:00.584885 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/35cce4cf-e1ff-44fb-9f62-887951a77275-ssh-key\") pod \"35cce4cf-e1ff-44fb-9f62-887951a77275\" (UID: \"35cce4cf-e1ff-44fb-9f62-887951a77275\") " Oct 09 15:41:00 crc kubenswrapper[4719]: I1009 15:41:00.584974 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35cce4cf-e1ff-44fb-9f62-887951a77275-inventory\") pod \"35cce4cf-e1ff-44fb-9f62-887951a77275\" (UID: \"35cce4cf-e1ff-44fb-9f62-887951a77275\") " Oct 09 15:41:00 crc kubenswrapper[4719]: I1009 15:41:00.591386 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35cce4cf-e1ff-44fb-9f62-887951a77275-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "35cce4cf-e1ff-44fb-9f62-887951a77275" (UID: "35cce4cf-e1ff-44fb-9f62-887951a77275"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:41:00 crc kubenswrapper[4719]: I1009 15:41:00.591601 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35cce4cf-e1ff-44fb-9f62-887951a77275-kube-api-access-5rvb8" (OuterVolumeSpecName: "kube-api-access-5rvb8") pod "35cce4cf-e1ff-44fb-9f62-887951a77275" (UID: "35cce4cf-e1ff-44fb-9f62-887951a77275"). InnerVolumeSpecName "kube-api-access-5rvb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:41:00 crc kubenswrapper[4719]: I1009 15:41:00.613081 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35cce4cf-e1ff-44fb-9f62-887951a77275-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "35cce4cf-e1ff-44fb-9f62-887951a77275" (UID: "35cce4cf-e1ff-44fb-9f62-887951a77275"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:41:00 crc kubenswrapper[4719]: I1009 15:41:00.615702 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35cce4cf-e1ff-44fb-9f62-887951a77275-inventory" (OuterVolumeSpecName: "inventory") pod "35cce4cf-e1ff-44fb-9f62-887951a77275" (UID: "35cce4cf-e1ff-44fb-9f62-887951a77275"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:41:00 crc kubenswrapper[4719]: I1009 15:41:00.687725 4719 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35cce4cf-e1ff-44fb-9f62-887951a77275-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 15:41:00 crc kubenswrapper[4719]: I1009 15:41:00.687771 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rvb8\" (UniqueName: \"kubernetes.io/projected/35cce4cf-e1ff-44fb-9f62-887951a77275-kube-api-access-5rvb8\") on node \"crc\" DevicePath \"\"" Oct 09 15:41:00 crc kubenswrapper[4719]: I1009 15:41:00.687786 4719 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35cce4cf-e1ff-44fb-9f62-887951a77275-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:41:00 crc kubenswrapper[4719]: I1009 15:41:00.687798 4719 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/35cce4cf-e1ff-44fb-9f62-887951a77275-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 15:41:01 crc kubenswrapper[4719]: I1009 15:41:01.005574 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-48j65" event={"ID":"35cce4cf-e1ff-44fb-9f62-887951a77275","Type":"ContainerDied","Data":"7503c95bf40340f812606d083319b26201354800cc92887c6bf7e0b877b2fe33"} Oct 09 15:41:01 crc kubenswrapper[4719]: I1009 15:41:01.005616 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7503c95bf40340f812606d083319b26201354800cc92887c6bf7e0b877b2fe33" Oct 09 15:41:01 crc kubenswrapper[4719]: I1009 15:41:01.005592 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-48j65" Oct 09 15:41:01 crc kubenswrapper[4719]: I1009 15:41:01.134445 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-g2mb5"] Oct 09 15:41:01 crc kubenswrapper[4719]: E1009 15:41:01.134911 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35cce4cf-e1ff-44fb-9f62-887951a77275" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 09 15:41:01 crc kubenswrapper[4719]: I1009 15:41:01.134929 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="35cce4cf-e1ff-44fb-9f62-887951a77275" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 09 15:41:01 crc kubenswrapper[4719]: I1009 15:41:01.135158 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="35cce4cf-e1ff-44fb-9f62-887951a77275" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 09 15:41:01 crc kubenswrapper[4719]: I1009 15:41:01.136296 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g2mb5" Oct 09 15:41:01 crc kubenswrapper[4719]: I1009 15:41:01.138999 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ssvsw" Oct 09 15:41:01 crc kubenswrapper[4719]: I1009 15:41:01.140240 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 15:41:01 crc kubenswrapper[4719]: I1009 15:41:01.140314 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 15:41:01 crc kubenswrapper[4719]: I1009 15:41:01.140471 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 15:41:01 crc kubenswrapper[4719]: I1009 15:41:01.148695 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-g2mb5"] Oct 09 15:41:01 crc kubenswrapper[4719]: I1009 15:41:01.298397 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cbe17ac-7862-4175-9d90-10fe6c51cfb4-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-g2mb5\" (UID: \"2cbe17ac-7862-4175-9d90-10fe6c51cfb4\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g2mb5" Oct 09 15:41:01 crc kubenswrapper[4719]: I1009 15:41:01.298445 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cbe17ac-7862-4175-9d90-10fe6c51cfb4-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-g2mb5\" (UID: \"2cbe17ac-7862-4175-9d90-10fe6c51cfb4\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g2mb5" Oct 09 15:41:01 crc kubenswrapper[4719]: I1009 15:41:01.298722 4719 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njsp4\" (UniqueName: \"kubernetes.io/projected/2cbe17ac-7862-4175-9d90-10fe6c51cfb4-kube-api-access-njsp4\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-g2mb5\" (UID: \"2cbe17ac-7862-4175-9d90-10fe6c51cfb4\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g2mb5" Oct 09 15:41:01 crc kubenswrapper[4719]: I1009 15:41:01.400973 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njsp4\" (UniqueName: \"kubernetes.io/projected/2cbe17ac-7862-4175-9d90-10fe6c51cfb4-kube-api-access-njsp4\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-g2mb5\" (UID: \"2cbe17ac-7862-4175-9d90-10fe6c51cfb4\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g2mb5" Oct 09 15:41:01 crc kubenswrapper[4719]: I1009 15:41:01.401234 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cbe17ac-7862-4175-9d90-10fe6c51cfb4-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-g2mb5\" (UID: \"2cbe17ac-7862-4175-9d90-10fe6c51cfb4\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g2mb5" Oct 09 15:41:01 crc kubenswrapper[4719]: I1009 15:41:01.401325 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cbe17ac-7862-4175-9d90-10fe6c51cfb4-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-g2mb5\" (UID: \"2cbe17ac-7862-4175-9d90-10fe6c51cfb4\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g2mb5" Oct 09 15:41:01 crc kubenswrapper[4719]: I1009 15:41:01.404826 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cbe17ac-7862-4175-9d90-10fe6c51cfb4-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-g2mb5\" (UID: \"2cbe17ac-7862-4175-9d90-10fe6c51cfb4\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g2mb5"
Oct 09 15:41:01 crc kubenswrapper[4719]: I1009 15:41:01.412996 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cbe17ac-7862-4175-9d90-10fe6c51cfb4-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-g2mb5\" (UID: \"2cbe17ac-7862-4175-9d90-10fe6c51cfb4\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g2mb5"
Oct 09 15:41:01 crc kubenswrapper[4719]: I1009 15:41:01.417627 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njsp4\" (UniqueName: \"kubernetes.io/projected/2cbe17ac-7862-4175-9d90-10fe6c51cfb4-kube-api-access-njsp4\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-g2mb5\" (UID: \"2cbe17ac-7862-4175-9d90-10fe6c51cfb4\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g2mb5"
Oct 09 15:41:01 crc kubenswrapper[4719]: I1009 15:41:01.457637 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g2mb5"
Oct 09 15:41:01 crc kubenswrapper[4719]: I1009 15:41:01.943180 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-g2mb5"]
Oct 09 15:41:02 crc kubenswrapper[4719]: I1009 15:41:02.022436 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g2mb5" event={"ID":"2cbe17ac-7862-4175-9d90-10fe6c51cfb4","Type":"ContainerStarted","Data":"33ba7821ac876ca2684efca1cce861b728b97f348a9a54c0ae3e3ca4cc916d4e"}
Oct 09 15:41:03 crc kubenswrapper[4719]: I1009 15:41:03.031978 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g2mb5" event={"ID":"2cbe17ac-7862-4175-9d90-10fe6c51cfb4","Type":"ContainerStarted","Data":"e5ed79465e4a27730d184f69fa14e927056bf2d729b2e165cfd4e667c3254144"}
Oct 09 15:41:03 crc kubenswrapper[4719]: I1009 15:41:03.069472 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g2mb5" podStartSLOduration=1.49095733 podStartE2EDuration="2.067327531s" podCreationTimestamp="2025-10-09 15:41:01 +0000 UTC" firstStartedPulling="2025-10-09 15:41:01.94074591 +0000 UTC m=+1367.450457195" lastFinishedPulling="2025-10-09 15:41:02.517116111 +0000 UTC m=+1368.026827396" observedRunningTime="2025-10-09 15:41:03.046269973 +0000 UTC m=+1368.555981268" watchObservedRunningTime="2025-10-09 15:41:03.067327531 +0000 UTC m=+1368.577038816"
Oct 09 15:41:06 crc kubenswrapper[4719]: I1009 15:41:06.058238 4719 generic.go:334] "Generic (PLEG): container finished" podID="2cbe17ac-7862-4175-9d90-10fe6c51cfb4" containerID="e5ed79465e4a27730d184f69fa14e927056bf2d729b2e165cfd4e667c3254144" exitCode=0
Oct 09 15:41:06 crc kubenswrapper[4719]: I1009 15:41:06.058322 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g2mb5" event={"ID":"2cbe17ac-7862-4175-9d90-10fe6c51cfb4","Type":"ContainerDied","Data":"e5ed79465e4a27730d184f69fa14e927056bf2d729b2e165cfd4e667c3254144"}
Oct 09 15:41:07 crc kubenswrapper[4719]: I1009 15:41:07.496209 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g2mb5"
Oct 09 15:41:07 crc kubenswrapper[4719]: I1009 15:41:07.620733 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cbe17ac-7862-4175-9d90-10fe6c51cfb4-inventory\") pod \"2cbe17ac-7862-4175-9d90-10fe6c51cfb4\" (UID: \"2cbe17ac-7862-4175-9d90-10fe6c51cfb4\") "
Oct 09 15:41:07 crc kubenswrapper[4719]: I1009 15:41:07.620805 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njsp4\" (UniqueName: \"kubernetes.io/projected/2cbe17ac-7862-4175-9d90-10fe6c51cfb4-kube-api-access-njsp4\") pod \"2cbe17ac-7862-4175-9d90-10fe6c51cfb4\" (UID: \"2cbe17ac-7862-4175-9d90-10fe6c51cfb4\") "
Oct 09 15:41:07 crc kubenswrapper[4719]: I1009 15:41:07.621026 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cbe17ac-7862-4175-9d90-10fe6c51cfb4-ssh-key\") pod \"2cbe17ac-7862-4175-9d90-10fe6c51cfb4\" (UID: \"2cbe17ac-7862-4175-9d90-10fe6c51cfb4\") "
Oct 09 15:41:07 crc kubenswrapper[4719]: I1009 15:41:07.626159 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cbe17ac-7862-4175-9d90-10fe6c51cfb4-kube-api-access-njsp4" (OuterVolumeSpecName: "kube-api-access-njsp4") pod "2cbe17ac-7862-4175-9d90-10fe6c51cfb4" (UID: "2cbe17ac-7862-4175-9d90-10fe6c51cfb4"). InnerVolumeSpecName "kube-api-access-njsp4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 15:41:07 crc kubenswrapper[4719]: I1009 15:41:07.649398 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cbe17ac-7862-4175-9d90-10fe6c51cfb4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2cbe17ac-7862-4175-9d90-10fe6c51cfb4" (UID: "2cbe17ac-7862-4175-9d90-10fe6c51cfb4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 15:41:07 crc kubenswrapper[4719]: I1009 15:41:07.655308 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cbe17ac-7862-4175-9d90-10fe6c51cfb4-inventory" (OuterVolumeSpecName: "inventory") pod "2cbe17ac-7862-4175-9d90-10fe6c51cfb4" (UID: "2cbe17ac-7862-4175-9d90-10fe6c51cfb4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 15:41:07 crc kubenswrapper[4719]: I1009 15:41:07.722963 4719 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cbe17ac-7862-4175-9d90-10fe6c51cfb4-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 09 15:41:07 crc kubenswrapper[4719]: I1009 15:41:07.722995 4719 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cbe17ac-7862-4175-9d90-10fe6c51cfb4-inventory\") on node \"crc\" DevicePath \"\""
Oct 09 15:41:07 crc kubenswrapper[4719]: I1009 15:41:07.723005 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njsp4\" (UniqueName: \"kubernetes.io/projected/2cbe17ac-7862-4175-9d90-10fe6c51cfb4-kube-api-access-njsp4\") on node \"crc\" DevicePath \"\""
Oct 09 15:41:08 crc kubenswrapper[4719]: I1009 15:41:08.078192 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g2mb5" event={"ID":"2cbe17ac-7862-4175-9d90-10fe6c51cfb4","Type":"ContainerDied","Data":"33ba7821ac876ca2684efca1cce861b728b97f348a9a54c0ae3e3ca4cc916d4e"}
Oct 09 15:41:08 crc kubenswrapper[4719]: I1009 15:41:08.078232 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33ba7821ac876ca2684efca1cce861b728b97f348a9a54c0ae3e3ca4cc916d4e"
Oct 09 15:41:08 crc kubenswrapper[4719]: I1009 15:41:08.078285 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g2mb5"
Oct 09 15:41:08 crc kubenswrapper[4719]: I1009 15:41:08.141261 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jhrxh"]
Oct 09 15:41:08 crc kubenswrapper[4719]: E1009 15:41:08.141731 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cbe17ac-7862-4175-9d90-10fe6c51cfb4" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Oct 09 15:41:08 crc kubenswrapper[4719]: I1009 15:41:08.141749 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cbe17ac-7862-4175-9d90-10fe6c51cfb4" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Oct 09 15:41:08 crc kubenswrapper[4719]: I1009 15:41:08.141930 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cbe17ac-7862-4175-9d90-10fe6c51cfb4" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Oct 09 15:41:08 crc kubenswrapper[4719]: I1009 15:41:08.142652 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jhrxh"
Oct 09 15:41:08 crc kubenswrapper[4719]: I1009 15:41:08.144705 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ssvsw"
Oct 09 15:41:08 crc kubenswrapper[4719]: I1009 15:41:08.144722 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 09 15:41:08 crc kubenswrapper[4719]: I1009 15:41:08.144934 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 09 15:41:08 crc kubenswrapper[4719]: I1009 15:41:08.145222 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 09 15:41:08 crc kubenswrapper[4719]: I1009 15:41:08.166037 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jhrxh"]
Oct 09 15:41:08 crc kubenswrapper[4719]: I1009 15:41:08.333921 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/75c7240d-03e4-40f9-a915-c85892b060d9-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jhrxh\" (UID: \"75c7240d-03e4-40f9-a915-c85892b060d9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jhrxh"
Oct 09 15:41:08 crc kubenswrapper[4719]: I1009 15:41:08.334925 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82djm\" (UniqueName: \"kubernetes.io/projected/75c7240d-03e4-40f9-a915-c85892b060d9-kube-api-access-82djm\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jhrxh\" (UID: \"75c7240d-03e4-40f9-a915-c85892b060d9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jhrxh"
Oct 09 15:41:08 crc kubenswrapper[4719]: I1009 15:41:08.335134 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75c7240d-03e4-40f9-a915-c85892b060d9-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jhrxh\" (UID: \"75c7240d-03e4-40f9-a915-c85892b060d9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jhrxh"
Oct 09 15:41:08 crc kubenswrapper[4719]: I1009 15:41:08.335192 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/75c7240d-03e4-40f9-a915-c85892b060d9-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jhrxh\" (UID: \"75c7240d-03e4-40f9-a915-c85892b060d9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jhrxh"
Oct 09 15:41:08 crc kubenswrapper[4719]: I1009 15:41:08.436989 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82djm\" (UniqueName: \"kubernetes.io/projected/75c7240d-03e4-40f9-a915-c85892b060d9-kube-api-access-82djm\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jhrxh\" (UID: \"75c7240d-03e4-40f9-a915-c85892b060d9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jhrxh"
Oct 09 15:41:08 crc kubenswrapper[4719]: I1009 15:41:08.437084 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75c7240d-03e4-40f9-a915-c85892b060d9-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jhrxh\" (UID: \"75c7240d-03e4-40f9-a915-c85892b060d9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jhrxh"
Oct 09 15:41:08 crc kubenswrapper[4719]: I1009 15:41:08.437126 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/75c7240d-03e4-40f9-a915-c85892b060d9-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jhrxh\" (UID: \"75c7240d-03e4-40f9-a915-c85892b060d9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jhrxh"
Oct 09 15:41:08 crc kubenswrapper[4719]: I1009 15:41:08.437194 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/75c7240d-03e4-40f9-a915-c85892b060d9-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jhrxh\" (UID: \"75c7240d-03e4-40f9-a915-c85892b060d9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jhrxh"
Oct 09 15:41:08 crc kubenswrapper[4719]: I1009 15:41:08.441083 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/75c7240d-03e4-40f9-a915-c85892b060d9-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jhrxh\" (UID: \"75c7240d-03e4-40f9-a915-c85892b060d9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jhrxh"
Oct 09 15:41:08 crc kubenswrapper[4719]: I1009 15:41:08.441914 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75c7240d-03e4-40f9-a915-c85892b060d9-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jhrxh\" (UID: \"75c7240d-03e4-40f9-a915-c85892b060d9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jhrxh"
Oct 09 15:41:08 crc kubenswrapper[4719]: I1009 15:41:08.443535 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/75c7240d-03e4-40f9-a915-c85892b060d9-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jhrxh\" (UID: \"75c7240d-03e4-40f9-a915-c85892b060d9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jhrxh"
Oct 09 15:41:08 crc kubenswrapper[4719]: I1009 15:41:08.459112 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82djm\" (UniqueName: \"kubernetes.io/projected/75c7240d-03e4-40f9-a915-c85892b060d9-kube-api-access-82djm\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jhrxh\" (UID: \"75c7240d-03e4-40f9-a915-c85892b060d9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jhrxh"
Oct 09 15:41:08 crc kubenswrapper[4719]: I1009 15:41:08.467292 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jhrxh"
Oct 09 15:41:08 crc kubenswrapper[4719]: I1009 15:41:08.991797 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jhrxh"]
Oct 09 15:41:09 crc kubenswrapper[4719]: I1009 15:41:09.089055 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jhrxh" event={"ID":"75c7240d-03e4-40f9-a915-c85892b060d9","Type":"ContainerStarted","Data":"6b1d47eb69528d54d2ef5b0285608224d5f73454be42425086e8abb31d2118e6"}
Oct 09 15:41:10 crc kubenswrapper[4719]: I1009 15:41:10.108730 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jhrxh" event={"ID":"75c7240d-03e4-40f9-a915-c85892b060d9","Type":"ContainerStarted","Data":"d272e8838a829958f4d8597faa2d581851a9e330d271244ad5cde1166710d58f"}
Oct 09 15:41:10 crc kubenswrapper[4719]: I1009 15:41:10.133112 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jhrxh" podStartSLOduration=1.634614752 podStartE2EDuration="2.133095239s" podCreationTimestamp="2025-10-09 15:41:08 +0000 UTC" firstStartedPulling="2025-10-09 15:41:08.996675255 +0000 UTC m=+1374.506386540" lastFinishedPulling="2025-10-09 15:41:09.495155742 +0000 UTC m=+1375.004867027" observedRunningTime="2025-10-09 15:41:10.126711857 +0000 UTC m=+1375.636423142" watchObservedRunningTime="2025-10-09 15:41:10.133095239 +0000 UTC m=+1375.642806524"
Oct 09 15:41:21 crc kubenswrapper[4719]: I1009 15:41:21.517235 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-46b8q"]
Oct 09 15:41:21 crc kubenswrapper[4719]: I1009 15:41:21.522555 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-46b8q"
Oct 09 15:41:21 crc kubenswrapper[4719]: I1009 15:41:21.532302 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-46b8q"]
Oct 09 15:41:21 crc kubenswrapper[4719]: I1009 15:41:21.623200 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23b17d9f-68ed-4040-b073-cfb978c8e0df-utilities\") pod \"certified-operators-46b8q\" (UID: \"23b17d9f-68ed-4040-b073-cfb978c8e0df\") " pod="openshift-marketplace/certified-operators-46b8q"
Oct 09 15:41:21 crc kubenswrapper[4719]: I1009 15:41:21.623281 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75v6j\" (UniqueName: \"kubernetes.io/projected/23b17d9f-68ed-4040-b073-cfb978c8e0df-kube-api-access-75v6j\") pod \"certified-operators-46b8q\" (UID: \"23b17d9f-68ed-4040-b073-cfb978c8e0df\") " pod="openshift-marketplace/certified-operators-46b8q"
Oct 09 15:41:21 crc kubenswrapper[4719]: I1009 15:41:21.623317 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23b17d9f-68ed-4040-b073-cfb978c8e0df-catalog-content\") pod \"certified-operators-46b8q\" (UID: \"23b17d9f-68ed-4040-b073-cfb978c8e0df\") " pod="openshift-marketplace/certified-operators-46b8q"
Oct 09 15:41:21 crc kubenswrapper[4719]: I1009 15:41:21.725730 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23b17d9f-68ed-4040-b073-cfb978c8e0df-utilities\") pod \"certified-operators-46b8q\" (UID: \"23b17d9f-68ed-4040-b073-cfb978c8e0df\") " pod="openshift-marketplace/certified-operators-46b8q"
Oct 09 15:41:21 crc kubenswrapper[4719]: I1009 15:41:21.725839 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75v6j\" (UniqueName: \"kubernetes.io/projected/23b17d9f-68ed-4040-b073-cfb978c8e0df-kube-api-access-75v6j\") pod \"certified-operators-46b8q\" (UID: \"23b17d9f-68ed-4040-b073-cfb978c8e0df\") " pod="openshift-marketplace/certified-operators-46b8q"
Oct 09 15:41:21 crc kubenswrapper[4719]: I1009 15:41:21.725892 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23b17d9f-68ed-4040-b073-cfb978c8e0df-catalog-content\") pod \"certified-operators-46b8q\" (UID: \"23b17d9f-68ed-4040-b073-cfb978c8e0df\") " pod="openshift-marketplace/certified-operators-46b8q"
Oct 09 15:41:21 crc kubenswrapper[4719]: I1009 15:41:21.726213 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23b17d9f-68ed-4040-b073-cfb978c8e0df-utilities\") pod \"certified-operators-46b8q\" (UID: \"23b17d9f-68ed-4040-b073-cfb978c8e0df\") " pod="openshift-marketplace/certified-operators-46b8q"
Oct 09 15:41:21 crc kubenswrapper[4719]: I1009 15:41:21.726406 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23b17d9f-68ed-4040-b073-cfb978c8e0df-catalog-content\") pod \"certified-operators-46b8q\" (UID: \"23b17d9f-68ed-4040-b073-cfb978c8e0df\") " pod="openshift-marketplace/certified-operators-46b8q"
Oct 09 15:41:21 crc kubenswrapper[4719]: I1009 15:41:21.743949 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75v6j\" (UniqueName: \"kubernetes.io/projected/23b17d9f-68ed-4040-b073-cfb978c8e0df-kube-api-access-75v6j\") pod \"certified-operators-46b8q\" (UID: \"23b17d9f-68ed-4040-b073-cfb978c8e0df\") " pod="openshift-marketplace/certified-operators-46b8q"
Oct 09 15:41:21 crc kubenswrapper[4719]: I1009 15:41:21.842389 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-46b8q"
Oct 09 15:41:22 crc kubenswrapper[4719]: I1009 15:41:22.307838 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-46b8q"]
Oct 09 15:41:23 crc kubenswrapper[4719]: I1009 15:41:23.239638 4719 generic.go:334] "Generic (PLEG): container finished" podID="23b17d9f-68ed-4040-b073-cfb978c8e0df" containerID="4c9b9264d1eb2a71e922d1fc54cc767dded60b85223c4867d53eb0bc866cd0d1" exitCode=0
Oct 09 15:41:23 crc kubenswrapper[4719]: I1009 15:41:23.239773 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-46b8q" event={"ID":"23b17d9f-68ed-4040-b073-cfb978c8e0df","Type":"ContainerDied","Data":"4c9b9264d1eb2a71e922d1fc54cc767dded60b85223c4867d53eb0bc866cd0d1"}
Oct 09 15:41:23 crc kubenswrapper[4719]: I1009 15:41:23.239930 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-46b8q" event={"ID":"23b17d9f-68ed-4040-b073-cfb978c8e0df","Type":"ContainerStarted","Data":"5caebff8fb3bca89e76dc1a6f088b5208bf904507aff8e5b6ff36442984a20d0"}
Oct 09 15:41:24 crc kubenswrapper[4719]: I1009 15:41:24.195320 4719 scope.go:117] "RemoveContainer" containerID="4dd880f5549a37e3c5798cffed123243f1447b4dbaf302745557799f6c966e66"
Oct 09 15:41:24 crc kubenswrapper[4719]: I1009 15:41:24.251467 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-46b8q" event={"ID":"23b17d9f-68ed-4040-b073-cfb978c8e0df","Type":"ContainerStarted","Data":"35bf1a30abc32f4780f4b218d487f5299665fea14dfda516c46d4e63e8c2eaf1"}
Oct 09 15:41:25 crc kubenswrapper[4719]: I1009 15:41:25.262615 4719 generic.go:334] "Generic (PLEG): container finished" podID="23b17d9f-68ed-4040-b073-cfb978c8e0df" containerID="35bf1a30abc32f4780f4b218d487f5299665fea14dfda516c46d4e63e8c2eaf1" exitCode=0
Oct 09 15:41:25 crc kubenswrapper[4719]: I1009 15:41:25.262662 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-46b8q" event={"ID":"23b17d9f-68ed-4040-b073-cfb978c8e0df","Type":"ContainerDied","Data":"35bf1a30abc32f4780f4b218d487f5299665fea14dfda516c46d4e63e8c2eaf1"}
Oct 09 15:41:26 crc kubenswrapper[4719]: I1009 15:41:26.273225 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-46b8q" event={"ID":"23b17d9f-68ed-4040-b073-cfb978c8e0df","Type":"ContainerStarted","Data":"01c7569395afabee6fcf8106e8e187534729f35e3350abe2f05e71d83157e42e"}
Oct 09 15:41:26 crc kubenswrapper[4719]: I1009 15:41:26.297094 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-46b8q" podStartSLOduration=2.779378931 podStartE2EDuration="5.297076957s" podCreationTimestamp="2025-10-09 15:41:21 +0000 UTC" firstStartedPulling="2025-10-09 15:41:23.242233345 +0000 UTC m=+1388.751944630" lastFinishedPulling="2025-10-09 15:41:25.759931371 +0000 UTC m=+1391.269642656" observedRunningTime="2025-10-09 15:41:26.289705762 +0000 UTC m=+1391.799417057" watchObservedRunningTime="2025-10-09 15:41:26.297076957 +0000 UTC m=+1391.806788232"
Oct 09 15:41:31 crc kubenswrapper[4719]: I1009 15:41:31.843844 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-46b8q"
Oct 09 15:41:31 crc kubenswrapper[4719]: I1009 15:41:31.844423 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-46b8q"
Oct 09 15:41:31 crc kubenswrapper[4719]: I1009 15:41:31.889194 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-46b8q"
Oct 09 15:41:32 crc kubenswrapper[4719]: I1009 15:41:32.390696 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-46b8q"
Oct 09 15:41:32 crc kubenswrapper[4719]: I1009 15:41:32.438073 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-46b8q"]
Oct 09 15:41:34 crc kubenswrapper[4719]: I1009 15:41:34.339810 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-46b8q" podUID="23b17d9f-68ed-4040-b073-cfb978c8e0df" containerName="registry-server" containerID="cri-o://01c7569395afabee6fcf8106e8e187534729f35e3350abe2f05e71d83157e42e" gracePeriod=2
Oct 09 15:41:34 crc kubenswrapper[4719]: I1009 15:41:34.873602 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-46b8q"
Oct 09 15:41:34 crc kubenswrapper[4719]: I1009 15:41:34.894562 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23b17d9f-68ed-4040-b073-cfb978c8e0df-utilities\") pod \"23b17d9f-68ed-4040-b073-cfb978c8e0df\" (UID: \"23b17d9f-68ed-4040-b073-cfb978c8e0df\") "
Oct 09 15:41:34 crc kubenswrapper[4719]: I1009 15:41:34.894601 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23b17d9f-68ed-4040-b073-cfb978c8e0df-catalog-content\") pod \"23b17d9f-68ed-4040-b073-cfb978c8e0df\" (UID: \"23b17d9f-68ed-4040-b073-cfb978c8e0df\") "
Oct 09 15:41:34 crc kubenswrapper[4719]: I1009 15:41:34.894692 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75v6j\" (UniqueName: \"kubernetes.io/projected/23b17d9f-68ed-4040-b073-cfb978c8e0df-kube-api-access-75v6j\") pod \"23b17d9f-68ed-4040-b073-cfb978c8e0df\" (UID: \"23b17d9f-68ed-4040-b073-cfb978c8e0df\") "
Oct 09 15:41:34 crc kubenswrapper[4719]: I1009 15:41:34.895608 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23b17d9f-68ed-4040-b073-cfb978c8e0df-utilities" (OuterVolumeSpecName: "utilities") pod "23b17d9f-68ed-4040-b073-cfb978c8e0df" (UID: "23b17d9f-68ed-4040-b073-cfb978c8e0df"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 09 15:41:34 crc kubenswrapper[4719]: I1009 15:41:34.902685 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23b17d9f-68ed-4040-b073-cfb978c8e0df-kube-api-access-75v6j" (OuterVolumeSpecName: "kube-api-access-75v6j") pod "23b17d9f-68ed-4040-b073-cfb978c8e0df" (UID: "23b17d9f-68ed-4040-b073-cfb978c8e0df"). InnerVolumeSpecName "kube-api-access-75v6j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 15:41:34 crc kubenswrapper[4719]: I1009 15:41:34.962559 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23b17d9f-68ed-4040-b073-cfb978c8e0df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "23b17d9f-68ed-4040-b073-cfb978c8e0df" (UID: "23b17d9f-68ed-4040-b073-cfb978c8e0df"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 09 15:41:34 crc kubenswrapper[4719]: I1009 15:41:34.997769 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75v6j\" (UniqueName: \"kubernetes.io/projected/23b17d9f-68ed-4040-b073-cfb978c8e0df-kube-api-access-75v6j\") on node \"crc\" DevicePath \"\""
Oct 09 15:41:34 crc kubenswrapper[4719]: I1009 15:41:34.997809 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23b17d9f-68ed-4040-b073-cfb978c8e0df-utilities\") on node \"crc\" DevicePath \"\""
Oct 09 15:41:34 crc kubenswrapper[4719]: I1009 15:41:34.997821 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23b17d9f-68ed-4040-b073-cfb978c8e0df-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 09 15:41:35 crc kubenswrapper[4719]: I1009 15:41:35.353742 4719 generic.go:334] "Generic (PLEG): container finished" podID="23b17d9f-68ed-4040-b073-cfb978c8e0df" containerID="01c7569395afabee6fcf8106e8e187534729f35e3350abe2f05e71d83157e42e" exitCode=0
Oct 09 15:41:35 crc kubenswrapper[4719]: I1009 15:41:35.353788 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-46b8q" event={"ID":"23b17d9f-68ed-4040-b073-cfb978c8e0df","Type":"ContainerDied","Data":"01c7569395afabee6fcf8106e8e187534729f35e3350abe2f05e71d83157e42e"}
Oct 09 15:41:35 crc kubenswrapper[4719]: I1009 15:41:35.353825 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-46b8q" event={"ID":"23b17d9f-68ed-4040-b073-cfb978c8e0df","Type":"ContainerDied","Data":"5caebff8fb3bca89e76dc1a6f088b5208bf904507aff8e5b6ff36442984a20d0"}
Oct 09 15:41:35 crc kubenswrapper[4719]: I1009 15:41:35.353864 4719 scope.go:117] "RemoveContainer" containerID="01c7569395afabee6fcf8106e8e187534729f35e3350abe2f05e71d83157e42e"
Oct 09 15:41:35 crc kubenswrapper[4719]: I1009 15:41:35.353863 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-46b8q"
Oct 09 15:41:35 crc kubenswrapper[4719]: I1009 15:41:35.374582 4719 scope.go:117] "RemoveContainer" containerID="35bf1a30abc32f4780f4b218d487f5299665fea14dfda516c46d4e63e8c2eaf1"
Oct 09 15:41:35 crc kubenswrapper[4719]: I1009 15:41:35.388913 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-46b8q"]
Oct 09 15:41:35 crc kubenswrapper[4719]: I1009 15:41:35.416899 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-46b8q"]
Oct 09 15:41:35 crc kubenswrapper[4719]: I1009 15:41:35.417483 4719 scope.go:117] "RemoveContainer" containerID="4c9b9264d1eb2a71e922d1fc54cc767dded60b85223c4867d53eb0bc866cd0d1"
Oct 09 15:41:35 crc kubenswrapper[4719]: I1009 15:41:35.465211 4719 scope.go:117] "RemoveContainer" containerID="01c7569395afabee6fcf8106e8e187534729f35e3350abe2f05e71d83157e42e"
Oct 09 15:41:35 crc kubenswrapper[4719]: E1009 15:41:35.465596 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01c7569395afabee6fcf8106e8e187534729f35e3350abe2f05e71d83157e42e\": container with ID starting with 01c7569395afabee6fcf8106e8e187534729f35e3350abe2f05e71d83157e42e not found: ID does not exist" containerID="01c7569395afabee6fcf8106e8e187534729f35e3350abe2f05e71d83157e42e"
Oct 09 15:41:35 crc kubenswrapper[4719]: I1009 15:41:35.465627 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01c7569395afabee6fcf8106e8e187534729f35e3350abe2f05e71d83157e42e"} err="failed to get container status \"01c7569395afabee6fcf8106e8e187534729f35e3350abe2f05e71d83157e42e\": rpc error: code = NotFound desc = could not find container \"01c7569395afabee6fcf8106e8e187534729f35e3350abe2f05e71d83157e42e\": container with ID starting with 01c7569395afabee6fcf8106e8e187534729f35e3350abe2f05e71d83157e42e not found: ID does not exist"
Oct 09 15:41:35 crc kubenswrapper[4719]: I1009 15:41:35.465649 4719 scope.go:117] "RemoveContainer" containerID="35bf1a30abc32f4780f4b218d487f5299665fea14dfda516c46d4e63e8c2eaf1"
Oct 09 15:41:35 crc kubenswrapper[4719]: E1009 15:41:35.465839 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35bf1a30abc32f4780f4b218d487f5299665fea14dfda516c46d4e63e8c2eaf1\": container with ID starting with 35bf1a30abc32f4780f4b218d487f5299665fea14dfda516c46d4e63e8c2eaf1 not found: ID does not exist" containerID="35bf1a30abc32f4780f4b218d487f5299665fea14dfda516c46d4e63e8c2eaf1"
Oct 09 15:41:35 crc kubenswrapper[4719]: I1009 15:41:35.465866 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35bf1a30abc32f4780f4b218d487f5299665fea14dfda516c46d4e63e8c2eaf1"} err="failed to get container status \"35bf1a30abc32f4780f4b218d487f5299665fea14dfda516c46d4e63e8c2eaf1\": rpc error: code = NotFound desc = could not find container \"35bf1a30abc32f4780f4b218d487f5299665fea14dfda516c46d4e63e8c2eaf1\": container with ID starting with 35bf1a30abc32f4780f4b218d487f5299665fea14dfda516c46d4e63e8c2eaf1 not found: ID does not exist"
Oct 09 15:41:35 crc kubenswrapper[4719]: I1009 15:41:35.465883 4719 scope.go:117] "RemoveContainer" containerID="4c9b9264d1eb2a71e922d1fc54cc767dded60b85223c4867d53eb0bc866cd0d1"
Oct 09 15:41:35 crc kubenswrapper[4719]: E1009 15:41:35.466160 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c9b9264d1eb2a71e922d1fc54cc767dded60b85223c4867d53eb0bc866cd0d1\": container with ID starting with 4c9b9264d1eb2a71e922d1fc54cc767dded60b85223c4867d53eb0bc866cd0d1 not found: ID does not exist" containerID="4c9b9264d1eb2a71e922d1fc54cc767dded60b85223c4867d53eb0bc866cd0d1"
Oct 09 15:41:35 crc kubenswrapper[4719]: I1009 15:41:35.466196 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c9b9264d1eb2a71e922d1fc54cc767dded60b85223c4867d53eb0bc866cd0d1"} err="failed to get container status \"4c9b9264d1eb2a71e922d1fc54cc767dded60b85223c4867d53eb0bc866cd0d1\": rpc error: code = NotFound desc = could not find container \"4c9b9264d1eb2a71e922d1fc54cc767dded60b85223c4867d53eb0bc866cd0d1\": container with ID starting with 4c9b9264d1eb2a71e922d1fc54cc767dded60b85223c4867d53eb0bc866cd0d1 not found: ID does not exist"
Oct 09 15:41:36 crc kubenswrapper[4719]: I1009 15:41:36.976857 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 09 15:41:36 crc kubenswrapper[4719]: I1009 15:41:36.977219 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 09 15:41:37 crc kubenswrapper[4719]: I1009 15:41:37.172097 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23b17d9f-68ed-4040-b073-cfb978c8e0df" path="/var/lib/kubelet/pods/23b17d9f-68ed-4040-b073-cfb978c8e0df/volumes"
Oct 09 15:41:51 crc kubenswrapper[4719]: I1009 15:41:51.727079 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4z4lr"]
Oct 09 15:41:51 crc kubenswrapper[4719]: E1009 15:41:51.727979 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23b17d9f-68ed-4040-b073-cfb978c8e0df" containerName="extract-content"
Oct 09 15:41:51 crc kubenswrapper[4719]: I1009 15:41:51.727991 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="23b17d9f-68ed-4040-b073-cfb978c8e0df" containerName="extract-content"
Oct 09 15:41:51 crc kubenswrapper[4719]: E1009 15:41:51.728002 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23b17d9f-68ed-4040-b073-cfb978c8e0df" containerName="registry-server"
Oct 09 15:41:51 crc kubenswrapper[4719]: I1009 15:41:51.728008 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="23b17d9f-68ed-4040-b073-cfb978c8e0df" containerName="registry-server"
Oct 09 15:41:51 crc kubenswrapper[4719]: E1009 15:41:51.728021 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23b17d9f-68ed-4040-b073-cfb978c8e0df" containerName="extract-utilities"
Oct 09 15:41:51 crc kubenswrapper[4719]: I1009 15:41:51.728029 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="23b17d9f-68ed-4040-b073-cfb978c8e0df" containerName="extract-utilities"
Oct 09 15:41:51 crc kubenswrapper[4719]: I1009 15:41:51.728278 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="23b17d9f-68ed-4040-b073-cfb978c8e0df" containerName="registry-server"
Oct 09 15:41:51 crc kubenswrapper[4719]: I1009 15:41:51.729635 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4z4lr"
Oct 09 15:41:51 crc kubenswrapper[4719]: I1009 15:41:51.740087 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4z4lr"]
Oct 09 15:41:51 crc kubenswrapper[4719]: I1009 15:41:51.838521 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq4nj\" (UniqueName: \"kubernetes.io/projected/ce846983-ed40-42c7-84f5-df8bc90e89c4-kube-api-access-xq4nj\") pod \"redhat-marketplace-4z4lr\" (UID: \"ce846983-ed40-42c7-84f5-df8bc90e89c4\") " pod="openshift-marketplace/redhat-marketplace-4z4lr"
Oct 09 15:41:51 crc kubenswrapper[4719]: I1009 15:41:51.838609 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce846983-ed40-42c7-84f5-df8bc90e89c4-utilities\") pod \"redhat-marketplace-4z4lr\" (UID: \"ce846983-ed40-42c7-84f5-df8bc90e89c4\") " pod="openshift-marketplace/redhat-marketplace-4z4lr"
Oct 09 15:41:51 crc kubenswrapper[4719]: I1009 15:41:51.838700 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce846983-ed40-42c7-84f5-df8bc90e89c4-catalog-content\") pod \"redhat-marketplace-4z4lr\" (UID: \"ce846983-ed40-42c7-84f5-df8bc90e89c4\") " pod="openshift-marketplace/redhat-marketplace-4z4lr"
Oct 09 15:41:51 crc kubenswrapper[4719]: I1009 15:41:51.940286 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xq4nj\" (UniqueName: \"kubernetes.io/projected/ce846983-ed40-42c7-84f5-df8bc90e89c4-kube-api-access-xq4nj\") pod \"redhat-marketplace-4z4lr\" (UID: \"ce846983-ed40-42c7-84f5-df8bc90e89c4\") " pod="openshift-marketplace/redhat-marketplace-4z4lr"
Oct 09 15:41:51 crc kubenswrapper[4719]: I1009 15:41:51.940378 4719 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce846983-ed40-42c7-84f5-df8bc90e89c4-utilities\") pod \"redhat-marketplace-4z4lr\" (UID: \"ce846983-ed40-42c7-84f5-df8bc90e89c4\") " pod="openshift-marketplace/redhat-marketplace-4z4lr" Oct 09 15:41:51 crc kubenswrapper[4719]: I1009 15:41:51.940404 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce846983-ed40-42c7-84f5-df8bc90e89c4-catalog-content\") pod \"redhat-marketplace-4z4lr\" (UID: \"ce846983-ed40-42c7-84f5-df8bc90e89c4\") " pod="openshift-marketplace/redhat-marketplace-4z4lr" Oct 09 15:41:51 crc kubenswrapper[4719]: I1009 15:41:51.941327 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce846983-ed40-42c7-84f5-df8bc90e89c4-catalog-content\") pod \"redhat-marketplace-4z4lr\" (UID: \"ce846983-ed40-42c7-84f5-df8bc90e89c4\") " pod="openshift-marketplace/redhat-marketplace-4z4lr" Oct 09 15:41:51 crc kubenswrapper[4719]: I1009 15:41:51.941612 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce846983-ed40-42c7-84f5-df8bc90e89c4-utilities\") pod \"redhat-marketplace-4z4lr\" (UID: \"ce846983-ed40-42c7-84f5-df8bc90e89c4\") " pod="openshift-marketplace/redhat-marketplace-4z4lr" Oct 09 15:41:51 crc kubenswrapper[4719]: I1009 15:41:51.961484 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq4nj\" (UniqueName: \"kubernetes.io/projected/ce846983-ed40-42c7-84f5-df8bc90e89c4-kube-api-access-xq4nj\") pod \"redhat-marketplace-4z4lr\" (UID: \"ce846983-ed40-42c7-84f5-df8bc90e89c4\") " pod="openshift-marketplace/redhat-marketplace-4z4lr" Oct 09 15:41:52 crc kubenswrapper[4719]: I1009 15:41:52.048893 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4z4lr" Oct 09 15:41:52 crc kubenswrapper[4719]: I1009 15:41:52.510756 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4z4lr"] Oct 09 15:41:53 crc kubenswrapper[4719]: I1009 15:41:53.523575 4719 generic.go:334] "Generic (PLEG): container finished" podID="ce846983-ed40-42c7-84f5-df8bc90e89c4" containerID="f453c475d773a2d11f7bce558d03e96d1dc1ed95bbfe81340c0d3224a563fe58" exitCode=0 Oct 09 15:41:53 crc kubenswrapper[4719]: I1009 15:41:53.523714 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4z4lr" event={"ID":"ce846983-ed40-42c7-84f5-df8bc90e89c4","Type":"ContainerDied","Data":"f453c475d773a2d11f7bce558d03e96d1dc1ed95bbfe81340c0d3224a563fe58"} Oct 09 15:41:53 crc kubenswrapper[4719]: I1009 15:41:53.524746 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4z4lr" event={"ID":"ce846983-ed40-42c7-84f5-df8bc90e89c4","Type":"ContainerStarted","Data":"694638b3a627221b9c7df10a3527e5ae24556bc11d1213706d38a41352ea6712"} Oct 09 15:41:54 crc kubenswrapper[4719]: I1009 15:41:54.539604 4719 generic.go:334] "Generic (PLEG): container finished" podID="ce846983-ed40-42c7-84f5-df8bc90e89c4" containerID="1966b530475e32d5d53334795fa8d18ac81851ba6354e3906b815ed044c4430d" exitCode=0 Oct 09 15:41:54 crc kubenswrapper[4719]: I1009 15:41:54.539670 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4z4lr" event={"ID":"ce846983-ed40-42c7-84f5-df8bc90e89c4","Type":"ContainerDied","Data":"1966b530475e32d5d53334795fa8d18ac81851ba6354e3906b815ed044c4430d"} Oct 09 15:41:55 crc kubenswrapper[4719]: I1009 15:41:55.561622 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4z4lr" 
event={"ID":"ce846983-ed40-42c7-84f5-df8bc90e89c4","Type":"ContainerStarted","Data":"71ae9573d0274c24bc5871b2a7529dffbeda3ee28349b94a55d570997b88439c"} Oct 09 15:41:55 crc kubenswrapper[4719]: I1009 15:41:55.584930 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4z4lr" podStartSLOduration=3.176640538 podStartE2EDuration="4.584907198s" podCreationTimestamp="2025-10-09 15:41:51 +0000 UTC" firstStartedPulling="2025-10-09 15:41:53.526233754 +0000 UTC m=+1419.035945039" lastFinishedPulling="2025-10-09 15:41:54.934500414 +0000 UTC m=+1420.444211699" observedRunningTime="2025-10-09 15:41:55.57648719 +0000 UTC m=+1421.086198495" watchObservedRunningTime="2025-10-09 15:41:55.584907198 +0000 UTC m=+1421.094618483" Oct 09 15:42:02 crc kubenswrapper[4719]: I1009 15:42:02.049580 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4z4lr" Oct 09 15:42:02 crc kubenswrapper[4719]: I1009 15:42:02.051620 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4z4lr" Oct 09 15:42:02 crc kubenswrapper[4719]: I1009 15:42:02.100244 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4z4lr" Oct 09 15:42:02 crc kubenswrapper[4719]: I1009 15:42:02.694413 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4z4lr" Oct 09 15:42:02 crc kubenswrapper[4719]: I1009 15:42:02.741873 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4z4lr"] Oct 09 15:42:04 crc kubenswrapper[4719]: I1009 15:42:04.649064 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4z4lr" podUID="ce846983-ed40-42c7-84f5-df8bc90e89c4" containerName="registry-server" 
containerID="cri-o://71ae9573d0274c24bc5871b2a7529dffbeda3ee28349b94a55d570997b88439c" gracePeriod=2 Oct 09 15:42:05 crc kubenswrapper[4719]: I1009 15:42:05.130458 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4z4lr" Oct 09 15:42:05 crc kubenswrapper[4719]: I1009 15:42:05.193150 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xq4nj\" (UniqueName: \"kubernetes.io/projected/ce846983-ed40-42c7-84f5-df8bc90e89c4-kube-api-access-xq4nj\") pod \"ce846983-ed40-42c7-84f5-df8bc90e89c4\" (UID: \"ce846983-ed40-42c7-84f5-df8bc90e89c4\") " Oct 09 15:42:05 crc kubenswrapper[4719]: I1009 15:42:05.193373 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce846983-ed40-42c7-84f5-df8bc90e89c4-utilities\") pod \"ce846983-ed40-42c7-84f5-df8bc90e89c4\" (UID: \"ce846983-ed40-42c7-84f5-df8bc90e89c4\") " Oct 09 15:42:05 crc kubenswrapper[4719]: I1009 15:42:05.193461 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce846983-ed40-42c7-84f5-df8bc90e89c4-catalog-content\") pod \"ce846983-ed40-42c7-84f5-df8bc90e89c4\" (UID: \"ce846983-ed40-42c7-84f5-df8bc90e89c4\") " Oct 09 15:42:05 crc kubenswrapper[4719]: I1009 15:42:05.195215 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce846983-ed40-42c7-84f5-df8bc90e89c4-utilities" (OuterVolumeSpecName: "utilities") pod "ce846983-ed40-42c7-84f5-df8bc90e89c4" (UID: "ce846983-ed40-42c7-84f5-df8bc90e89c4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:42:05 crc kubenswrapper[4719]: I1009 15:42:05.209487 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce846983-ed40-42c7-84f5-df8bc90e89c4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ce846983-ed40-42c7-84f5-df8bc90e89c4" (UID: "ce846983-ed40-42c7-84f5-df8bc90e89c4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:42:05 crc kubenswrapper[4719]: I1009 15:42:05.213783 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce846983-ed40-42c7-84f5-df8bc90e89c4-kube-api-access-xq4nj" (OuterVolumeSpecName: "kube-api-access-xq4nj") pod "ce846983-ed40-42c7-84f5-df8bc90e89c4" (UID: "ce846983-ed40-42c7-84f5-df8bc90e89c4"). InnerVolumeSpecName "kube-api-access-xq4nj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:42:05 crc kubenswrapper[4719]: I1009 15:42:05.297047 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce846983-ed40-42c7-84f5-df8bc90e89c4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 15:42:05 crc kubenswrapper[4719]: I1009 15:42:05.297079 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xq4nj\" (UniqueName: \"kubernetes.io/projected/ce846983-ed40-42c7-84f5-df8bc90e89c4-kube-api-access-xq4nj\") on node \"crc\" DevicePath \"\"" Oct 09 15:42:05 crc kubenswrapper[4719]: I1009 15:42:05.297090 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce846983-ed40-42c7-84f5-df8bc90e89c4-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 15:42:05 crc kubenswrapper[4719]: I1009 15:42:05.659414 4719 generic.go:334] "Generic (PLEG): container finished" podID="ce846983-ed40-42c7-84f5-df8bc90e89c4" 
containerID="71ae9573d0274c24bc5871b2a7529dffbeda3ee28349b94a55d570997b88439c" exitCode=0 Oct 09 15:42:05 crc kubenswrapper[4719]: I1009 15:42:05.659464 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4z4lr" event={"ID":"ce846983-ed40-42c7-84f5-df8bc90e89c4","Type":"ContainerDied","Data":"71ae9573d0274c24bc5871b2a7529dffbeda3ee28349b94a55d570997b88439c"} Oct 09 15:42:05 crc kubenswrapper[4719]: I1009 15:42:05.659493 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4z4lr" Oct 09 15:42:05 crc kubenswrapper[4719]: I1009 15:42:05.659511 4719 scope.go:117] "RemoveContainer" containerID="71ae9573d0274c24bc5871b2a7529dffbeda3ee28349b94a55d570997b88439c" Oct 09 15:42:05 crc kubenswrapper[4719]: I1009 15:42:05.659500 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4z4lr" event={"ID":"ce846983-ed40-42c7-84f5-df8bc90e89c4","Type":"ContainerDied","Data":"694638b3a627221b9c7df10a3527e5ae24556bc11d1213706d38a41352ea6712"} Oct 09 15:42:05 crc kubenswrapper[4719]: I1009 15:42:05.688617 4719 scope.go:117] "RemoveContainer" containerID="1966b530475e32d5d53334795fa8d18ac81851ba6354e3906b815ed044c4430d" Oct 09 15:42:05 crc kubenswrapper[4719]: I1009 15:42:05.694437 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4z4lr"] Oct 09 15:42:05 crc kubenswrapper[4719]: I1009 15:42:05.703752 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4z4lr"] Oct 09 15:42:05 crc kubenswrapper[4719]: I1009 15:42:05.722324 4719 scope.go:117] "RemoveContainer" containerID="f453c475d773a2d11f7bce558d03e96d1dc1ed95bbfe81340c0d3224a563fe58" Oct 09 15:42:05 crc kubenswrapper[4719]: I1009 15:42:05.765746 4719 scope.go:117] "RemoveContainer" containerID="71ae9573d0274c24bc5871b2a7529dffbeda3ee28349b94a55d570997b88439c" Oct 09 
15:42:05 crc kubenswrapper[4719]: E1009 15:42:05.766100 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71ae9573d0274c24bc5871b2a7529dffbeda3ee28349b94a55d570997b88439c\": container with ID starting with 71ae9573d0274c24bc5871b2a7529dffbeda3ee28349b94a55d570997b88439c not found: ID does not exist" containerID="71ae9573d0274c24bc5871b2a7529dffbeda3ee28349b94a55d570997b88439c" Oct 09 15:42:05 crc kubenswrapper[4719]: I1009 15:42:05.766133 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71ae9573d0274c24bc5871b2a7529dffbeda3ee28349b94a55d570997b88439c"} err="failed to get container status \"71ae9573d0274c24bc5871b2a7529dffbeda3ee28349b94a55d570997b88439c\": rpc error: code = NotFound desc = could not find container \"71ae9573d0274c24bc5871b2a7529dffbeda3ee28349b94a55d570997b88439c\": container with ID starting with 71ae9573d0274c24bc5871b2a7529dffbeda3ee28349b94a55d570997b88439c not found: ID does not exist" Oct 09 15:42:05 crc kubenswrapper[4719]: I1009 15:42:05.766155 4719 scope.go:117] "RemoveContainer" containerID="1966b530475e32d5d53334795fa8d18ac81851ba6354e3906b815ed044c4430d" Oct 09 15:42:05 crc kubenswrapper[4719]: E1009 15:42:05.766589 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1966b530475e32d5d53334795fa8d18ac81851ba6354e3906b815ed044c4430d\": container with ID starting with 1966b530475e32d5d53334795fa8d18ac81851ba6354e3906b815ed044c4430d not found: ID does not exist" containerID="1966b530475e32d5d53334795fa8d18ac81851ba6354e3906b815ed044c4430d" Oct 09 15:42:05 crc kubenswrapper[4719]: I1009 15:42:05.766611 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1966b530475e32d5d53334795fa8d18ac81851ba6354e3906b815ed044c4430d"} err="failed to get container status 
\"1966b530475e32d5d53334795fa8d18ac81851ba6354e3906b815ed044c4430d\": rpc error: code = NotFound desc = could not find container \"1966b530475e32d5d53334795fa8d18ac81851ba6354e3906b815ed044c4430d\": container with ID starting with 1966b530475e32d5d53334795fa8d18ac81851ba6354e3906b815ed044c4430d not found: ID does not exist" Oct 09 15:42:05 crc kubenswrapper[4719]: I1009 15:42:05.766625 4719 scope.go:117] "RemoveContainer" containerID="f453c475d773a2d11f7bce558d03e96d1dc1ed95bbfe81340c0d3224a563fe58" Oct 09 15:42:05 crc kubenswrapper[4719]: E1009 15:42:05.766965 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f453c475d773a2d11f7bce558d03e96d1dc1ed95bbfe81340c0d3224a563fe58\": container with ID starting with f453c475d773a2d11f7bce558d03e96d1dc1ed95bbfe81340c0d3224a563fe58 not found: ID does not exist" containerID="f453c475d773a2d11f7bce558d03e96d1dc1ed95bbfe81340c0d3224a563fe58" Oct 09 15:42:05 crc kubenswrapper[4719]: I1009 15:42:05.766986 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f453c475d773a2d11f7bce558d03e96d1dc1ed95bbfe81340c0d3224a563fe58"} err="failed to get container status \"f453c475d773a2d11f7bce558d03e96d1dc1ed95bbfe81340c0d3224a563fe58\": rpc error: code = NotFound desc = could not find container \"f453c475d773a2d11f7bce558d03e96d1dc1ed95bbfe81340c0d3224a563fe58\": container with ID starting with f453c475d773a2d11f7bce558d03e96d1dc1ed95bbfe81340c0d3224a563fe58 not found: ID does not exist" Oct 09 15:42:06 crc kubenswrapper[4719]: I1009 15:42:06.977218 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 15:42:06 crc kubenswrapper[4719]: I1009 15:42:06.977628 4719 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 15:42:07 crc kubenswrapper[4719]: I1009 15:42:07.176257 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce846983-ed40-42c7-84f5-df8bc90e89c4" path="/var/lib/kubelet/pods/ce846983-ed40-42c7-84f5-df8bc90e89c4/volumes" Oct 09 15:42:07 crc kubenswrapper[4719]: I1009 15:42:07.762325 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gpgqv"] Oct 09 15:42:07 crc kubenswrapper[4719]: E1009 15:42:07.763187 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce846983-ed40-42c7-84f5-df8bc90e89c4" containerName="extract-utilities" Oct 09 15:42:07 crc kubenswrapper[4719]: I1009 15:42:07.763283 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce846983-ed40-42c7-84f5-df8bc90e89c4" containerName="extract-utilities" Oct 09 15:42:07 crc kubenswrapper[4719]: E1009 15:42:07.763389 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce846983-ed40-42c7-84f5-df8bc90e89c4" containerName="extract-content" Oct 09 15:42:07 crc kubenswrapper[4719]: I1009 15:42:07.763508 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce846983-ed40-42c7-84f5-df8bc90e89c4" containerName="extract-content" Oct 09 15:42:07 crc kubenswrapper[4719]: E1009 15:42:07.763604 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce846983-ed40-42c7-84f5-df8bc90e89c4" containerName="registry-server" Oct 09 15:42:07 crc kubenswrapper[4719]: I1009 15:42:07.763660 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce846983-ed40-42c7-84f5-df8bc90e89c4" containerName="registry-server" Oct 09 15:42:07 crc kubenswrapper[4719]: I1009 15:42:07.763950 4719 
memory_manager.go:354] "RemoveStaleState removing state" podUID="ce846983-ed40-42c7-84f5-df8bc90e89c4" containerName="registry-server" Oct 09 15:42:07 crc kubenswrapper[4719]: I1009 15:42:07.765442 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gpgqv" Oct 09 15:42:07 crc kubenswrapper[4719]: I1009 15:42:07.773724 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gpgqv"] Oct 09 15:42:07 crc kubenswrapper[4719]: I1009 15:42:07.844859 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/def16219-1617-423e-93be-f43a5f782426-catalog-content\") pod \"community-operators-gpgqv\" (UID: \"def16219-1617-423e-93be-f43a5f782426\") " pod="openshift-marketplace/community-operators-gpgqv" Oct 09 15:42:07 crc kubenswrapper[4719]: I1009 15:42:07.844907 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcd65\" (UniqueName: \"kubernetes.io/projected/def16219-1617-423e-93be-f43a5f782426-kube-api-access-kcd65\") pod \"community-operators-gpgqv\" (UID: \"def16219-1617-423e-93be-f43a5f782426\") " pod="openshift-marketplace/community-operators-gpgqv" Oct 09 15:42:07 crc kubenswrapper[4719]: I1009 15:42:07.845171 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/def16219-1617-423e-93be-f43a5f782426-utilities\") pod \"community-operators-gpgqv\" (UID: \"def16219-1617-423e-93be-f43a5f782426\") " pod="openshift-marketplace/community-operators-gpgqv" Oct 09 15:42:07 crc kubenswrapper[4719]: I1009 15:42:07.947071 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/def16219-1617-423e-93be-f43a5f782426-utilities\") pod 
\"community-operators-gpgqv\" (UID: \"def16219-1617-423e-93be-f43a5f782426\") " pod="openshift-marketplace/community-operators-gpgqv" Oct 09 15:42:07 crc kubenswrapper[4719]: I1009 15:42:07.947277 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/def16219-1617-423e-93be-f43a5f782426-catalog-content\") pod \"community-operators-gpgqv\" (UID: \"def16219-1617-423e-93be-f43a5f782426\") " pod="openshift-marketplace/community-operators-gpgqv" Oct 09 15:42:07 crc kubenswrapper[4719]: I1009 15:42:07.947336 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcd65\" (UniqueName: \"kubernetes.io/projected/def16219-1617-423e-93be-f43a5f782426-kube-api-access-kcd65\") pod \"community-operators-gpgqv\" (UID: \"def16219-1617-423e-93be-f43a5f782426\") " pod="openshift-marketplace/community-operators-gpgqv" Oct 09 15:42:07 crc kubenswrapper[4719]: I1009 15:42:07.947583 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/def16219-1617-423e-93be-f43a5f782426-utilities\") pod \"community-operators-gpgqv\" (UID: \"def16219-1617-423e-93be-f43a5f782426\") " pod="openshift-marketplace/community-operators-gpgqv" Oct 09 15:42:07 crc kubenswrapper[4719]: I1009 15:42:07.947766 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/def16219-1617-423e-93be-f43a5f782426-catalog-content\") pod \"community-operators-gpgqv\" (UID: \"def16219-1617-423e-93be-f43a5f782426\") " pod="openshift-marketplace/community-operators-gpgqv" Oct 09 15:42:07 crc kubenswrapper[4719]: I1009 15:42:07.966736 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcd65\" (UniqueName: \"kubernetes.io/projected/def16219-1617-423e-93be-f43a5f782426-kube-api-access-kcd65\") pod 
\"community-operators-gpgqv\" (UID: \"def16219-1617-423e-93be-f43a5f782426\") " pod="openshift-marketplace/community-operators-gpgqv" Oct 09 15:42:08 crc kubenswrapper[4719]: I1009 15:42:08.098580 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gpgqv" Oct 09 15:42:08 crc kubenswrapper[4719]: I1009 15:42:08.624145 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gpgqv"] Oct 09 15:42:08 crc kubenswrapper[4719]: I1009 15:42:08.693533 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gpgqv" event={"ID":"def16219-1617-423e-93be-f43a5f782426","Type":"ContainerStarted","Data":"b0e94e8f6ce5a80572651f23fb214b14eb8728908614f18962ff667a6804cb82"} Oct 09 15:42:09 crc kubenswrapper[4719]: I1009 15:42:09.704895 4719 generic.go:334] "Generic (PLEG): container finished" podID="def16219-1617-423e-93be-f43a5f782426" containerID="68a197842832882d18c6ceddda17edc26817d729887eaf3ce61a61b1a0559cc3" exitCode=0 Oct 09 15:42:09 crc kubenswrapper[4719]: I1009 15:42:09.704985 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gpgqv" event={"ID":"def16219-1617-423e-93be-f43a5f782426","Type":"ContainerDied","Data":"68a197842832882d18c6ceddda17edc26817d729887eaf3ce61a61b1a0559cc3"} Oct 09 15:42:11 crc kubenswrapper[4719]: I1009 15:42:11.723833 4719 generic.go:334] "Generic (PLEG): container finished" podID="def16219-1617-423e-93be-f43a5f782426" containerID="0febb4b1ddb04ec48c7f76b5d3abf0b9e67396bb52e7ae1daad0c26c576209ff" exitCode=0 Oct 09 15:42:11 crc kubenswrapper[4719]: I1009 15:42:11.723928 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gpgqv" event={"ID":"def16219-1617-423e-93be-f43a5f782426","Type":"ContainerDied","Data":"0febb4b1ddb04ec48c7f76b5d3abf0b9e67396bb52e7ae1daad0c26c576209ff"} Oct 09 15:42:12 crc 
kubenswrapper[4719]: I1009 15:42:12.738319 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gpgqv" event={"ID":"def16219-1617-423e-93be-f43a5f782426","Type":"ContainerStarted","Data":"6f6cbf80330489cf2a97202a5e0461300b0ef9420208722aa0304f4e9c56d8df"} Oct 09 15:42:12 crc kubenswrapper[4719]: I1009 15:42:12.779528 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gpgqv" podStartSLOduration=3.324694543 podStartE2EDuration="5.779506499s" podCreationTimestamp="2025-10-09 15:42:07 +0000 UTC" firstStartedPulling="2025-10-09 15:42:09.707130767 +0000 UTC m=+1435.216842052" lastFinishedPulling="2025-10-09 15:42:12.161942723 +0000 UTC m=+1437.671654008" observedRunningTime="2025-10-09 15:42:12.764213804 +0000 UTC m=+1438.273925099" watchObservedRunningTime="2025-10-09 15:42:12.779506499 +0000 UTC m=+1438.289217784" Oct 09 15:42:18 crc kubenswrapper[4719]: I1009 15:42:18.098831 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gpgqv" Oct 09 15:42:18 crc kubenswrapper[4719]: I1009 15:42:18.099428 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gpgqv" Oct 09 15:42:18 crc kubenswrapper[4719]: I1009 15:42:18.145555 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gpgqv" Oct 09 15:42:18 crc kubenswrapper[4719]: I1009 15:42:18.860555 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gpgqv" Oct 09 15:42:18 crc kubenswrapper[4719]: I1009 15:42:18.914341 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gpgqv"] Oct 09 15:42:20 crc kubenswrapper[4719]: I1009 15:42:20.825400 4719 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-marketplace/community-operators-gpgqv" podUID="def16219-1617-423e-93be-f43a5f782426" containerName="registry-server" containerID="cri-o://6f6cbf80330489cf2a97202a5e0461300b0ef9420208722aa0304f4e9c56d8df" gracePeriod=2 Oct 09 15:42:21 crc kubenswrapper[4719]: I1009 15:42:21.273114 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gpgqv" Oct 09 15:42:21 crc kubenswrapper[4719]: I1009 15:42:21.356775 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcd65\" (UniqueName: \"kubernetes.io/projected/def16219-1617-423e-93be-f43a5f782426-kube-api-access-kcd65\") pod \"def16219-1617-423e-93be-f43a5f782426\" (UID: \"def16219-1617-423e-93be-f43a5f782426\") " Oct 09 15:42:21 crc kubenswrapper[4719]: I1009 15:42:21.356992 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/def16219-1617-423e-93be-f43a5f782426-catalog-content\") pod \"def16219-1617-423e-93be-f43a5f782426\" (UID: \"def16219-1617-423e-93be-f43a5f782426\") " Oct 09 15:42:21 crc kubenswrapper[4719]: I1009 15:42:21.357062 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/def16219-1617-423e-93be-f43a5f782426-utilities\") pod \"def16219-1617-423e-93be-f43a5f782426\" (UID: \"def16219-1617-423e-93be-f43a5f782426\") " Oct 09 15:42:21 crc kubenswrapper[4719]: I1009 15:42:21.358227 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/def16219-1617-423e-93be-f43a5f782426-utilities" (OuterVolumeSpecName: "utilities") pod "def16219-1617-423e-93be-f43a5f782426" (UID: "def16219-1617-423e-93be-f43a5f782426"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:42:21 crc kubenswrapper[4719]: I1009 15:42:21.361780 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/def16219-1617-423e-93be-f43a5f782426-kube-api-access-kcd65" (OuterVolumeSpecName: "kube-api-access-kcd65") pod "def16219-1617-423e-93be-f43a5f782426" (UID: "def16219-1617-423e-93be-f43a5f782426"). InnerVolumeSpecName "kube-api-access-kcd65". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:42:21 crc kubenswrapper[4719]: I1009 15:42:21.409062 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/def16219-1617-423e-93be-f43a5f782426-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "def16219-1617-423e-93be-f43a5f782426" (UID: "def16219-1617-423e-93be-f43a5f782426"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:42:21 crc kubenswrapper[4719]: I1009 15:42:21.459874 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/def16219-1617-423e-93be-f43a5f782426-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 15:42:21 crc kubenswrapper[4719]: I1009 15:42:21.459908 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/def16219-1617-423e-93be-f43a5f782426-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 15:42:21 crc kubenswrapper[4719]: I1009 15:42:21.459918 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcd65\" (UniqueName: \"kubernetes.io/projected/def16219-1617-423e-93be-f43a5f782426-kube-api-access-kcd65\") on node \"crc\" DevicePath \"\"" Oct 09 15:42:21 crc kubenswrapper[4719]: I1009 15:42:21.836525 4719 generic.go:334] "Generic (PLEG): container finished" podID="def16219-1617-423e-93be-f43a5f782426" 
containerID="6f6cbf80330489cf2a97202a5e0461300b0ef9420208722aa0304f4e9c56d8df" exitCode=0 Oct 09 15:42:21 crc kubenswrapper[4719]: I1009 15:42:21.836571 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gpgqv" event={"ID":"def16219-1617-423e-93be-f43a5f782426","Type":"ContainerDied","Data":"6f6cbf80330489cf2a97202a5e0461300b0ef9420208722aa0304f4e9c56d8df"} Oct 09 15:42:21 crc kubenswrapper[4719]: I1009 15:42:21.836594 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gpgqv" Oct 09 15:42:21 crc kubenswrapper[4719]: I1009 15:42:21.836610 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gpgqv" event={"ID":"def16219-1617-423e-93be-f43a5f782426","Type":"ContainerDied","Data":"b0e94e8f6ce5a80572651f23fb214b14eb8728908614f18962ff667a6804cb82"} Oct 09 15:42:21 crc kubenswrapper[4719]: I1009 15:42:21.836641 4719 scope.go:117] "RemoveContainer" containerID="6f6cbf80330489cf2a97202a5e0461300b0ef9420208722aa0304f4e9c56d8df" Oct 09 15:42:21 crc kubenswrapper[4719]: I1009 15:42:21.872378 4719 scope.go:117] "RemoveContainer" containerID="0febb4b1ddb04ec48c7f76b5d3abf0b9e67396bb52e7ae1daad0c26c576209ff" Oct 09 15:42:21 crc kubenswrapper[4719]: I1009 15:42:21.883185 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gpgqv"] Oct 09 15:42:21 crc kubenswrapper[4719]: I1009 15:42:21.899112 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gpgqv"] Oct 09 15:42:21 crc kubenswrapper[4719]: I1009 15:42:21.908143 4719 scope.go:117] "RemoveContainer" containerID="68a197842832882d18c6ceddda17edc26817d729887eaf3ce61a61b1a0559cc3" Oct 09 15:42:21 crc kubenswrapper[4719]: I1009 15:42:21.957740 4719 scope.go:117] "RemoveContainer" containerID="6f6cbf80330489cf2a97202a5e0461300b0ef9420208722aa0304f4e9c56d8df" Oct 09 
15:42:21 crc kubenswrapper[4719]: E1009 15:42:21.962534 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f6cbf80330489cf2a97202a5e0461300b0ef9420208722aa0304f4e9c56d8df\": container with ID starting with 6f6cbf80330489cf2a97202a5e0461300b0ef9420208722aa0304f4e9c56d8df not found: ID does not exist" containerID="6f6cbf80330489cf2a97202a5e0461300b0ef9420208722aa0304f4e9c56d8df" Oct 09 15:42:21 crc kubenswrapper[4719]: I1009 15:42:21.962579 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f6cbf80330489cf2a97202a5e0461300b0ef9420208722aa0304f4e9c56d8df"} err="failed to get container status \"6f6cbf80330489cf2a97202a5e0461300b0ef9420208722aa0304f4e9c56d8df\": rpc error: code = NotFound desc = could not find container \"6f6cbf80330489cf2a97202a5e0461300b0ef9420208722aa0304f4e9c56d8df\": container with ID starting with 6f6cbf80330489cf2a97202a5e0461300b0ef9420208722aa0304f4e9c56d8df not found: ID does not exist" Oct 09 15:42:21 crc kubenswrapper[4719]: I1009 15:42:21.962604 4719 scope.go:117] "RemoveContainer" containerID="0febb4b1ddb04ec48c7f76b5d3abf0b9e67396bb52e7ae1daad0c26c576209ff" Oct 09 15:42:21 crc kubenswrapper[4719]: E1009 15:42:21.964034 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0febb4b1ddb04ec48c7f76b5d3abf0b9e67396bb52e7ae1daad0c26c576209ff\": container with ID starting with 0febb4b1ddb04ec48c7f76b5d3abf0b9e67396bb52e7ae1daad0c26c576209ff not found: ID does not exist" containerID="0febb4b1ddb04ec48c7f76b5d3abf0b9e67396bb52e7ae1daad0c26c576209ff" Oct 09 15:42:21 crc kubenswrapper[4719]: I1009 15:42:21.964074 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0febb4b1ddb04ec48c7f76b5d3abf0b9e67396bb52e7ae1daad0c26c576209ff"} err="failed to get container status 
\"0febb4b1ddb04ec48c7f76b5d3abf0b9e67396bb52e7ae1daad0c26c576209ff\": rpc error: code = NotFound desc = could not find container \"0febb4b1ddb04ec48c7f76b5d3abf0b9e67396bb52e7ae1daad0c26c576209ff\": container with ID starting with 0febb4b1ddb04ec48c7f76b5d3abf0b9e67396bb52e7ae1daad0c26c576209ff not found: ID does not exist" Oct 09 15:42:21 crc kubenswrapper[4719]: I1009 15:42:21.964097 4719 scope.go:117] "RemoveContainer" containerID="68a197842832882d18c6ceddda17edc26817d729887eaf3ce61a61b1a0559cc3" Oct 09 15:42:21 crc kubenswrapper[4719]: E1009 15:42:21.964732 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68a197842832882d18c6ceddda17edc26817d729887eaf3ce61a61b1a0559cc3\": container with ID starting with 68a197842832882d18c6ceddda17edc26817d729887eaf3ce61a61b1a0559cc3 not found: ID does not exist" containerID="68a197842832882d18c6ceddda17edc26817d729887eaf3ce61a61b1a0559cc3" Oct 09 15:42:21 crc kubenswrapper[4719]: I1009 15:42:21.964792 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68a197842832882d18c6ceddda17edc26817d729887eaf3ce61a61b1a0559cc3"} err="failed to get container status \"68a197842832882d18c6ceddda17edc26817d729887eaf3ce61a61b1a0559cc3\": rpc error: code = NotFound desc = could not find container \"68a197842832882d18c6ceddda17edc26817d729887eaf3ce61a61b1a0559cc3\": container with ID starting with 68a197842832882d18c6ceddda17edc26817d729887eaf3ce61a61b1a0559cc3 not found: ID does not exist" Oct 09 15:42:23 crc kubenswrapper[4719]: I1009 15:42:23.178772 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="def16219-1617-423e-93be-f43a5f782426" path="/var/lib/kubelet/pods/def16219-1617-423e-93be-f43a5f782426/volumes" Oct 09 15:42:24 crc kubenswrapper[4719]: I1009 15:42:24.265452 4719 scope.go:117] "RemoveContainer" containerID="a5c8dac2514cab0b485142178177d06313e89059bbdb3f3c6f597c3b82c27c38" Oct 09 
15:42:24 crc kubenswrapper[4719]: I1009 15:42:24.287621 4719 scope.go:117] "RemoveContainer" containerID="e5f20b20c42e0d1513484d9a4f5a033569d029d7a21fb90191416fde7f5713c0" Oct 09 15:42:24 crc kubenswrapper[4719]: I1009 15:42:24.354330 4719 scope.go:117] "RemoveContainer" containerID="c3420b29f4ae8cc02f43ccb61d99cc037e7b1d7ff80f91930f17fc72077bfbb1" Oct 09 15:42:36 crc kubenswrapper[4719]: I1009 15:42:36.976419 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 15:42:36 crc kubenswrapper[4719]: I1009 15:42:36.976994 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 15:42:36 crc kubenswrapper[4719]: I1009 15:42:36.977036 4719 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" Oct 09 15:42:36 crc kubenswrapper[4719]: I1009 15:42:36.977809 4719 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a69453ebf4e1aaf18164eaf7feb2c37cbe0797331962fcb2782850eab34c0940"} pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 15:42:36 crc kubenswrapper[4719]: I1009 15:42:36.977869 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" 
containerName="machine-config-daemon" containerID="cri-o://a69453ebf4e1aaf18164eaf7feb2c37cbe0797331962fcb2782850eab34c0940" gracePeriod=600 Oct 09 15:42:37 crc kubenswrapper[4719]: I1009 15:42:37.987229 4719 generic.go:334] "Generic (PLEG): container finished" podID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerID="a69453ebf4e1aaf18164eaf7feb2c37cbe0797331962fcb2782850eab34c0940" exitCode=0 Oct 09 15:42:37 crc kubenswrapper[4719]: I1009 15:42:37.987388 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerDied","Data":"a69453ebf4e1aaf18164eaf7feb2c37cbe0797331962fcb2782850eab34c0940"} Oct 09 15:42:37 crc kubenswrapper[4719]: I1009 15:42:37.987808 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerStarted","Data":"55848799feb0f83996cad9faea64b8bd81a5055bee1fd116f8ee1236dc974c4b"} Oct 09 15:42:37 crc kubenswrapper[4719]: I1009 15:42:37.987831 4719 scope.go:117] "RemoveContainer" containerID="3f9a20c39c2315beb69542aebd5bd73add66f7a319edb9051e38f9e594c365d9" Oct 09 15:42:52 crc kubenswrapper[4719]: I1009 15:42:52.904526 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mvvcr"] Oct 09 15:42:52 crc kubenswrapper[4719]: E1009 15:42:52.905617 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="def16219-1617-423e-93be-f43a5f782426" containerName="extract-content" Oct 09 15:42:52 crc kubenswrapper[4719]: I1009 15:42:52.905636 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="def16219-1617-423e-93be-f43a5f782426" containerName="extract-content" Oct 09 15:42:52 crc kubenswrapper[4719]: E1009 15:42:52.905675 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="def16219-1617-423e-93be-f43a5f782426" 
containerName="registry-server" Oct 09 15:42:52 crc kubenswrapper[4719]: I1009 15:42:52.905683 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="def16219-1617-423e-93be-f43a5f782426" containerName="registry-server" Oct 09 15:42:52 crc kubenswrapper[4719]: E1009 15:42:52.905712 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="def16219-1617-423e-93be-f43a5f782426" containerName="extract-utilities" Oct 09 15:42:52 crc kubenswrapper[4719]: I1009 15:42:52.905722 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="def16219-1617-423e-93be-f43a5f782426" containerName="extract-utilities" Oct 09 15:42:52 crc kubenswrapper[4719]: I1009 15:42:52.905963 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="def16219-1617-423e-93be-f43a5f782426" containerName="registry-server" Oct 09 15:42:52 crc kubenswrapper[4719]: I1009 15:42:52.907698 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mvvcr" Oct 09 15:42:52 crc kubenswrapper[4719]: I1009 15:42:52.923097 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mvvcr"] Oct 09 15:42:52 crc kubenswrapper[4719]: I1009 15:42:52.953986 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2pc2\" (UniqueName: \"kubernetes.io/projected/6c07684c-f8ae-45e3-9f77-39a8d50d35bd-kube-api-access-p2pc2\") pod \"redhat-operators-mvvcr\" (UID: \"6c07684c-f8ae-45e3-9f77-39a8d50d35bd\") " pod="openshift-marketplace/redhat-operators-mvvcr" Oct 09 15:42:52 crc kubenswrapper[4719]: I1009 15:42:52.954077 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c07684c-f8ae-45e3-9f77-39a8d50d35bd-catalog-content\") pod \"redhat-operators-mvvcr\" (UID: \"6c07684c-f8ae-45e3-9f77-39a8d50d35bd\") " 
pod="openshift-marketplace/redhat-operators-mvvcr" Oct 09 15:42:52 crc kubenswrapper[4719]: I1009 15:42:52.954124 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c07684c-f8ae-45e3-9f77-39a8d50d35bd-utilities\") pod \"redhat-operators-mvvcr\" (UID: \"6c07684c-f8ae-45e3-9f77-39a8d50d35bd\") " pod="openshift-marketplace/redhat-operators-mvvcr" Oct 09 15:42:53 crc kubenswrapper[4719]: I1009 15:42:53.055995 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c07684c-f8ae-45e3-9f77-39a8d50d35bd-catalog-content\") pod \"redhat-operators-mvvcr\" (UID: \"6c07684c-f8ae-45e3-9f77-39a8d50d35bd\") " pod="openshift-marketplace/redhat-operators-mvvcr" Oct 09 15:42:53 crc kubenswrapper[4719]: I1009 15:42:53.056069 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c07684c-f8ae-45e3-9f77-39a8d50d35bd-utilities\") pod \"redhat-operators-mvvcr\" (UID: \"6c07684c-f8ae-45e3-9f77-39a8d50d35bd\") " pod="openshift-marketplace/redhat-operators-mvvcr" Oct 09 15:42:53 crc kubenswrapper[4719]: I1009 15:42:53.056192 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2pc2\" (UniqueName: \"kubernetes.io/projected/6c07684c-f8ae-45e3-9f77-39a8d50d35bd-kube-api-access-p2pc2\") pod \"redhat-operators-mvvcr\" (UID: \"6c07684c-f8ae-45e3-9f77-39a8d50d35bd\") " pod="openshift-marketplace/redhat-operators-mvvcr" Oct 09 15:42:53 crc kubenswrapper[4719]: I1009 15:42:53.056572 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c07684c-f8ae-45e3-9f77-39a8d50d35bd-catalog-content\") pod \"redhat-operators-mvvcr\" (UID: \"6c07684c-f8ae-45e3-9f77-39a8d50d35bd\") " 
pod="openshift-marketplace/redhat-operators-mvvcr" Oct 09 15:42:53 crc kubenswrapper[4719]: I1009 15:42:53.056590 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c07684c-f8ae-45e3-9f77-39a8d50d35bd-utilities\") pod \"redhat-operators-mvvcr\" (UID: \"6c07684c-f8ae-45e3-9f77-39a8d50d35bd\") " pod="openshift-marketplace/redhat-operators-mvvcr" Oct 09 15:42:53 crc kubenswrapper[4719]: I1009 15:42:53.075765 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2pc2\" (UniqueName: \"kubernetes.io/projected/6c07684c-f8ae-45e3-9f77-39a8d50d35bd-kube-api-access-p2pc2\") pod \"redhat-operators-mvvcr\" (UID: \"6c07684c-f8ae-45e3-9f77-39a8d50d35bd\") " pod="openshift-marketplace/redhat-operators-mvvcr" Oct 09 15:42:53 crc kubenswrapper[4719]: I1009 15:42:53.231605 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mvvcr" Oct 09 15:42:53 crc kubenswrapper[4719]: I1009 15:42:53.707644 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mvvcr"] Oct 09 15:42:54 crc kubenswrapper[4719]: I1009 15:42:54.139896 4719 generic.go:334] "Generic (PLEG): container finished" podID="6c07684c-f8ae-45e3-9f77-39a8d50d35bd" containerID="b7f510739c5cad1bdf2a2f3c174f68f05871d5a9dcdd21736a50f998dc427607" exitCode=0 Oct 09 15:42:54 crc kubenswrapper[4719]: I1009 15:42:54.139936 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvvcr" event={"ID":"6c07684c-f8ae-45e3-9f77-39a8d50d35bd","Type":"ContainerDied","Data":"b7f510739c5cad1bdf2a2f3c174f68f05871d5a9dcdd21736a50f998dc427607"} Oct 09 15:42:54 crc kubenswrapper[4719]: I1009 15:42:54.139997 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvvcr" 
event={"ID":"6c07684c-f8ae-45e3-9f77-39a8d50d35bd","Type":"ContainerStarted","Data":"cfc5445c21c2177cdb4022eed2a79f720e0ad18ecdbc3b185ba83bd0655c7ea4"} Oct 09 15:42:55 crc kubenswrapper[4719]: I1009 15:42:55.157849 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvvcr" event={"ID":"6c07684c-f8ae-45e3-9f77-39a8d50d35bd","Type":"ContainerStarted","Data":"1c3837e092f72f360274d8c42fa516edf1383433336f06c881914bc6fa0b3760"} Oct 09 15:42:56 crc kubenswrapper[4719]: I1009 15:42:56.172615 4719 generic.go:334] "Generic (PLEG): container finished" podID="6c07684c-f8ae-45e3-9f77-39a8d50d35bd" containerID="1c3837e092f72f360274d8c42fa516edf1383433336f06c881914bc6fa0b3760" exitCode=0 Oct 09 15:42:56 crc kubenswrapper[4719]: I1009 15:42:56.172707 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvvcr" event={"ID":"6c07684c-f8ae-45e3-9f77-39a8d50d35bd","Type":"ContainerDied","Data":"1c3837e092f72f360274d8c42fa516edf1383433336f06c881914bc6fa0b3760"} Oct 09 15:42:57 crc kubenswrapper[4719]: I1009 15:42:57.197009 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvvcr" event={"ID":"6c07684c-f8ae-45e3-9f77-39a8d50d35bd","Type":"ContainerStarted","Data":"6f4e8e0cc1db239355abd464bb77fd13e7965a23ab9c368ecd71abedaf080498"} Oct 09 15:42:57 crc kubenswrapper[4719]: I1009 15:42:57.223754 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mvvcr" podStartSLOduration=2.607328891 podStartE2EDuration="5.223734387s" podCreationTimestamp="2025-10-09 15:42:52 +0000 UTC" firstStartedPulling="2025-10-09 15:42:54.141410388 +0000 UTC m=+1479.651121663" lastFinishedPulling="2025-10-09 15:42:56.757815874 +0000 UTC m=+1482.267527159" observedRunningTime="2025-10-09 15:42:57.213172972 +0000 UTC m=+1482.722884277" watchObservedRunningTime="2025-10-09 15:42:57.223734387 +0000 UTC m=+1482.733445672" 
Oct 09 15:43:03 crc kubenswrapper[4719]: I1009 15:43:03.231902 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mvvcr" Oct 09 15:43:03 crc kubenswrapper[4719]: I1009 15:43:03.233463 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mvvcr" Oct 09 15:43:03 crc kubenswrapper[4719]: I1009 15:43:03.283841 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mvvcr" Oct 09 15:43:03 crc kubenswrapper[4719]: I1009 15:43:03.336420 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mvvcr" Oct 09 15:43:03 crc kubenswrapper[4719]: I1009 15:43:03.522491 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mvvcr"] Oct 09 15:43:05 crc kubenswrapper[4719]: I1009 15:43:05.282387 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mvvcr" podUID="6c07684c-f8ae-45e3-9f77-39a8d50d35bd" containerName="registry-server" containerID="cri-o://6f4e8e0cc1db239355abd464bb77fd13e7965a23ab9c368ecd71abedaf080498" gracePeriod=2 Oct 09 15:43:05 crc kubenswrapper[4719]: I1009 15:43:05.731835 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mvvcr" Oct 09 15:43:05 crc kubenswrapper[4719]: I1009 15:43:05.802004 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c07684c-f8ae-45e3-9f77-39a8d50d35bd-utilities\") pod \"6c07684c-f8ae-45e3-9f77-39a8d50d35bd\" (UID: \"6c07684c-f8ae-45e3-9f77-39a8d50d35bd\") " Oct 09 15:43:05 crc kubenswrapper[4719]: I1009 15:43:05.802181 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c07684c-f8ae-45e3-9f77-39a8d50d35bd-catalog-content\") pod \"6c07684c-f8ae-45e3-9f77-39a8d50d35bd\" (UID: \"6c07684c-f8ae-45e3-9f77-39a8d50d35bd\") " Oct 09 15:43:05 crc kubenswrapper[4719]: I1009 15:43:05.802226 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2pc2\" (UniqueName: \"kubernetes.io/projected/6c07684c-f8ae-45e3-9f77-39a8d50d35bd-kube-api-access-p2pc2\") pod \"6c07684c-f8ae-45e3-9f77-39a8d50d35bd\" (UID: \"6c07684c-f8ae-45e3-9f77-39a8d50d35bd\") " Oct 09 15:43:05 crc kubenswrapper[4719]: I1009 15:43:05.803313 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c07684c-f8ae-45e3-9f77-39a8d50d35bd-utilities" (OuterVolumeSpecName: "utilities") pod "6c07684c-f8ae-45e3-9f77-39a8d50d35bd" (UID: "6c07684c-f8ae-45e3-9f77-39a8d50d35bd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:43:05 crc kubenswrapper[4719]: I1009 15:43:05.817724 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c07684c-f8ae-45e3-9f77-39a8d50d35bd-kube-api-access-p2pc2" (OuterVolumeSpecName: "kube-api-access-p2pc2") pod "6c07684c-f8ae-45e3-9f77-39a8d50d35bd" (UID: "6c07684c-f8ae-45e3-9f77-39a8d50d35bd"). InnerVolumeSpecName "kube-api-access-p2pc2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:43:05 crc kubenswrapper[4719]: I1009 15:43:05.890838 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c07684c-f8ae-45e3-9f77-39a8d50d35bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6c07684c-f8ae-45e3-9f77-39a8d50d35bd" (UID: "6c07684c-f8ae-45e3-9f77-39a8d50d35bd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:43:05 crc kubenswrapper[4719]: I1009 15:43:05.904010 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2pc2\" (UniqueName: \"kubernetes.io/projected/6c07684c-f8ae-45e3-9f77-39a8d50d35bd-kube-api-access-p2pc2\") on node \"crc\" DevicePath \"\"" Oct 09 15:43:05 crc kubenswrapper[4719]: I1009 15:43:05.904045 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c07684c-f8ae-45e3-9f77-39a8d50d35bd-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 15:43:05 crc kubenswrapper[4719]: I1009 15:43:05.904055 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c07684c-f8ae-45e3-9f77-39a8d50d35bd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 15:43:06 crc kubenswrapper[4719]: I1009 15:43:06.300836 4719 generic.go:334] "Generic (PLEG): container finished" podID="6c07684c-f8ae-45e3-9f77-39a8d50d35bd" containerID="6f4e8e0cc1db239355abd464bb77fd13e7965a23ab9c368ecd71abedaf080498" exitCode=0 Oct 09 15:43:06 crc kubenswrapper[4719]: I1009 15:43:06.300959 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mvvcr" Oct 09 15:43:06 crc kubenswrapper[4719]: I1009 15:43:06.300991 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvvcr" event={"ID":"6c07684c-f8ae-45e3-9f77-39a8d50d35bd","Type":"ContainerDied","Data":"6f4e8e0cc1db239355abd464bb77fd13e7965a23ab9c368ecd71abedaf080498"} Oct 09 15:43:06 crc kubenswrapper[4719]: I1009 15:43:06.301454 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvvcr" event={"ID":"6c07684c-f8ae-45e3-9f77-39a8d50d35bd","Type":"ContainerDied","Data":"cfc5445c21c2177cdb4022eed2a79f720e0ad18ecdbc3b185ba83bd0655c7ea4"} Oct 09 15:43:06 crc kubenswrapper[4719]: I1009 15:43:06.301498 4719 scope.go:117] "RemoveContainer" containerID="6f4e8e0cc1db239355abd464bb77fd13e7965a23ab9c368ecd71abedaf080498" Oct 09 15:43:06 crc kubenswrapper[4719]: I1009 15:43:06.336986 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mvvcr"] Oct 09 15:43:06 crc kubenswrapper[4719]: I1009 15:43:06.342994 4719 scope.go:117] "RemoveContainer" containerID="1c3837e092f72f360274d8c42fa516edf1383433336f06c881914bc6fa0b3760" Oct 09 15:43:06 crc kubenswrapper[4719]: I1009 15:43:06.348244 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mvvcr"] Oct 09 15:43:06 crc kubenswrapper[4719]: I1009 15:43:06.373535 4719 scope.go:117] "RemoveContainer" containerID="b7f510739c5cad1bdf2a2f3c174f68f05871d5a9dcdd21736a50f998dc427607" Oct 09 15:43:06 crc kubenswrapper[4719]: I1009 15:43:06.414783 4719 scope.go:117] "RemoveContainer" containerID="6f4e8e0cc1db239355abd464bb77fd13e7965a23ab9c368ecd71abedaf080498" Oct 09 15:43:06 crc kubenswrapper[4719]: E1009 15:43:06.418154 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6f4e8e0cc1db239355abd464bb77fd13e7965a23ab9c368ecd71abedaf080498\": container with ID starting with 6f4e8e0cc1db239355abd464bb77fd13e7965a23ab9c368ecd71abedaf080498 not found: ID does not exist" containerID="6f4e8e0cc1db239355abd464bb77fd13e7965a23ab9c368ecd71abedaf080498" Oct 09 15:43:06 crc kubenswrapper[4719]: I1009 15:43:06.418400 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f4e8e0cc1db239355abd464bb77fd13e7965a23ab9c368ecd71abedaf080498"} err="failed to get container status \"6f4e8e0cc1db239355abd464bb77fd13e7965a23ab9c368ecd71abedaf080498\": rpc error: code = NotFound desc = could not find container \"6f4e8e0cc1db239355abd464bb77fd13e7965a23ab9c368ecd71abedaf080498\": container with ID starting with 6f4e8e0cc1db239355abd464bb77fd13e7965a23ab9c368ecd71abedaf080498 not found: ID does not exist" Oct 09 15:43:06 crc kubenswrapper[4719]: I1009 15:43:06.418432 4719 scope.go:117] "RemoveContainer" containerID="1c3837e092f72f360274d8c42fa516edf1383433336f06c881914bc6fa0b3760" Oct 09 15:43:06 crc kubenswrapper[4719]: E1009 15:43:06.419095 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c3837e092f72f360274d8c42fa516edf1383433336f06c881914bc6fa0b3760\": container with ID starting with 1c3837e092f72f360274d8c42fa516edf1383433336f06c881914bc6fa0b3760 not found: ID does not exist" containerID="1c3837e092f72f360274d8c42fa516edf1383433336f06c881914bc6fa0b3760" Oct 09 15:43:06 crc kubenswrapper[4719]: I1009 15:43:06.419154 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c3837e092f72f360274d8c42fa516edf1383433336f06c881914bc6fa0b3760"} err="failed to get container status \"1c3837e092f72f360274d8c42fa516edf1383433336f06c881914bc6fa0b3760\": rpc error: code = NotFound desc = could not find container \"1c3837e092f72f360274d8c42fa516edf1383433336f06c881914bc6fa0b3760\": container with ID 
starting with 1c3837e092f72f360274d8c42fa516edf1383433336f06c881914bc6fa0b3760 not found: ID does not exist" Oct 09 15:43:06 crc kubenswrapper[4719]: I1009 15:43:06.419194 4719 scope.go:117] "RemoveContainer" containerID="b7f510739c5cad1bdf2a2f3c174f68f05871d5a9dcdd21736a50f998dc427607" Oct 09 15:43:06 crc kubenswrapper[4719]: E1009 15:43:06.419609 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7f510739c5cad1bdf2a2f3c174f68f05871d5a9dcdd21736a50f998dc427607\": container with ID starting with b7f510739c5cad1bdf2a2f3c174f68f05871d5a9dcdd21736a50f998dc427607 not found: ID does not exist" containerID="b7f510739c5cad1bdf2a2f3c174f68f05871d5a9dcdd21736a50f998dc427607" Oct 09 15:43:06 crc kubenswrapper[4719]: I1009 15:43:06.419728 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7f510739c5cad1bdf2a2f3c174f68f05871d5a9dcdd21736a50f998dc427607"} err="failed to get container status \"b7f510739c5cad1bdf2a2f3c174f68f05871d5a9dcdd21736a50f998dc427607\": rpc error: code = NotFound desc = could not find container \"b7f510739c5cad1bdf2a2f3c174f68f05871d5a9dcdd21736a50f998dc427607\": container with ID starting with b7f510739c5cad1bdf2a2f3c174f68f05871d5a9dcdd21736a50f998dc427607 not found: ID does not exist" Oct 09 15:43:07 crc kubenswrapper[4719]: I1009 15:43:07.174086 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c07684c-f8ae-45e3-9f77-39a8d50d35bd" path="/var/lib/kubelet/pods/6c07684c-f8ae-45e3-9f77-39a8d50d35bd/volumes" Oct 09 15:43:24 crc kubenswrapper[4719]: I1009 15:43:24.492661 4719 scope.go:117] "RemoveContainer" containerID="ec6518a87040c131606f12df82fa5858982b2e63c47387b171518c644374efec" Oct 09 15:44:17 crc kubenswrapper[4719]: I1009 15:44:17.045839 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-create-gcfdh"] Oct 09 15:44:17 crc kubenswrapper[4719]: I1009 15:44:17.055942 
4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-create-gcfdh"]
Oct 09 15:44:17 crc kubenswrapper[4719]: I1009 15:44:17.175190 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8ea2169-9310-49ca-ac1f-f11eeb9b8e26" path="/var/lib/kubelet/pods/f8ea2169-9310-49ca-ac1f-f11eeb9b8e26/volumes"
Oct 09 15:44:24 crc kubenswrapper[4719]: I1009 15:44:24.557332 4719 scope.go:117] "RemoveContainer" containerID="9f4263077898c99175651f39fa6fa5bf6e0de796ee696a530c40b8c9b7f37380"
Oct 09 15:44:24 crc kubenswrapper[4719]: I1009 15:44:24.578656 4719 scope.go:117] "RemoveContainer" containerID="e631c472b961ca8b25e75e1dbbb45738f9771e87c0c550bd213d7799cee0180e"
Oct 09 15:44:24 crc kubenswrapper[4719]: I1009 15:44:24.604270 4719 scope.go:117] "RemoveContainer" containerID="eebce7882d9088534e11c0336746ad7a6869f4d1de7da554fcdd245ba8fa770f"
Oct 09 15:44:26 crc kubenswrapper[4719]: I1009 15:44:26.035429 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-8nm4m"]
Oct 09 15:44:26 crc kubenswrapper[4719]: I1009 15:44:26.045175 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-p6rrg"]
Oct 09 15:44:26 crc kubenswrapper[4719]: I1009 15:44:26.054244 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-8nm4m"]
Oct 09 15:44:26 crc kubenswrapper[4719]: I1009 15:44:26.063110 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-p6rrg"]
Oct 09 15:44:27 crc kubenswrapper[4719]: I1009 15:44:27.175904 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="726baf17-62cc-40f1-bb7c-204e357465d1" path="/var/lib/kubelet/pods/726baf17-62cc-40f1-bb7c-204e357465d1/volumes"
Oct 09 15:44:27 crc kubenswrapper[4719]: I1009 15:44:27.177199 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82195049-d846-4c69-a778-b85451e8d485" path="/var/lib/kubelet/pods/82195049-d846-4c69-a778-b85451e8d485/volumes"
Oct 09 15:44:28 crc kubenswrapper[4719]: I1009 15:44:28.047200 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-f584-account-create-h95kn"]
Oct 09 15:44:28 crc kubenswrapper[4719]: I1009 15:44:28.061128 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-f584-account-create-h95kn"]
Oct 09 15:44:29 crc kubenswrapper[4719]: I1009 15:44:29.134107 4719 generic.go:334] "Generic (PLEG): container finished" podID="75c7240d-03e4-40f9-a915-c85892b060d9" containerID="d272e8838a829958f4d8597faa2d581851a9e330d271244ad5cde1166710d58f" exitCode=0
Oct 09 15:44:29 crc kubenswrapper[4719]: I1009 15:44:29.134175 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jhrxh" event={"ID":"75c7240d-03e4-40f9-a915-c85892b060d9","Type":"ContainerDied","Data":"d272e8838a829958f4d8597faa2d581851a9e330d271244ad5cde1166710d58f"}
Oct 09 15:44:29 crc kubenswrapper[4719]: I1009 15:44:29.172831 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8898aee0-de9a-4957-a478-0f322abd395b" path="/var/lib/kubelet/pods/8898aee0-de9a-4957-a478-0f322abd395b/volumes"
Oct 09 15:44:30 crc kubenswrapper[4719]: I1009 15:44:30.537672 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jhrxh"
Oct 09 15:44:30 crc kubenswrapper[4719]: I1009 15:44:30.687939 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75c7240d-03e4-40f9-a915-c85892b060d9-bootstrap-combined-ca-bundle\") pod \"75c7240d-03e4-40f9-a915-c85892b060d9\" (UID: \"75c7240d-03e4-40f9-a915-c85892b060d9\") "
Oct 09 15:44:30 crc kubenswrapper[4719]: I1009 15:44:30.687976 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82djm\" (UniqueName: \"kubernetes.io/projected/75c7240d-03e4-40f9-a915-c85892b060d9-kube-api-access-82djm\") pod \"75c7240d-03e4-40f9-a915-c85892b060d9\" (UID: \"75c7240d-03e4-40f9-a915-c85892b060d9\") "
Oct 09 15:44:30 crc kubenswrapper[4719]: I1009 15:44:30.688014 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/75c7240d-03e4-40f9-a915-c85892b060d9-inventory\") pod \"75c7240d-03e4-40f9-a915-c85892b060d9\" (UID: \"75c7240d-03e4-40f9-a915-c85892b060d9\") "
Oct 09 15:44:30 crc kubenswrapper[4719]: I1009 15:44:30.688180 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/75c7240d-03e4-40f9-a915-c85892b060d9-ssh-key\") pod \"75c7240d-03e4-40f9-a915-c85892b060d9\" (UID: \"75c7240d-03e4-40f9-a915-c85892b060d9\") "
Oct 09 15:44:30 crc kubenswrapper[4719]: I1009 15:44:30.693784 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75c7240d-03e4-40f9-a915-c85892b060d9-kube-api-access-82djm" (OuterVolumeSpecName: "kube-api-access-82djm") pod "75c7240d-03e4-40f9-a915-c85892b060d9" (UID: "75c7240d-03e4-40f9-a915-c85892b060d9"). InnerVolumeSpecName "kube-api-access-82djm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 15:44:30 crc kubenswrapper[4719]: I1009 15:44:30.694131 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75c7240d-03e4-40f9-a915-c85892b060d9-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "75c7240d-03e4-40f9-a915-c85892b060d9" (UID: "75c7240d-03e4-40f9-a915-c85892b060d9"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 15:44:30 crc kubenswrapper[4719]: I1009 15:44:30.715256 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75c7240d-03e4-40f9-a915-c85892b060d9-inventory" (OuterVolumeSpecName: "inventory") pod "75c7240d-03e4-40f9-a915-c85892b060d9" (UID: "75c7240d-03e4-40f9-a915-c85892b060d9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 15:44:30 crc kubenswrapper[4719]: I1009 15:44:30.717907 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75c7240d-03e4-40f9-a915-c85892b060d9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "75c7240d-03e4-40f9-a915-c85892b060d9" (UID: "75c7240d-03e4-40f9-a915-c85892b060d9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 15:44:30 crc kubenswrapper[4719]: I1009 15:44:30.790128 4719 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/75c7240d-03e4-40f9-a915-c85892b060d9-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 09 15:44:30 crc kubenswrapper[4719]: I1009 15:44:30.790159 4719 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75c7240d-03e4-40f9-a915-c85892b060d9-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 09 15:44:30 crc kubenswrapper[4719]: I1009 15:44:30.790171 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82djm\" (UniqueName: \"kubernetes.io/projected/75c7240d-03e4-40f9-a915-c85892b060d9-kube-api-access-82djm\") on node \"crc\" DevicePath \"\""
Oct 09 15:44:30 crc kubenswrapper[4719]: I1009 15:44:30.790180 4719 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/75c7240d-03e4-40f9-a915-c85892b060d9-inventory\") on node \"crc\" DevicePath \"\""
Oct 09 15:44:31 crc kubenswrapper[4719]: I1009 15:44:31.151056 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jhrxh" event={"ID":"75c7240d-03e4-40f9-a915-c85892b060d9","Type":"ContainerDied","Data":"6b1d47eb69528d54d2ef5b0285608224d5f73454be42425086e8abb31d2118e6"}
Oct 09 15:44:31 crc kubenswrapper[4719]: I1009 15:44:31.151095 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b1d47eb69528d54d2ef5b0285608224d5f73454be42425086e8abb31d2118e6"
Oct 09 15:44:31 crc kubenswrapper[4719]: I1009 15:44:31.151098 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jhrxh"
Oct 09 15:44:31 crc kubenswrapper[4719]: I1009 15:44:31.242996 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-r8nbl"]
Oct 09 15:44:31 crc kubenswrapper[4719]: E1009 15:44:31.243546 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c07684c-f8ae-45e3-9f77-39a8d50d35bd" containerName="extract-content"
Oct 09 15:44:31 crc kubenswrapper[4719]: I1009 15:44:31.243570 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c07684c-f8ae-45e3-9f77-39a8d50d35bd" containerName="extract-content"
Oct 09 15:44:31 crc kubenswrapper[4719]: E1009 15:44:31.243584 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75c7240d-03e4-40f9-a915-c85892b060d9" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Oct 09 15:44:31 crc kubenswrapper[4719]: I1009 15:44:31.243594 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="75c7240d-03e4-40f9-a915-c85892b060d9" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Oct 09 15:44:31 crc kubenswrapper[4719]: E1009 15:44:31.243615 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c07684c-f8ae-45e3-9f77-39a8d50d35bd" containerName="extract-utilities"
Oct 09 15:44:31 crc kubenswrapper[4719]: I1009 15:44:31.243622 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c07684c-f8ae-45e3-9f77-39a8d50d35bd" containerName="extract-utilities"
Oct 09 15:44:31 crc kubenswrapper[4719]: E1009 15:44:31.243675 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c07684c-f8ae-45e3-9f77-39a8d50d35bd" containerName="registry-server"
Oct 09 15:44:31 crc kubenswrapper[4719]: I1009 15:44:31.243682 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c07684c-f8ae-45e3-9f77-39a8d50d35bd" containerName="registry-server"
Oct 09 15:44:31 crc kubenswrapper[4719]: I1009 15:44:31.243911 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="75c7240d-03e4-40f9-a915-c85892b060d9" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Oct 09 15:44:31 crc kubenswrapper[4719]: I1009 15:44:31.243952 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c07684c-f8ae-45e3-9f77-39a8d50d35bd" containerName="registry-server"
Oct 09 15:44:31 crc kubenswrapper[4719]: I1009 15:44:31.244748 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-r8nbl"
Oct 09 15:44:31 crc kubenswrapper[4719]: I1009 15:44:31.249470 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 09 15:44:31 crc kubenswrapper[4719]: I1009 15:44:31.249657 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ssvsw"
Oct 09 15:44:31 crc kubenswrapper[4719]: I1009 15:44:31.249711 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 09 15:44:31 crc kubenswrapper[4719]: I1009 15:44:31.250631 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 09 15:44:31 crc kubenswrapper[4719]: I1009 15:44:31.266692 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-r8nbl"]
Oct 09 15:44:31 crc kubenswrapper[4719]: I1009 15:44:31.299928 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49f3b180-01ca-489f-9a12-5e22d186b1b7-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-r8nbl\" (UID: \"49f3b180-01ca-489f-9a12-5e22d186b1b7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-r8nbl"
Oct 09 15:44:31 crc kubenswrapper[4719]: I1009 15:44:31.300127 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwcgp\" (UniqueName: \"kubernetes.io/projected/49f3b180-01ca-489f-9a12-5e22d186b1b7-kube-api-access-pwcgp\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-r8nbl\" (UID: \"49f3b180-01ca-489f-9a12-5e22d186b1b7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-r8nbl"
Oct 09 15:44:31 crc kubenswrapper[4719]: I1009 15:44:31.300188 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49f3b180-01ca-489f-9a12-5e22d186b1b7-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-r8nbl\" (UID: \"49f3b180-01ca-489f-9a12-5e22d186b1b7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-r8nbl"
Oct 09 15:44:31 crc kubenswrapper[4719]: I1009 15:44:31.401746 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49f3b180-01ca-489f-9a12-5e22d186b1b7-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-r8nbl\" (UID: \"49f3b180-01ca-489f-9a12-5e22d186b1b7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-r8nbl"
Oct 09 15:44:31 crc kubenswrapper[4719]: I1009 15:44:31.401844 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49f3b180-01ca-489f-9a12-5e22d186b1b7-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-r8nbl\" (UID: \"49f3b180-01ca-489f-9a12-5e22d186b1b7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-r8nbl"
Oct 09 15:44:31 crc kubenswrapper[4719]: I1009 15:44:31.401933 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwcgp\" (UniqueName: \"kubernetes.io/projected/49f3b180-01ca-489f-9a12-5e22d186b1b7-kube-api-access-pwcgp\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-r8nbl\" (UID: \"49f3b180-01ca-489f-9a12-5e22d186b1b7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-r8nbl"
Oct 09 15:44:31 crc kubenswrapper[4719]: I1009 15:44:31.405639 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49f3b180-01ca-489f-9a12-5e22d186b1b7-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-r8nbl\" (UID: \"49f3b180-01ca-489f-9a12-5e22d186b1b7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-r8nbl"
Oct 09 15:44:31 crc kubenswrapper[4719]: I1009 15:44:31.406918 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49f3b180-01ca-489f-9a12-5e22d186b1b7-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-r8nbl\" (UID: \"49f3b180-01ca-489f-9a12-5e22d186b1b7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-r8nbl"
Oct 09 15:44:31 crc kubenswrapper[4719]: I1009 15:44:31.418557 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwcgp\" (UniqueName: \"kubernetes.io/projected/49f3b180-01ca-489f-9a12-5e22d186b1b7-kube-api-access-pwcgp\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-r8nbl\" (UID: \"49f3b180-01ca-489f-9a12-5e22d186b1b7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-r8nbl"
Oct 09 15:44:31 crc kubenswrapper[4719]: I1009 15:44:31.564326 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-r8nbl"
Oct 09 15:44:32 crc kubenswrapper[4719]: I1009 15:44:32.055192 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-r8nbl"]
Oct 09 15:44:32 crc kubenswrapper[4719]: I1009 15:44:32.060139 4719 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 09 15:44:32 crc kubenswrapper[4719]: I1009 15:44:32.160067 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-r8nbl" event={"ID":"49f3b180-01ca-489f-9a12-5e22d186b1b7","Type":"ContainerStarted","Data":"3271336da124477bf6ed333b3afc7340caa7511ae5dac99f88462f66b447d9a8"}
Oct 09 15:44:33 crc kubenswrapper[4719]: I1009 15:44:33.185366 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-r8nbl" event={"ID":"49f3b180-01ca-489f-9a12-5e22d186b1b7","Type":"ContainerStarted","Data":"6c45c328d2ba5ea53cf5ad1a3fcf8e0b7072822c06b97c195599bac7836e440a"}
Oct 09 15:44:33 crc kubenswrapper[4719]: I1009 15:44:33.212196 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-r8nbl" podStartSLOduration=1.720577568 podStartE2EDuration="2.212176082s" podCreationTimestamp="2025-10-09 15:44:31 +0000 UTC" firstStartedPulling="2025-10-09 15:44:32.059897353 +0000 UTC m=+1577.569608638" lastFinishedPulling="2025-10-09 15:44:32.551495867 +0000 UTC m=+1578.061207152" observedRunningTime="2025-10-09 15:44:33.202264064 +0000 UTC m=+1578.711975359" watchObservedRunningTime="2025-10-09 15:44:33.212176082 +0000 UTC m=+1578.721887377"
Oct 09 15:44:35 crc kubenswrapper[4719]: I1009 15:44:35.040777 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-fc37-account-create-lfn7t"]
Oct 09 15:44:35 crc kubenswrapper[4719]: I1009 15:44:35.050859 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-fc37-account-create-lfn7t"]
Oct 09 15:44:35 crc kubenswrapper[4719]: I1009 15:44:35.179300 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="baf7be65-641f-4d6a-a15c-ac903de135ab" path="/var/lib/kubelet/pods/baf7be65-641f-4d6a-a15c-ac903de135ab/volumes"
Oct 09 15:44:44 crc kubenswrapper[4719]: I1009 15:44:44.031926 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-c1e6-account-create-wc9df"]
Oct 09 15:44:44 crc kubenswrapper[4719]: I1009 15:44:44.042518 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-c1e6-account-create-wc9df"]
Oct 09 15:44:45 crc kubenswrapper[4719]: I1009 15:44:45.172334 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a14bea55-0b69-471d-8bf2-86963f482288" path="/var/lib/kubelet/pods/a14bea55-0b69-471d-8bf2-86963f482288/volumes"
Oct 09 15:44:51 crc kubenswrapper[4719]: I1009 15:44:51.023807 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-ngwlj"]
Oct 09 15:44:51 crc kubenswrapper[4719]: I1009 15:44:51.033242 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-ngwlj"]
Oct 09 15:44:51 crc kubenswrapper[4719]: I1009 15:44:51.174492 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cfd57d6-febf-4fcc-878e-aba5c948348a" path="/var/lib/kubelet/pods/8cfd57d6-febf-4fcc-878e-aba5c948348a/volumes"
Oct 09 15:45:00 crc kubenswrapper[4719]: I1009 15:45:00.157423 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333745-wfzhh"]
Oct 09 15:45:00 crc kubenswrapper[4719]: I1009 15:45:00.159732 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333745-wfzhh"
Oct 09 15:45:00 crc kubenswrapper[4719]: I1009 15:45:00.162449 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 09 15:45:00 crc kubenswrapper[4719]: I1009 15:45:00.163389 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 09 15:45:00 crc kubenswrapper[4719]: I1009 15:45:00.173740 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333745-wfzhh"]
Oct 09 15:45:00 crc kubenswrapper[4719]: I1009 15:45:00.271731 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8ff99c39-7c61-46a2-bb84-05cb745323bf-secret-volume\") pod \"collect-profiles-29333745-wfzhh\" (UID: \"8ff99c39-7c61-46a2-bb84-05cb745323bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333745-wfzhh"
Oct 09 15:45:00 crc kubenswrapper[4719]: I1009 15:45:00.271920 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ff99c39-7c61-46a2-bb84-05cb745323bf-config-volume\") pod \"collect-profiles-29333745-wfzhh\" (UID: \"8ff99c39-7c61-46a2-bb84-05cb745323bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333745-wfzhh"
Oct 09 15:45:00 crc kubenswrapper[4719]: I1009 15:45:00.272031 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9g5p\" (UniqueName: \"kubernetes.io/projected/8ff99c39-7c61-46a2-bb84-05cb745323bf-kube-api-access-v9g5p\") pod \"collect-profiles-29333745-wfzhh\" (UID: \"8ff99c39-7c61-46a2-bb84-05cb745323bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333745-wfzhh"
Oct 09 15:45:00 crc kubenswrapper[4719]: I1009 15:45:00.374702 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8ff99c39-7c61-46a2-bb84-05cb745323bf-secret-volume\") pod \"collect-profiles-29333745-wfzhh\" (UID: \"8ff99c39-7c61-46a2-bb84-05cb745323bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333745-wfzhh"
Oct 09 15:45:00 crc kubenswrapper[4719]: I1009 15:45:00.375758 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ff99c39-7c61-46a2-bb84-05cb745323bf-config-volume\") pod \"collect-profiles-29333745-wfzhh\" (UID: \"8ff99c39-7c61-46a2-bb84-05cb745323bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333745-wfzhh"
Oct 09 15:45:00 crc kubenswrapper[4719]: I1009 15:45:00.375971 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9g5p\" (UniqueName: \"kubernetes.io/projected/8ff99c39-7c61-46a2-bb84-05cb745323bf-kube-api-access-v9g5p\") pod \"collect-profiles-29333745-wfzhh\" (UID: \"8ff99c39-7c61-46a2-bb84-05cb745323bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333745-wfzhh"
Oct 09 15:45:00 crc kubenswrapper[4719]: I1009 15:45:00.376738 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ff99c39-7c61-46a2-bb84-05cb745323bf-config-volume\") pod \"collect-profiles-29333745-wfzhh\" (UID: \"8ff99c39-7c61-46a2-bb84-05cb745323bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333745-wfzhh"
Oct 09 15:45:00 crc kubenswrapper[4719]: I1009 15:45:00.385191 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8ff99c39-7c61-46a2-bb84-05cb745323bf-secret-volume\") pod \"collect-profiles-29333745-wfzhh\" (UID: \"8ff99c39-7c61-46a2-bb84-05cb745323bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333745-wfzhh"
Oct 09 15:45:00 crc kubenswrapper[4719]: I1009 15:45:00.393016 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9g5p\" (UniqueName: \"kubernetes.io/projected/8ff99c39-7c61-46a2-bb84-05cb745323bf-kube-api-access-v9g5p\") pod \"collect-profiles-29333745-wfzhh\" (UID: \"8ff99c39-7c61-46a2-bb84-05cb745323bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333745-wfzhh"
Oct 09 15:45:00 crc kubenswrapper[4719]: I1009 15:45:00.487969 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333745-wfzhh"
Oct 09 15:45:00 crc kubenswrapper[4719]: I1009 15:45:00.937236 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333745-wfzhh"]
Oct 09 15:45:01 crc kubenswrapper[4719]: I1009 15:45:01.433212 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333745-wfzhh" event={"ID":"8ff99c39-7c61-46a2-bb84-05cb745323bf","Type":"ContainerStarted","Data":"8515f28740ec0dc7d4038f5effd2910f167b9b8df00ae6c2177db8c435b574cf"}
Oct 09 15:45:01 crc kubenswrapper[4719]: I1009 15:45:01.433587 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333745-wfzhh" event={"ID":"8ff99c39-7c61-46a2-bb84-05cb745323bf","Type":"ContainerStarted","Data":"e6ff2800a7a827a9458be00669bf571317264dae90b95a0269b7ce3b979e1379"}
Oct 09 15:45:01 crc kubenswrapper[4719]: I1009 15:45:01.455455 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29333745-wfzhh" podStartSLOduration=1.45542968 podStartE2EDuration="1.45542968s" podCreationTimestamp="2025-10-09 15:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 15:45:01.451926378 +0000 UTC m=+1606.961637683" watchObservedRunningTime="2025-10-09 15:45:01.45542968 +0000 UTC m=+1606.965140975"
Oct 09 15:45:02 crc kubenswrapper[4719]: I1009 15:45:02.030112 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-ac57-account-create-g6pg7"]
Oct 09 15:45:02 crc kubenswrapper[4719]: I1009 15:45:02.039083 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-ac57-account-create-g6pg7"]
Oct 09 15:45:02 crc kubenswrapper[4719]: I1009 15:45:02.444711 4719 generic.go:334] "Generic (PLEG): container finished" podID="8ff99c39-7c61-46a2-bb84-05cb745323bf" containerID="8515f28740ec0dc7d4038f5effd2910f167b9b8df00ae6c2177db8c435b574cf" exitCode=0
Oct 09 15:45:02 crc kubenswrapper[4719]: I1009 15:45:02.444757 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333745-wfzhh" event={"ID":"8ff99c39-7c61-46a2-bb84-05cb745323bf","Type":"ContainerDied","Data":"8515f28740ec0dc7d4038f5effd2910f167b9b8df00ae6c2177db8c435b574cf"}
Oct 09 15:45:03 crc kubenswrapper[4719]: I1009 15:45:03.173760 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ee9b71f-3824-456a-9466-6b39fb613885" path="/var/lib/kubelet/pods/9ee9b71f-3824-456a-9466-6b39fb613885/volumes"
Oct 09 15:45:03 crc kubenswrapper[4719]: I1009 15:45:03.796329 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333745-wfzhh"
Oct 09 15:45:03 crc kubenswrapper[4719]: I1009 15:45:03.849280 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ff99c39-7c61-46a2-bb84-05cb745323bf-config-volume\") pod \"8ff99c39-7c61-46a2-bb84-05cb745323bf\" (UID: \"8ff99c39-7c61-46a2-bb84-05cb745323bf\") "
Oct 09 15:45:03 crc kubenswrapper[4719]: I1009 15:45:03.849380 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9g5p\" (UniqueName: \"kubernetes.io/projected/8ff99c39-7c61-46a2-bb84-05cb745323bf-kube-api-access-v9g5p\") pod \"8ff99c39-7c61-46a2-bb84-05cb745323bf\" (UID: \"8ff99c39-7c61-46a2-bb84-05cb745323bf\") "
Oct 09 15:45:03 crc kubenswrapper[4719]: I1009 15:45:03.849549 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8ff99c39-7c61-46a2-bb84-05cb745323bf-secret-volume\") pod \"8ff99c39-7c61-46a2-bb84-05cb745323bf\" (UID: \"8ff99c39-7c61-46a2-bb84-05cb745323bf\") "
Oct 09 15:45:03 crc kubenswrapper[4719]: I1009 15:45:03.850225 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ff99c39-7c61-46a2-bb84-05cb745323bf-config-volume" (OuterVolumeSpecName: "config-volume") pod "8ff99c39-7c61-46a2-bb84-05cb745323bf" (UID: "8ff99c39-7c61-46a2-bb84-05cb745323bf"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 15:45:03 crc kubenswrapper[4719]: I1009 15:45:03.855575 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ff99c39-7c61-46a2-bb84-05cb745323bf-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8ff99c39-7c61-46a2-bb84-05cb745323bf" (UID: "8ff99c39-7c61-46a2-bb84-05cb745323bf"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 15:45:03 crc kubenswrapper[4719]: I1009 15:45:03.856017 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ff99c39-7c61-46a2-bb84-05cb745323bf-kube-api-access-v9g5p" (OuterVolumeSpecName: "kube-api-access-v9g5p") pod "8ff99c39-7c61-46a2-bb84-05cb745323bf" (UID: "8ff99c39-7c61-46a2-bb84-05cb745323bf"). InnerVolumeSpecName "kube-api-access-v9g5p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 15:45:03 crc kubenswrapper[4719]: I1009 15:45:03.952570 4719 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8ff99c39-7c61-46a2-bb84-05cb745323bf-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 09 15:45:03 crc kubenswrapper[4719]: I1009 15:45:03.952637 4719 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ff99c39-7c61-46a2-bb84-05cb745323bf-config-volume\") on node \"crc\" DevicePath \"\""
Oct 09 15:45:03 crc kubenswrapper[4719]: I1009 15:45:03.952651 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9g5p\" (UniqueName: \"kubernetes.io/projected/8ff99c39-7c61-46a2-bb84-05cb745323bf-kube-api-access-v9g5p\") on node \"crc\" DevicePath \"\""
Oct 09 15:45:04 crc kubenswrapper[4719]: I1009 15:45:04.465986 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333745-wfzhh" event={"ID":"8ff99c39-7c61-46a2-bb84-05cb745323bf","Type":"ContainerDied","Data":"e6ff2800a7a827a9458be00669bf571317264dae90b95a0269b7ce3b979e1379"}
Oct 09 15:45:04 crc kubenswrapper[4719]: I1009 15:45:04.466028 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6ff2800a7a827a9458be00669bf571317264dae90b95a0269b7ce3b979e1379"
Oct 09 15:45:04 crc kubenswrapper[4719]: I1009 15:45:04.466053 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333745-wfzhh"
Oct 09 15:45:06 crc kubenswrapper[4719]: I1009 15:45:06.977056 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 09 15:45:06 crc kubenswrapper[4719]: I1009 15:45:06.978529 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 09 15:45:08 crc kubenswrapper[4719]: I1009 15:45:08.045218 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-t5rw4"]
Oct 09 15:45:08 crc kubenswrapper[4719]: I1009 15:45:08.053981 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-hcncf"]
Oct 09 15:45:08 crc kubenswrapper[4719]: I1009 15:45:08.063150 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-l5gd9"]
Oct 09 15:45:08 crc kubenswrapper[4719]: I1009 15:45:08.070336 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-hcncf"]
Oct 09 15:45:08 crc kubenswrapper[4719]: I1009 15:45:08.076996 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-t5rw4"]
Oct 09 15:45:08 crc kubenswrapper[4719]: I1009 15:45:08.083863 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-l5gd9"]
Oct 09 15:45:09 crc kubenswrapper[4719]: I1009 15:45:09.173959 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="527d79a8-0ef1-485b-94a1-eff7ee279a5a" path="/var/lib/kubelet/pods/527d79a8-0ef1-485b-94a1-eff7ee279a5a/volumes"
Oct 09 15:45:09 crc kubenswrapper[4719]: I1009 15:45:09.174638 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b475a082-9d75-4284-8e09-35f2a96501b9" path="/var/lib/kubelet/pods/b475a082-9d75-4284-8e09-35f2a96501b9/volumes"
Oct 09 15:45:09 crc kubenswrapper[4719]: I1009 15:45:09.175228 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f560e5ac-9e54-4a07-a0d4-88a94c2004c5" path="/var/lib/kubelet/pods/f560e5ac-9e54-4a07-a0d4-88a94c2004c5/volumes"
Oct 09 15:45:22 crc kubenswrapper[4719]: I1009 15:45:22.032788 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-whtzg"]
Oct 09 15:45:22 crc kubenswrapper[4719]: I1009 15:45:22.041568 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-sync-qs479"]
Oct 09 15:45:22 crc kubenswrapper[4719]: I1009 15:45:22.056324 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-whtzg"]
Oct 09 15:45:22 crc kubenswrapper[4719]: I1009 15:45:22.064296 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-sync-qs479"]
Oct 09 15:45:23 crc kubenswrapper[4719]: I1009 15:45:23.173549 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1142ae88-4e1f-4957-b735-23d64604712f" path="/var/lib/kubelet/pods/1142ae88-4e1f-4957-b735-23d64604712f/volumes"
Oct 09 15:45:23 crc kubenswrapper[4719]: I1009 15:45:23.174405 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="352eb1e1-d9bb-4184-9512-7cb1e9787edb" path="/var/lib/kubelet/pods/352eb1e1-d9bb-4184-9512-7cb1e9787edb/volumes"
Oct 09 15:45:24 crc kubenswrapper[4719]: I1009 15:45:24.729331 4719 scope.go:117] "RemoveContainer" containerID="e295f1d7595245f40b732ea498b1e7088f654781f310a36b2ac93249892d8362"
Oct 09 15:45:24 crc kubenswrapper[4719]: I1009 15:45:24.757871 4719 scope.go:117] "RemoveContainer" containerID="f9c8b01f404ca8805122237e75652c2b8a571000f16a96338eb4d76b3aae1182"
Oct 09 15:45:24 crc kubenswrapper[4719]: I1009 15:45:24.816153 4719 scope.go:117] "RemoveContainer" containerID="de6c6165788ef55db5e7679468882258697386d318be78b26eee189543dcc45e"
Oct 09 15:45:24 crc kubenswrapper[4719]: I1009 15:45:24.871272 4719 scope.go:117] "RemoveContainer" containerID="40ff740b355745cf960a303461ef82e422aacf339afb07c86ac099b6c958acff"
Oct 09 15:45:24 crc kubenswrapper[4719]: I1009 15:45:24.928116 4719 scope.go:117] "RemoveContainer" containerID="ac9b9a20c18c91e91e4911f84dd9d8a0f4472764bf3925c08d1508d968d868d9"
Oct 09 15:45:24 crc kubenswrapper[4719]: I1009 15:45:24.982204 4719 scope.go:117] "RemoveContainer" containerID="b94bc5a098d2dcd72a5e07df74fded744b44e2c318c7a38a4459bf7b58c239a7"
Oct 09 15:45:25 crc kubenswrapper[4719]: I1009 15:45:25.031277 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-60a1-account-create-8lvrq"]
Oct 09 15:45:25 crc kubenswrapper[4719]: I1009 15:45:25.041595 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-60a1-account-create-8lvrq"]
Oct 09 15:45:25 crc kubenswrapper[4719]: I1009 15:45:25.043338 4719 scope.go:117] "RemoveContainer" containerID="f4a393ba44362546342ffae25032fec389bab28ffef9c838dd19f0572e59fc1a"
Oct 09 15:45:25 crc kubenswrapper[4719]: I1009 15:45:25.064467 4719 scope.go:117] "RemoveContainer" containerID="8adc252ae0feb6c2483c2829e2f3067b3cb04b2cbe66ec68f79ee7d13e96a21c"
Oct 09 15:45:25 crc kubenswrapper[4719]: I1009 15:45:25.082866 4719 scope.go:117] "RemoveContainer" containerID="4dc445a265df781fb85e53d3b213cdcc351e2ae28b87a95a76e264ff7ab3032f"
Oct 09 15:45:25 crc kubenswrapper[4719]: I1009 15:45:25.099321 4719 scope.go:117] "RemoveContainer" containerID="e7bc012572a6a34b51dd2c192460f991ac66243f04a379944c9b7111d4823f07"
Oct 09 15:45:25 crc kubenswrapper[4719]: I1009 15:45:25.127573 4719 scope.go:117] "RemoveContainer" containerID="bde976d8d7298c597ed2591752bb5c5b31dd4c52cf3165d99be7ce2c48d999a2"
Oct 09 15:45:25 crc kubenswrapper[4719]: I1009 15:45:25.150150 4719 scope.go:117] "RemoveContainer" containerID="ba527df7ae0358360bde7c2cf1f92ba95bb2669735179bd8458ac4af5a2aa4a9"
Oct 09 15:45:25 crc kubenswrapper[4719]: I1009 15:45:25.171557 4719 scope.go:117] "RemoveContainer" containerID="fc0921523ad54c212ee320465523c436b11ac7952117fe7ad106462b5529fca6"
Oct 09 15:45:25 crc kubenswrapper[4719]: I1009 15:45:25.175939 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fca1d680-ba45-4687-bce1-4dc7d3e029f4" path="/var/lib/kubelet/pods/fca1d680-ba45-4687-bce1-4dc7d3e029f4/volumes"
Oct 09 15:45:25 crc kubenswrapper[4719]: I1009 15:45:25.190732 4719 scope.go:117] "RemoveContainer" containerID="530c831ba0dc13bf93ad6bb6d6a41fd0d4ab1867ad66fc150dbd54682573a6cc"
Oct 09 15:45:26 crc kubenswrapper[4719]: I1009 15:45:26.035602 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-2a57-account-create-9ctdx"]
Oct 09 15:45:26 crc kubenswrapper[4719]: I1009 15:45:26.044916 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-d4a7-account-create-fnx7r"]
Oct 09 15:45:26 crc kubenswrapper[4719]: I1009 15:45:26.063966 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-2a57-account-create-9ctdx"]
Oct 09 15:45:26 crc kubenswrapper[4719]: I1009 15:45:26.075819 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-d4a7-account-create-fnx7r"]
Oct 09 15:45:27 crc kubenswrapper[4719]: I1009 15:45:27.182791 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c57b6800-99e7-4fc0-a09e-d963495a39c8" path="/var/lib/kubelet/pods/c57b6800-99e7-4fc0-a09e-d963495a39c8/volumes"
Oct 09 15:45:27 crc kubenswrapper[4719]: I1009 15:45:27.184008 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccfcbb78-ac73-4cea-a8c3-f676443f187b"
path="/var/lib/kubelet/pods/ccfcbb78-ac73-4cea-a8c3-f676443f187b/volumes" Oct 09 15:45:36 crc kubenswrapper[4719]: I1009 15:45:36.977033 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 15:45:36 crc kubenswrapper[4719]: I1009 15:45:36.977616 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 15:45:42 crc kubenswrapper[4719]: I1009 15:45:42.039553 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-xs6f4"] Oct 09 15:45:42 crc kubenswrapper[4719]: I1009 15:45:42.048330 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-xs6f4"] Oct 09 15:45:43 crc kubenswrapper[4719]: I1009 15:45:43.173086 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52932375-ade4-4056-a4f8-6758db0df52f" path="/var/lib/kubelet/pods/52932375-ade4-4056-a4f8-6758db0df52f/volumes" Oct 09 15:45:56 crc kubenswrapper[4719]: I1009 15:45:56.036269 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-cqxgt"] Oct 09 15:45:56 crc kubenswrapper[4719]: I1009 15:45:56.045021 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-cqxgt"] Oct 09 15:45:56 crc kubenswrapper[4719]: I1009 15:45:56.058038 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-d2888"] Oct 09 15:45:56 crc kubenswrapper[4719]: I1009 15:45:56.065863 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/placement-db-sync-d2888"] Oct 09 15:45:57 crc kubenswrapper[4719]: I1009 15:45:57.173576 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19cf902a-77e9-4e57-89d0-36765e27f361" path="/var/lib/kubelet/pods/19cf902a-77e9-4e57-89d0-36765e27f361/volumes" Oct 09 15:45:57 crc kubenswrapper[4719]: I1009 15:45:57.174156 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e76eab7-abf1-4d15-9f62-52aefceaf1cd" path="/var/lib/kubelet/pods/3e76eab7-abf1-4d15-9f62-52aefceaf1cd/volumes" Oct 09 15:46:06 crc kubenswrapper[4719]: I1009 15:46:06.976557 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 15:46:06 crc kubenswrapper[4719]: I1009 15:46:06.977649 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 15:46:06 crc kubenswrapper[4719]: I1009 15:46:06.977796 4719 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" Oct 09 15:46:06 crc kubenswrapper[4719]: I1009 15:46:06.978523 4719 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"55848799feb0f83996cad9faea64b8bd81a5055bee1fd116f8ee1236dc974c4b"} pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 15:46:06 crc kubenswrapper[4719]: I1009 15:46:06.978593 4719 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" containerID="cri-o://55848799feb0f83996cad9faea64b8bd81a5055bee1fd116f8ee1236dc974c4b" gracePeriod=600 Oct 09 15:46:07 crc kubenswrapper[4719]: I1009 15:46:07.079972 4719 generic.go:334] "Generic (PLEG): container finished" podID="49f3b180-01ca-489f-9a12-5e22d186b1b7" containerID="6c45c328d2ba5ea53cf5ad1a3fcf8e0b7072822c06b97c195599bac7836e440a" exitCode=0 Oct 09 15:46:07 crc kubenswrapper[4719]: I1009 15:46:07.080010 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-r8nbl" event={"ID":"49f3b180-01ca-489f-9a12-5e22d186b1b7","Type":"ContainerDied","Data":"6c45c328d2ba5ea53cf5ad1a3fcf8e0b7072822c06b97c195599bac7836e440a"} Oct 09 15:46:07 crc kubenswrapper[4719]: E1009 15:46:07.136317 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 15:46:08 crc kubenswrapper[4719]: I1009 15:46:08.091041 4719 generic.go:334] "Generic (PLEG): container finished" podID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerID="55848799feb0f83996cad9faea64b8bd81a5055bee1fd116f8ee1236dc974c4b" exitCode=0 Oct 09 15:46:08 crc kubenswrapper[4719]: I1009 15:46:08.091119 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" 
event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerDied","Data":"55848799feb0f83996cad9faea64b8bd81a5055bee1fd116f8ee1236dc974c4b"} Oct 09 15:46:08 crc kubenswrapper[4719]: I1009 15:46:08.091188 4719 scope.go:117] "RemoveContainer" containerID="a69453ebf4e1aaf18164eaf7feb2c37cbe0797331962fcb2782850eab34c0940" Oct 09 15:46:08 crc kubenswrapper[4719]: I1009 15:46:08.091959 4719 scope.go:117] "RemoveContainer" containerID="55848799feb0f83996cad9faea64b8bd81a5055bee1fd116f8ee1236dc974c4b" Oct 09 15:46:08 crc kubenswrapper[4719]: E1009 15:46:08.092256 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 15:46:08 crc kubenswrapper[4719]: I1009 15:46:08.500556 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-r8nbl" Oct 09 15:46:08 crc kubenswrapper[4719]: I1009 15:46:08.607342 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49f3b180-01ca-489f-9a12-5e22d186b1b7-inventory\") pod \"49f3b180-01ca-489f-9a12-5e22d186b1b7\" (UID: \"49f3b180-01ca-489f-9a12-5e22d186b1b7\") " Oct 09 15:46:08 crc kubenswrapper[4719]: I1009 15:46:08.607405 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwcgp\" (UniqueName: \"kubernetes.io/projected/49f3b180-01ca-489f-9a12-5e22d186b1b7-kube-api-access-pwcgp\") pod \"49f3b180-01ca-489f-9a12-5e22d186b1b7\" (UID: \"49f3b180-01ca-489f-9a12-5e22d186b1b7\") " Oct 09 15:46:08 crc kubenswrapper[4719]: I1009 15:46:08.607478 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49f3b180-01ca-489f-9a12-5e22d186b1b7-ssh-key\") pod \"49f3b180-01ca-489f-9a12-5e22d186b1b7\" (UID: \"49f3b180-01ca-489f-9a12-5e22d186b1b7\") " Oct 09 15:46:08 crc kubenswrapper[4719]: I1009 15:46:08.619566 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49f3b180-01ca-489f-9a12-5e22d186b1b7-kube-api-access-pwcgp" (OuterVolumeSpecName: "kube-api-access-pwcgp") pod "49f3b180-01ca-489f-9a12-5e22d186b1b7" (UID: "49f3b180-01ca-489f-9a12-5e22d186b1b7"). InnerVolumeSpecName "kube-api-access-pwcgp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:46:08 crc kubenswrapper[4719]: I1009 15:46:08.634963 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49f3b180-01ca-489f-9a12-5e22d186b1b7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "49f3b180-01ca-489f-9a12-5e22d186b1b7" (UID: "49f3b180-01ca-489f-9a12-5e22d186b1b7"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:46:08 crc kubenswrapper[4719]: I1009 15:46:08.638963 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49f3b180-01ca-489f-9a12-5e22d186b1b7-inventory" (OuterVolumeSpecName: "inventory") pod "49f3b180-01ca-489f-9a12-5e22d186b1b7" (UID: "49f3b180-01ca-489f-9a12-5e22d186b1b7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:46:08 crc kubenswrapper[4719]: I1009 15:46:08.711060 4719 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49f3b180-01ca-489f-9a12-5e22d186b1b7-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 15:46:08 crc kubenswrapper[4719]: I1009 15:46:08.711133 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwcgp\" (UniqueName: \"kubernetes.io/projected/49f3b180-01ca-489f-9a12-5e22d186b1b7-kube-api-access-pwcgp\") on node \"crc\" DevicePath \"\"" Oct 09 15:46:08 crc kubenswrapper[4719]: I1009 15:46:08.711146 4719 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49f3b180-01ca-489f-9a12-5e22d186b1b7-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 15:46:09 crc kubenswrapper[4719]: I1009 15:46:09.100780 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-r8nbl" Oct 09 15:46:09 crc kubenswrapper[4719]: I1009 15:46:09.100788 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-r8nbl" event={"ID":"49f3b180-01ca-489f-9a12-5e22d186b1b7","Type":"ContainerDied","Data":"3271336da124477bf6ed333b3afc7340caa7511ae5dac99f88462f66b447d9a8"} Oct 09 15:46:09 crc kubenswrapper[4719]: I1009 15:46:09.101172 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3271336da124477bf6ed333b3afc7340caa7511ae5dac99f88462f66b447d9a8" Oct 09 15:46:09 crc kubenswrapper[4719]: I1009 15:46:09.187742 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstf7"] Oct 09 15:46:09 crc kubenswrapper[4719]: E1009 15:46:09.188244 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ff99c39-7c61-46a2-bb84-05cb745323bf" containerName="collect-profiles" Oct 09 15:46:09 crc kubenswrapper[4719]: I1009 15:46:09.188266 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ff99c39-7c61-46a2-bb84-05cb745323bf" containerName="collect-profiles" Oct 09 15:46:09 crc kubenswrapper[4719]: E1009 15:46:09.188277 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49f3b180-01ca-489f-9a12-5e22d186b1b7" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 09 15:46:09 crc kubenswrapper[4719]: I1009 15:46:09.188287 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="49f3b180-01ca-489f-9a12-5e22d186b1b7" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 09 15:46:09 crc kubenswrapper[4719]: I1009 15:46:09.188635 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ff99c39-7c61-46a2-bb84-05cb745323bf" containerName="collect-profiles" Oct 09 15:46:09 crc kubenswrapper[4719]: I1009 15:46:09.188686 4719 
memory_manager.go:354] "RemoveStaleState removing state" podUID="49f3b180-01ca-489f-9a12-5e22d186b1b7" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 09 15:46:09 crc kubenswrapper[4719]: I1009 15:46:09.189564 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstf7" Oct 09 15:46:09 crc kubenswrapper[4719]: I1009 15:46:09.194680 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 15:46:09 crc kubenswrapper[4719]: I1009 15:46:09.194872 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 15:46:09 crc kubenswrapper[4719]: I1009 15:46:09.194997 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ssvsw" Oct 09 15:46:09 crc kubenswrapper[4719]: I1009 15:46:09.195537 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 15:46:09 crc kubenswrapper[4719]: I1009 15:46:09.214426 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstf7"] Oct 09 15:46:09 crc kubenswrapper[4719]: I1009 15:46:09.320523 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6368a031-4a2d-43bd-a289-fd9966d38182-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fstf7\" (UID: \"6368a031-4a2d-43bd-a289-fd9966d38182\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstf7" Oct 09 15:46:09 crc kubenswrapper[4719]: I1009 15:46:09.320589 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kbnv\" (UniqueName: 
\"kubernetes.io/projected/6368a031-4a2d-43bd-a289-fd9966d38182-kube-api-access-2kbnv\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fstf7\" (UID: \"6368a031-4a2d-43bd-a289-fd9966d38182\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstf7" Oct 09 15:46:09 crc kubenswrapper[4719]: I1009 15:46:09.320796 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6368a031-4a2d-43bd-a289-fd9966d38182-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fstf7\" (UID: \"6368a031-4a2d-43bd-a289-fd9966d38182\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstf7" Oct 09 15:46:09 crc kubenswrapper[4719]: I1009 15:46:09.422091 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6368a031-4a2d-43bd-a289-fd9966d38182-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fstf7\" (UID: \"6368a031-4a2d-43bd-a289-fd9966d38182\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstf7" Oct 09 15:46:09 crc kubenswrapper[4719]: I1009 15:46:09.422191 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6368a031-4a2d-43bd-a289-fd9966d38182-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fstf7\" (UID: \"6368a031-4a2d-43bd-a289-fd9966d38182\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstf7" Oct 09 15:46:09 crc kubenswrapper[4719]: I1009 15:46:09.422247 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kbnv\" (UniqueName: \"kubernetes.io/projected/6368a031-4a2d-43bd-a289-fd9966d38182-kube-api-access-2kbnv\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fstf7\" (UID: \"6368a031-4a2d-43bd-a289-fd9966d38182\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstf7" Oct 09 15:46:09 crc kubenswrapper[4719]: I1009 15:46:09.428050 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6368a031-4a2d-43bd-a289-fd9966d38182-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fstf7\" (UID: \"6368a031-4a2d-43bd-a289-fd9966d38182\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstf7" Oct 09 15:46:09 crc kubenswrapper[4719]: I1009 15:46:09.428094 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6368a031-4a2d-43bd-a289-fd9966d38182-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fstf7\" (UID: \"6368a031-4a2d-43bd-a289-fd9966d38182\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstf7" Oct 09 15:46:09 crc kubenswrapper[4719]: I1009 15:46:09.441511 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kbnv\" (UniqueName: \"kubernetes.io/projected/6368a031-4a2d-43bd-a289-fd9966d38182-kube-api-access-2kbnv\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fstf7\" (UID: \"6368a031-4a2d-43bd-a289-fd9966d38182\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstf7" Oct 09 15:46:09 crc kubenswrapper[4719]: I1009 15:46:09.517314 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstf7" Oct 09 15:46:10 crc kubenswrapper[4719]: I1009 15:46:10.029760 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstf7"] Oct 09 15:46:10 crc kubenswrapper[4719]: I1009 15:46:10.112293 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstf7" event={"ID":"6368a031-4a2d-43bd-a289-fd9966d38182","Type":"ContainerStarted","Data":"79821808460b248e52a8f7a8692b7489ee86c693715a61b77c4a0fb527bb44df"} Oct 09 15:46:11 crc kubenswrapper[4719]: I1009 15:46:11.143470 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstf7" event={"ID":"6368a031-4a2d-43bd-a289-fd9966d38182","Type":"ContainerStarted","Data":"d1c96d970ef9a31828a7c54bd9ee203dcfaa0a472dc89204fde6c824ac8dd9f4"} Oct 09 15:46:11 crc kubenswrapper[4719]: I1009 15:46:11.173841 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstf7" podStartSLOduration=1.611980849 podStartE2EDuration="2.173820127s" podCreationTimestamp="2025-10-09 15:46:09 +0000 UTC" firstStartedPulling="2025-10-09 15:46:10.034186043 +0000 UTC m=+1675.543897328" lastFinishedPulling="2025-10-09 15:46:10.596025321 +0000 UTC m=+1676.105736606" observedRunningTime="2025-10-09 15:46:11.161763243 +0000 UTC m=+1676.671474568" watchObservedRunningTime="2025-10-09 15:46:11.173820127 +0000 UTC m=+1676.683531422" Oct 09 15:46:20 crc kubenswrapper[4719]: I1009 15:46:20.160704 4719 scope.go:117] "RemoveContainer" containerID="55848799feb0f83996cad9faea64b8bd81a5055bee1fd116f8ee1236dc974c4b" Oct 09 15:46:20 crc kubenswrapper[4719]: E1009 15:46:20.161456 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 15:46:22 crc kubenswrapper[4719]: I1009 15:46:22.039147 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-jk6nr"] Oct 09 15:46:22 crc kubenswrapper[4719]: I1009 15:46:22.048696 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-gfc75"] Oct 09 15:46:22 crc kubenswrapper[4719]: I1009 15:46:22.058792 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-gfc75"] Oct 09 15:46:22 crc kubenswrapper[4719]: I1009 15:46:22.068066 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-jk6nr"] Oct 09 15:46:23 crc kubenswrapper[4719]: I1009 15:46:23.172673 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08e04378-245e-4a13-b1de-f11cf96579ef" path="/var/lib/kubelet/pods/08e04378-245e-4a13-b1de-f11cf96579ef/volumes" Oct 09 15:46:23 crc kubenswrapper[4719]: I1009 15:46:23.173514 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2205fae-acbe-4123-936d-ad78cd542565" path="/var/lib/kubelet/pods/a2205fae-acbe-4123-936d-ad78cd542565/volumes" Oct 09 15:46:25 crc kubenswrapper[4719]: I1009 15:46:25.409921 4719 scope.go:117] "RemoveContainer" containerID="d6277e8a831c6da4e6159e08cafb95492961642c0693a9ebcd2a6bee847db126" Oct 09 15:46:25 crc kubenswrapper[4719]: I1009 15:46:25.435444 4719 scope.go:117] "RemoveContainer" containerID="d2dcc9015f72d339f90b0e01e06a39cf15ee638599453598e6e0077f826a49f9" Oct 09 15:46:25 crc kubenswrapper[4719]: I1009 15:46:25.493125 4719 scope.go:117] "RemoveContainer" 
containerID="3f3e45fd5d839263a7a9ccdb3c56692cd7a0ab79628512ad452c31b55488823e" Oct 09 15:46:25 crc kubenswrapper[4719]: I1009 15:46:25.531902 4719 scope.go:117] "RemoveContainer" containerID="65ad4b7954f202b5cb00ecc8ad75a27ee18084f1c55c8114d5015d1a1dff8f24" Oct 09 15:46:25 crc kubenswrapper[4719]: I1009 15:46:25.552566 4719 scope.go:117] "RemoveContainer" containerID="d6a2915bfb2afc60553b83441c2b55c901203693a34c5b734c42433fa98b8859" Oct 09 15:46:25 crc kubenswrapper[4719]: I1009 15:46:25.607870 4719 scope.go:117] "RemoveContainer" containerID="63237fb476ce410d4fe48e51b67339fa6dc864befe7f8f952a555fafc005e3f4" Oct 09 15:46:25 crc kubenswrapper[4719]: I1009 15:46:25.665885 4719 scope.go:117] "RemoveContainer" containerID="9fd313329586b35b942fe6233e6f3512d120cb80c30bd770fb6e47f079d5fa27" Oct 09 15:46:25 crc kubenswrapper[4719]: I1009 15:46:25.704657 4719 scope.go:117] "RemoveContainer" containerID="6ef3888a3c2854fec9e0745b81053353ae3902ae05b20f400eabf7db3560f157" Oct 09 15:46:25 crc kubenswrapper[4719]: I1009 15:46:25.724134 4719 scope.go:117] "RemoveContainer" containerID="eb4a3aadc347171a450ed9511fb7da84e6b26b180fcca0960a7cccab31898403" Oct 09 15:46:25 crc kubenswrapper[4719]: I1009 15:46:25.748246 4719 scope.go:117] "RemoveContainer" containerID="a78ab41081ba1e425676665ce30a819dc993b463bc2c9e718253b4f57da6b9c4" Oct 09 15:46:35 crc kubenswrapper[4719]: I1009 15:46:35.170748 4719 scope.go:117] "RemoveContainer" containerID="55848799feb0f83996cad9faea64b8bd81a5055bee1fd116f8ee1236dc974c4b" Oct 09 15:46:35 crc kubenswrapper[4719]: E1009 15:46:35.171799 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" 
podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 15:46:42 crc kubenswrapper[4719]: I1009 15:46:42.066405 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-ztgbm"] Oct 09 15:46:42 crc kubenswrapper[4719]: I1009 15:46:42.081120 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-ztgbm"] Oct 09 15:46:43 crc kubenswrapper[4719]: I1009 15:46:43.173164 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e899b0de-03a2-44a5-a165-25c988e8489d" path="/var/lib/kubelet/pods/e899b0de-03a2-44a5-a165-25c988e8489d/volumes" Oct 09 15:46:47 crc kubenswrapper[4719]: I1009 15:46:47.162580 4719 scope.go:117] "RemoveContainer" containerID="55848799feb0f83996cad9faea64b8bd81a5055bee1fd116f8ee1236dc974c4b" Oct 09 15:46:47 crc kubenswrapper[4719]: E1009 15:46:47.169174 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 15:46:55 crc kubenswrapper[4719]: I1009 15:46:55.026122 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-wrbqp"] Oct 09 15:46:55 crc kubenswrapper[4719]: I1009 15:46:55.035506 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-wrbqp"] Oct 09 15:46:55 crc kubenswrapper[4719]: I1009 15:46:55.171284 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3b4bbf2-4c3a-41bf-bc92-8d267af7a236" path="/var/lib/kubelet/pods/c3b4bbf2-4c3a-41bf-bc92-8d267af7a236/volumes" Oct 09 15:46:56 crc kubenswrapper[4719]: I1009 15:46:56.058609 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-db-create-rlr7g"] Oct 09 15:46:56 crc kubenswrapper[4719]: I1009 15:46:56.071888 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-xbrpr"] Oct 09 15:46:56 crc kubenswrapper[4719]: I1009 15:46:56.080186 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-rlr7g"] Oct 09 15:46:56 crc kubenswrapper[4719]: I1009 15:46:56.087679 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-xbrpr"] Oct 09 15:46:57 crc kubenswrapper[4719]: I1009 15:46:57.175277 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2352adae-2de1-4980-8391-53cf9c4c14a2" path="/var/lib/kubelet/pods/2352adae-2de1-4980-8391-53cf9c4c14a2/volumes" Oct 09 15:46:57 crc kubenswrapper[4719]: I1009 15:46:57.176162 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e78db87f-acd4-471d-82f8-e854df1b36ea" path="/var/lib/kubelet/pods/e78db87f-acd4-471d-82f8-e854df1b36ea/volumes" Oct 09 15:47:01 crc kubenswrapper[4719]: I1009 15:47:01.028052 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-b7f5-account-create-gknm7"] Oct 09 15:47:01 crc kubenswrapper[4719]: I1009 15:47:01.037855 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-8326-account-create-lvblm"] Oct 09 15:47:01 crc kubenswrapper[4719]: I1009 15:47:01.047016 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-8488-account-create-g9bpn"] Oct 09 15:47:01 crc kubenswrapper[4719]: I1009 15:47:01.055072 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-b7f5-account-create-gknm7"] Oct 09 15:47:01 crc kubenswrapper[4719]: I1009 15:47:01.064661 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-8326-account-create-lvblm"] Oct 09 15:47:01 crc kubenswrapper[4719]: I1009 15:47:01.072932 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell0-8488-account-create-g9bpn"] Oct 09 15:47:01 crc kubenswrapper[4719]: I1009 15:47:01.162002 4719 scope.go:117] "RemoveContainer" containerID="55848799feb0f83996cad9faea64b8bd81a5055bee1fd116f8ee1236dc974c4b" Oct 09 15:47:01 crc kubenswrapper[4719]: E1009 15:47:01.162286 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 15:47:01 crc kubenswrapper[4719]: I1009 15:47:01.172995 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="492c63ff-0d48-43ff-964c-dcfa64728450" path="/var/lib/kubelet/pods/492c63ff-0d48-43ff-964c-dcfa64728450/volumes" Oct 09 15:47:01 crc kubenswrapper[4719]: I1009 15:47:01.173569 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52a46418-61ee-44a3-b1e3-128f043ad33d" path="/var/lib/kubelet/pods/52a46418-61ee-44a3-b1e3-128f043ad33d/volumes" Oct 09 15:47:01 crc kubenswrapper[4719]: I1009 15:47:01.174075 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f57fc9e0-e7f0-43da-aa68-e507aef3750a" path="/var/lib/kubelet/pods/f57fc9e0-e7f0-43da-aa68-e507aef3750a/volumes" Oct 09 15:47:16 crc kubenswrapper[4719]: I1009 15:47:16.161205 4719 scope.go:117] "RemoveContainer" containerID="55848799feb0f83996cad9faea64b8bd81a5055bee1fd116f8ee1236dc974c4b" Oct 09 15:47:16 crc kubenswrapper[4719]: E1009 15:47:16.162293 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 15:47:21 crc kubenswrapper[4719]: I1009 15:47:21.761201 4719 generic.go:334] "Generic (PLEG): container finished" podID="6368a031-4a2d-43bd-a289-fd9966d38182" containerID="d1c96d970ef9a31828a7c54bd9ee203dcfaa0a472dc89204fde6c824ac8dd9f4" exitCode=0 Oct 09 15:47:21 crc kubenswrapper[4719]: I1009 15:47:21.761279 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstf7" event={"ID":"6368a031-4a2d-43bd-a289-fd9966d38182","Type":"ContainerDied","Data":"d1c96d970ef9a31828a7c54bd9ee203dcfaa0a472dc89204fde6c824ac8dd9f4"} Oct 09 15:47:23 crc kubenswrapper[4719]: I1009 15:47:23.178865 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstf7" Oct 09 15:47:23 crc kubenswrapper[4719]: I1009 15:47:23.303367 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kbnv\" (UniqueName: \"kubernetes.io/projected/6368a031-4a2d-43bd-a289-fd9966d38182-kube-api-access-2kbnv\") pod \"6368a031-4a2d-43bd-a289-fd9966d38182\" (UID: \"6368a031-4a2d-43bd-a289-fd9966d38182\") " Oct 09 15:47:23 crc kubenswrapper[4719]: I1009 15:47:23.303694 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6368a031-4a2d-43bd-a289-fd9966d38182-inventory\") pod \"6368a031-4a2d-43bd-a289-fd9966d38182\" (UID: \"6368a031-4a2d-43bd-a289-fd9966d38182\") " Oct 09 15:47:23 crc kubenswrapper[4719]: I1009 15:47:23.303721 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6368a031-4a2d-43bd-a289-fd9966d38182-ssh-key\") pod 
\"6368a031-4a2d-43bd-a289-fd9966d38182\" (UID: \"6368a031-4a2d-43bd-a289-fd9966d38182\") " Oct 09 15:47:23 crc kubenswrapper[4719]: I1009 15:47:23.310595 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6368a031-4a2d-43bd-a289-fd9966d38182-kube-api-access-2kbnv" (OuterVolumeSpecName: "kube-api-access-2kbnv") pod "6368a031-4a2d-43bd-a289-fd9966d38182" (UID: "6368a031-4a2d-43bd-a289-fd9966d38182"). InnerVolumeSpecName "kube-api-access-2kbnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:47:23 crc kubenswrapper[4719]: I1009 15:47:23.336577 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6368a031-4a2d-43bd-a289-fd9966d38182-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6368a031-4a2d-43bd-a289-fd9966d38182" (UID: "6368a031-4a2d-43bd-a289-fd9966d38182"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:47:23 crc kubenswrapper[4719]: I1009 15:47:23.340040 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6368a031-4a2d-43bd-a289-fd9966d38182-inventory" (OuterVolumeSpecName: "inventory") pod "6368a031-4a2d-43bd-a289-fd9966d38182" (UID: "6368a031-4a2d-43bd-a289-fd9966d38182"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:47:23 crc kubenswrapper[4719]: I1009 15:47:23.406889 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kbnv\" (UniqueName: \"kubernetes.io/projected/6368a031-4a2d-43bd-a289-fd9966d38182-kube-api-access-2kbnv\") on node \"crc\" DevicePath \"\"" Oct 09 15:47:23 crc kubenswrapper[4719]: I1009 15:47:23.406941 4719 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6368a031-4a2d-43bd-a289-fd9966d38182-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 15:47:23 crc kubenswrapper[4719]: I1009 15:47:23.406957 4719 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6368a031-4a2d-43bd-a289-fd9966d38182-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 15:47:23 crc kubenswrapper[4719]: I1009 15:47:23.780754 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstf7" event={"ID":"6368a031-4a2d-43bd-a289-fd9966d38182","Type":"ContainerDied","Data":"79821808460b248e52a8f7a8692b7489ee86c693715a61b77c4a0fb527bb44df"} Oct 09 15:47:23 crc kubenswrapper[4719]: I1009 15:47:23.780795 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79821808460b248e52a8f7a8692b7489ee86c693715a61b77c4a0fb527bb44df" Oct 09 15:47:23 crc kubenswrapper[4719]: I1009 15:47:23.780819 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fstf7" Oct 09 15:47:23 crc kubenswrapper[4719]: I1009 15:47:23.866190 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbmpb"] Oct 09 15:47:23 crc kubenswrapper[4719]: E1009 15:47:23.866612 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6368a031-4a2d-43bd-a289-fd9966d38182" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 09 15:47:23 crc kubenswrapper[4719]: I1009 15:47:23.866631 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="6368a031-4a2d-43bd-a289-fd9966d38182" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 09 15:47:23 crc kubenswrapper[4719]: I1009 15:47:23.866845 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="6368a031-4a2d-43bd-a289-fd9966d38182" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 09 15:47:23 crc kubenswrapper[4719]: I1009 15:47:23.867656 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbmpb" Oct 09 15:47:23 crc kubenswrapper[4719]: I1009 15:47:23.870994 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 15:47:23 crc kubenswrapper[4719]: I1009 15:47:23.871082 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 15:47:23 crc kubenswrapper[4719]: I1009 15:47:23.871187 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 15:47:23 crc kubenswrapper[4719]: I1009 15:47:23.871510 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ssvsw" Oct 09 15:47:23 crc kubenswrapper[4719]: I1009 15:47:23.876443 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbmpb"] Oct 09 15:47:24 crc kubenswrapper[4719]: I1009 15:47:24.017793 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/506813a5-78ae-4083-8d8f-27f6a46858c8-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mbmpb\" (UID: \"506813a5-78ae-4083-8d8f-27f6a46858c8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbmpb" Oct 09 15:47:24 crc kubenswrapper[4719]: I1009 15:47:24.018144 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f826k\" (UniqueName: \"kubernetes.io/projected/506813a5-78ae-4083-8d8f-27f6a46858c8-kube-api-access-f826k\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mbmpb\" (UID: \"506813a5-78ae-4083-8d8f-27f6a46858c8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbmpb" Oct 09 15:47:24 crc kubenswrapper[4719]: I1009 
15:47:24.018292 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/506813a5-78ae-4083-8d8f-27f6a46858c8-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mbmpb\" (UID: \"506813a5-78ae-4083-8d8f-27f6a46858c8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbmpb" Oct 09 15:47:24 crc kubenswrapper[4719]: I1009 15:47:24.119626 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/506813a5-78ae-4083-8d8f-27f6a46858c8-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mbmpb\" (UID: \"506813a5-78ae-4083-8d8f-27f6a46858c8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbmpb" Oct 09 15:47:24 crc kubenswrapper[4719]: I1009 15:47:24.119748 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/506813a5-78ae-4083-8d8f-27f6a46858c8-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mbmpb\" (UID: \"506813a5-78ae-4083-8d8f-27f6a46858c8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbmpb" Oct 09 15:47:24 crc kubenswrapper[4719]: I1009 15:47:24.119789 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f826k\" (UniqueName: \"kubernetes.io/projected/506813a5-78ae-4083-8d8f-27f6a46858c8-kube-api-access-f826k\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mbmpb\" (UID: \"506813a5-78ae-4083-8d8f-27f6a46858c8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbmpb" Oct 09 15:47:24 crc kubenswrapper[4719]: I1009 15:47:24.125125 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/506813a5-78ae-4083-8d8f-27f6a46858c8-ssh-key\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-mbmpb\" (UID: \"506813a5-78ae-4083-8d8f-27f6a46858c8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbmpb" Oct 09 15:47:24 crc kubenswrapper[4719]: I1009 15:47:24.125148 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/506813a5-78ae-4083-8d8f-27f6a46858c8-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mbmpb\" (UID: \"506813a5-78ae-4083-8d8f-27f6a46858c8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbmpb" Oct 09 15:47:24 crc kubenswrapper[4719]: I1009 15:47:24.135282 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f826k\" (UniqueName: \"kubernetes.io/projected/506813a5-78ae-4083-8d8f-27f6a46858c8-kube-api-access-f826k\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mbmpb\" (UID: \"506813a5-78ae-4083-8d8f-27f6a46858c8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbmpb" Oct 09 15:47:24 crc kubenswrapper[4719]: I1009 15:47:24.188086 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbmpb" Oct 09 15:47:24 crc kubenswrapper[4719]: I1009 15:47:24.733704 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbmpb"] Oct 09 15:47:24 crc kubenswrapper[4719]: I1009 15:47:24.788944 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbmpb" event={"ID":"506813a5-78ae-4083-8d8f-27f6a46858c8","Type":"ContainerStarted","Data":"8afa0be61b668075dd8b77ade3a5156bc5c8bdab3d67628659e0eba948421492"} Oct 09 15:47:25 crc kubenswrapper[4719]: I1009 15:47:25.799157 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbmpb" event={"ID":"506813a5-78ae-4083-8d8f-27f6a46858c8","Type":"ContainerStarted","Data":"a4babbd5a233578a976776d8bb58e5e5251e2807400b7e01ea8a61c41402dffd"} Oct 09 15:47:25 crc kubenswrapper[4719]: I1009 15:47:25.820616 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbmpb" podStartSLOduration=2.19271102 podStartE2EDuration="2.820596356s" podCreationTimestamp="2025-10-09 15:47:23 +0000 UTC" firstStartedPulling="2025-10-09 15:47:24.739600958 +0000 UTC m=+1750.249312243" lastFinishedPulling="2025-10-09 15:47:25.367486294 +0000 UTC m=+1750.877197579" observedRunningTime="2025-10-09 15:47:25.819650916 +0000 UTC m=+1751.329362211" watchObservedRunningTime="2025-10-09 15:47:25.820596356 +0000 UTC m=+1751.330307641" Oct 09 15:47:25 crc kubenswrapper[4719]: I1009 15:47:25.941879 4719 scope.go:117] "RemoveContainer" containerID="77244bc6605fd24aa78812800ddb48aa5ee1251fc43de8a1b1bca2ba99147999" Oct 09 15:47:25 crc kubenswrapper[4719]: I1009 15:47:25.964010 4719 scope.go:117] "RemoveContainer" containerID="f2da0bb2674d9b973871e6dfdeb35437e6c8bdc74878e04a986d14a0da562bb9" 
Oct 09 15:47:26 crc kubenswrapper[4719]: I1009 15:47:26.019159 4719 scope.go:117] "RemoveContainer" containerID="d6e48031f32a43b042e0809dc21ad822e1e42e18482eb44ff54f17a57b726d49" Oct 09 15:47:26 crc kubenswrapper[4719]: I1009 15:47:26.077969 4719 scope.go:117] "RemoveContainer" containerID="2d4b60d2896ce3b31512fd54dee1348bfbea3cfa46ee0906f6ed3c117e22642b" Oct 09 15:47:26 crc kubenswrapper[4719]: I1009 15:47:26.118835 4719 scope.go:117] "RemoveContainer" containerID="aa349a351ba258207f2c6d303e05f5727dac541b5a9c2d7f825f913a1963c5ea" Oct 09 15:47:26 crc kubenswrapper[4719]: I1009 15:47:26.152898 4719 scope.go:117] "RemoveContainer" containerID="59c34ae01bc6bb0027e3d13a01e3f5fe42d9237382758c7f1e8cfb2ef7371a2e" Oct 09 15:47:26 crc kubenswrapper[4719]: I1009 15:47:26.174373 4719 scope.go:117] "RemoveContainer" containerID="d2a7b4859cbaf068dc0aa79909bca5a2ecb32678b5d80d753c30064df2cd59cb" Oct 09 15:47:30 crc kubenswrapper[4719]: I1009 15:47:30.854404 4719 generic.go:334] "Generic (PLEG): container finished" podID="506813a5-78ae-4083-8d8f-27f6a46858c8" containerID="a4babbd5a233578a976776d8bb58e5e5251e2807400b7e01ea8a61c41402dffd" exitCode=0 Oct 09 15:47:30 crc kubenswrapper[4719]: I1009 15:47:30.854485 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbmpb" event={"ID":"506813a5-78ae-4083-8d8f-27f6a46858c8","Type":"ContainerDied","Data":"a4babbd5a233578a976776d8bb58e5e5251e2807400b7e01ea8a61c41402dffd"} Oct 09 15:47:31 crc kubenswrapper[4719]: I1009 15:47:31.035477 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-r57qh"] Oct 09 15:47:31 crc kubenswrapper[4719]: I1009 15:47:31.042666 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-r57qh"] Oct 09 15:47:31 crc kubenswrapper[4719]: I1009 15:47:31.161917 4719 scope.go:117] "RemoveContainer" 
containerID="55848799feb0f83996cad9faea64b8bd81a5055bee1fd116f8ee1236dc974c4b" Oct 09 15:47:31 crc kubenswrapper[4719]: E1009 15:47:31.162242 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 15:47:31 crc kubenswrapper[4719]: I1009 15:47:31.172964 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de61779c-4ad9-40bd-908e-27b82b5c82cb" path="/var/lib/kubelet/pods/de61779c-4ad9-40bd-908e-27b82b5c82cb/volumes" Oct 09 15:47:32 crc kubenswrapper[4719]: I1009 15:47:32.348207 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbmpb" Oct 09 15:47:32 crc kubenswrapper[4719]: I1009 15:47:32.493529 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/506813a5-78ae-4083-8d8f-27f6a46858c8-inventory\") pod \"506813a5-78ae-4083-8d8f-27f6a46858c8\" (UID: \"506813a5-78ae-4083-8d8f-27f6a46858c8\") " Oct 09 15:47:32 crc kubenswrapper[4719]: I1009 15:47:32.493852 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/506813a5-78ae-4083-8d8f-27f6a46858c8-ssh-key\") pod \"506813a5-78ae-4083-8d8f-27f6a46858c8\" (UID: \"506813a5-78ae-4083-8d8f-27f6a46858c8\") " Oct 09 15:47:32 crc kubenswrapper[4719]: I1009 15:47:32.493930 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f826k\" (UniqueName: \"kubernetes.io/projected/506813a5-78ae-4083-8d8f-27f6a46858c8-kube-api-access-f826k\") pod 
\"506813a5-78ae-4083-8d8f-27f6a46858c8\" (UID: \"506813a5-78ae-4083-8d8f-27f6a46858c8\") " Oct 09 15:47:32 crc kubenswrapper[4719]: I1009 15:47:32.500058 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/506813a5-78ae-4083-8d8f-27f6a46858c8-kube-api-access-f826k" (OuterVolumeSpecName: "kube-api-access-f826k") pod "506813a5-78ae-4083-8d8f-27f6a46858c8" (UID: "506813a5-78ae-4083-8d8f-27f6a46858c8"). InnerVolumeSpecName "kube-api-access-f826k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:47:32 crc kubenswrapper[4719]: I1009 15:47:32.522072 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/506813a5-78ae-4083-8d8f-27f6a46858c8-inventory" (OuterVolumeSpecName: "inventory") pod "506813a5-78ae-4083-8d8f-27f6a46858c8" (UID: "506813a5-78ae-4083-8d8f-27f6a46858c8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:47:32 crc kubenswrapper[4719]: I1009 15:47:32.523835 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/506813a5-78ae-4083-8d8f-27f6a46858c8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "506813a5-78ae-4083-8d8f-27f6a46858c8" (UID: "506813a5-78ae-4083-8d8f-27f6a46858c8"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:47:32 crc kubenswrapper[4719]: I1009 15:47:32.595875 4719 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/506813a5-78ae-4083-8d8f-27f6a46858c8-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 15:47:32 crc kubenswrapper[4719]: I1009 15:47:32.596138 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f826k\" (UniqueName: \"kubernetes.io/projected/506813a5-78ae-4083-8d8f-27f6a46858c8-kube-api-access-f826k\") on node \"crc\" DevicePath \"\"" Oct 09 15:47:32 crc kubenswrapper[4719]: I1009 15:47:32.596241 4719 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/506813a5-78ae-4083-8d8f-27f6a46858c8-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 15:47:32 crc kubenswrapper[4719]: I1009 15:47:32.875836 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbmpb" event={"ID":"506813a5-78ae-4083-8d8f-27f6a46858c8","Type":"ContainerDied","Data":"8afa0be61b668075dd8b77ade3a5156bc5c8bdab3d67628659e0eba948421492"} Oct 09 15:47:32 crc kubenswrapper[4719]: I1009 15:47:32.875876 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8afa0be61b668075dd8b77ade3a5156bc5c8bdab3d67628659e0eba948421492" Oct 09 15:47:32 crc kubenswrapper[4719]: I1009 15:47:32.875926 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbmpb" Oct 09 15:47:32 crc kubenswrapper[4719]: I1009 15:47:32.959216 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-fm7mp"] Oct 09 15:47:32 crc kubenswrapper[4719]: E1009 15:47:32.959752 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="506813a5-78ae-4083-8d8f-27f6a46858c8" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 09 15:47:32 crc kubenswrapper[4719]: I1009 15:47:32.959784 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="506813a5-78ae-4083-8d8f-27f6a46858c8" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 09 15:47:32 crc kubenswrapper[4719]: I1009 15:47:32.960027 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="506813a5-78ae-4083-8d8f-27f6a46858c8" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 09 15:47:32 crc kubenswrapper[4719]: I1009 15:47:32.960957 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fm7mp" Oct 09 15:47:32 crc kubenswrapper[4719]: I1009 15:47:32.964285 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 15:47:32 crc kubenswrapper[4719]: I1009 15:47:32.964687 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 15:47:32 crc kubenswrapper[4719]: I1009 15:47:32.965576 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 15:47:32 crc kubenswrapper[4719]: I1009 15:47:32.965755 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ssvsw" Oct 09 15:47:32 crc kubenswrapper[4719]: I1009 15:47:32.973375 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-fm7mp"] Oct 09 15:47:33 crc kubenswrapper[4719]: I1009 15:47:33.110419 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f37f188-1b48-4b10-a085-e6a44d7e16d5-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fm7mp\" (UID: \"1f37f188-1b48-4b10-a085-e6a44d7e16d5\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fm7mp" Oct 09 15:47:33 crc kubenswrapper[4719]: I1009 15:47:33.110526 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-568tk\" (UniqueName: \"kubernetes.io/projected/1f37f188-1b48-4b10-a085-e6a44d7e16d5-kube-api-access-568tk\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fm7mp\" (UID: \"1f37f188-1b48-4b10-a085-e6a44d7e16d5\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fm7mp" Oct 09 15:47:33 crc kubenswrapper[4719]: I1009 15:47:33.110875 4719 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1f37f188-1b48-4b10-a085-e6a44d7e16d5-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fm7mp\" (UID: \"1f37f188-1b48-4b10-a085-e6a44d7e16d5\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fm7mp" Oct 09 15:47:33 crc kubenswrapper[4719]: I1009 15:47:33.213213 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f37f188-1b48-4b10-a085-e6a44d7e16d5-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fm7mp\" (UID: \"1f37f188-1b48-4b10-a085-e6a44d7e16d5\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fm7mp" Oct 09 15:47:33 crc kubenswrapper[4719]: I1009 15:47:33.213276 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-568tk\" (UniqueName: \"kubernetes.io/projected/1f37f188-1b48-4b10-a085-e6a44d7e16d5-kube-api-access-568tk\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fm7mp\" (UID: \"1f37f188-1b48-4b10-a085-e6a44d7e16d5\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fm7mp" Oct 09 15:47:33 crc kubenswrapper[4719]: I1009 15:47:33.213408 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1f37f188-1b48-4b10-a085-e6a44d7e16d5-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fm7mp\" (UID: \"1f37f188-1b48-4b10-a085-e6a44d7e16d5\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fm7mp" Oct 09 15:47:33 crc kubenswrapper[4719]: I1009 15:47:33.220270 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f37f188-1b48-4b10-a085-e6a44d7e16d5-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fm7mp\" (UID: 
\"1f37f188-1b48-4b10-a085-e6a44d7e16d5\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fm7mp" Oct 09 15:47:33 crc kubenswrapper[4719]: I1009 15:47:33.220286 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1f37f188-1b48-4b10-a085-e6a44d7e16d5-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fm7mp\" (UID: \"1f37f188-1b48-4b10-a085-e6a44d7e16d5\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fm7mp" Oct 09 15:47:33 crc kubenswrapper[4719]: I1009 15:47:33.230285 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-568tk\" (UniqueName: \"kubernetes.io/projected/1f37f188-1b48-4b10-a085-e6a44d7e16d5-kube-api-access-568tk\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fm7mp\" (UID: \"1f37f188-1b48-4b10-a085-e6a44d7e16d5\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fm7mp" Oct 09 15:47:33 crc kubenswrapper[4719]: I1009 15:47:33.284092 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fm7mp" Oct 09 15:47:33 crc kubenswrapper[4719]: W1009 15:47:33.785714 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f37f188_1b48_4b10_a085_e6a44d7e16d5.slice/crio-9aae60bcd123d7a2e3038337505fe7394ab69920d3e645b75ab9cf4ba2d267da WatchSource:0}: Error finding container 9aae60bcd123d7a2e3038337505fe7394ab69920d3e645b75ab9cf4ba2d267da: Status 404 returned error can't find the container with id 9aae60bcd123d7a2e3038337505fe7394ab69920d3e645b75ab9cf4ba2d267da Oct 09 15:47:33 crc kubenswrapper[4719]: I1009 15:47:33.785720 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-fm7mp"] Oct 09 15:47:33 crc kubenswrapper[4719]: I1009 15:47:33.888141 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fm7mp" event={"ID":"1f37f188-1b48-4b10-a085-e6a44d7e16d5","Type":"ContainerStarted","Data":"9aae60bcd123d7a2e3038337505fe7394ab69920d3e645b75ab9cf4ba2d267da"} Oct 09 15:47:34 crc kubenswrapper[4719]: I1009 15:47:34.912715 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fm7mp" event={"ID":"1f37f188-1b48-4b10-a085-e6a44d7e16d5","Type":"ContainerStarted","Data":"aba6024feb294c5df5781c8eae44625fb265eff6493cec4f25fd60936b67ab66"} Oct 09 15:47:34 crc kubenswrapper[4719]: I1009 15:47:34.947120 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fm7mp" podStartSLOduration=2.51289503 podStartE2EDuration="2.94710159s" podCreationTimestamp="2025-10-09 15:47:32 +0000 UTC" firstStartedPulling="2025-10-09 15:47:33.788566486 +0000 UTC m=+1759.298277771" lastFinishedPulling="2025-10-09 15:47:34.222773056 +0000 UTC m=+1759.732484331" 
observedRunningTime="2025-10-09 15:47:34.941570514 +0000 UTC m=+1760.451281819" watchObservedRunningTime="2025-10-09 15:47:34.94710159 +0000 UTC m=+1760.456812875" Oct 09 15:47:45 crc kubenswrapper[4719]: I1009 15:47:45.168989 4719 scope.go:117] "RemoveContainer" containerID="55848799feb0f83996cad9faea64b8bd81a5055bee1fd116f8ee1236dc974c4b" Oct 09 15:47:45 crc kubenswrapper[4719]: E1009 15:47:45.171202 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 15:47:57 crc kubenswrapper[4719]: I1009 15:47:57.161624 4719 scope.go:117] "RemoveContainer" containerID="55848799feb0f83996cad9faea64b8bd81a5055bee1fd116f8ee1236dc974c4b" Oct 09 15:47:57 crc kubenswrapper[4719]: E1009 15:47:57.162419 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 15:48:12 crc kubenswrapper[4719]: I1009 15:48:12.161066 4719 scope.go:117] "RemoveContainer" containerID="55848799feb0f83996cad9faea64b8bd81a5055bee1fd116f8ee1236dc974c4b" Oct 09 15:48:12 crc kubenswrapper[4719]: E1009 15:48:12.161868 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 15:48:12 crc kubenswrapper[4719]: I1009 15:48:12.254681 4719 generic.go:334] "Generic (PLEG): container finished" podID="1f37f188-1b48-4b10-a085-e6a44d7e16d5" containerID="aba6024feb294c5df5781c8eae44625fb265eff6493cec4f25fd60936b67ab66" exitCode=0 Oct 09 15:48:12 crc kubenswrapper[4719]: I1009 15:48:12.254734 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fm7mp" event={"ID":"1f37f188-1b48-4b10-a085-e6a44d7e16d5","Type":"ContainerDied","Data":"aba6024feb294c5df5781c8eae44625fb265eff6493cec4f25fd60936b67ab66"} Oct 09 15:48:13 crc kubenswrapper[4719]: I1009 15:48:13.697716 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fm7mp" Oct 09 15:48:13 crc kubenswrapper[4719]: I1009 15:48:13.871411 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f37f188-1b48-4b10-a085-e6a44d7e16d5-inventory\") pod \"1f37f188-1b48-4b10-a085-e6a44d7e16d5\" (UID: \"1f37f188-1b48-4b10-a085-e6a44d7e16d5\") " Oct 09 15:48:13 crc kubenswrapper[4719]: I1009 15:48:13.871895 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-568tk\" (UniqueName: \"kubernetes.io/projected/1f37f188-1b48-4b10-a085-e6a44d7e16d5-kube-api-access-568tk\") pod \"1f37f188-1b48-4b10-a085-e6a44d7e16d5\" (UID: \"1f37f188-1b48-4b10-a085-e6a44d7e16d5\") " Oct 09 15:48:13 crc kubenswrapper[4719]: I1009 15:48:13.871960 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1f37f188-1b48-4b10-a085-e6a44d7e16d5-ssh-key\") pod 
\"1f37f188-1b48-4b10-a085-e6a44d7e16d5\" (UID: \"1f37f188-1b48-4b10-a085-e6a44d7e16d5\") " Oct 09 15:48:13 crc kubenswrapper[4719]: I1009 15:48:13.877181 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f37f188-1b48-4b10-a085-e6a44d7e16d5-kube-api-access-568tk" (OuterVolumeSpecName: "kube-api-access-568tk") pod "1f37f188-1b48-4b10-a085-e6a44d7e16d5" (UID: "1f37f188-1b48-4b10-a085-e6a44d7e16d5"). InnerVolumeSpecName "kube-api-access-568tk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:48:13 crc kubenswrapper[4719]: I1009 15:48:13.901280 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f37f188-1b48-4b10-a085-e6a44d7e16d5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1f37f188-1b48-4b10-a085-e6a44d7e16d5" (UID: "1f37f188-1b48-4b10-a085-e6a44d7e16d5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:48:13 crc kubenswrapper[4719]: I1009 15:48:13.903693 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f37f188-1b48-4b10-a085-e6a44d7e16d5-inventory" (OuterVolumeSpecName: "inventory") pod "1f37f188-1b48-4b10-a085-e6a44d7e16d5" (UID: "1f37f188-1b48-4b10-a085-e6a44d7e16d5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:48:13 crc kubenswrapper[4719]: I1009 15:48:13.975377 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-568tk\" (UniqueName: \"kubernetes.io/projected/1f37f188-1b48-4b10-a085-e6a44d7e16d5-kube-api-access-568tk\") on node \"crc\" DevicePath \"\"" Oct 09 15:48:13 crc kubenswrapper[4719]: I1009 15:48:13.975542 4719 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1f37f188-1b48-4b10-a085-e6a44d7e16d5-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 15:48:13 crc kubenswrapper[4719]: I1009 15:48:13.975669 4719 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f37f188-1b48-4b10-a085-e6a44d7e16d5-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 15:48:14 crc kubenswrapper[4719]: I1009 15:48:14.275201 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fm7mp" event={"ID":"1f37f188-1b48-4b10-a085-e6a44d7e16d5","Type":"ContainerDied","Data":"9aae60bcd123d7a2e3038337505fe7394ab69920d3e645b75ab9cf4ba2d267da"} Oct 09 15:48:14 crc kubenswrapper[4719]: I1009 15:48:14.275474 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9aae60bcd123d7a2e3038337505fe7394ab69920d3e645b75ab9cf4ba2d267da" Oct 09 15:48:14 crc kubenswrapper[4719]: I1009 15:48:14.275536 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fm7mp" Oct 09 15:48:14 crc kubenswrapper[4719]: I1009 15:48:14.406428 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-blvrz"] Oct 09 15:48:14 crc kubenswrapper[4719]: E1009 15:48:14.406902 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f37f188-1b48-4b10-a085-e6a44d7e16d5" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 09 15:48:14 crc kubenswrapper[4719]: I1009 15:48:14.406923 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f37f188-1b48-4b10-a085-e6a44d7e16d5" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 09 15:48:14 crc kubenswrapper[4719]: I1009 15:48:14.407169 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f37f188-1b48-4b10-a085-e6a44d7e16d5" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 09 15:48:14 crc kubenswrapper[4719]: I1009 15:48:14.407873 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-blvrz" Oct 09 15:48:14 crc kubenswrapper[4719]: I1009 15:48:14.410802 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 15:48:14 crc kubenswrapper[4719]: I1009 15:48:14.411045 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 15:48:14 crc kubenswrapper[4719]: I1009 15:48:14.411754 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 15:48:14 crc kubenswrapper[4719]: I1009 15:48:14.416202 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ssvsw" Oct 09 15:48:14 crc kubenswrapper[4719]: I1009 15:48:14.422994 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-blvrz"] Oct 09 15:48:14 crc kubenswrapper[4719]: I1009 15:48:14.588943 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48bbc\" (UniqueName: \"kubernetes.io/projected/c2ecd37c-0c41-4b8f-8072-c690aa729218-kube-api-access-48bbc\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-blvrz\" (UID: \"c2ecd37c-0c41-4b8f-8072-c690aa729218\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-blvrz" Oct 09 15:48:14 crc kubenswrapper[4719]: I1009 15:48:14.589064 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c2ecd37c-0c41-4b8f-8072-c690aa729218-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-blvrz\" (UID: \"c2ecd37c-0c41-4b8f-8072-c690aa729218\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-blvrz" Oct 09 15:48:14 crc kubenswrapper[4719]: I1009 15:48:14.589109 4719 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c2ecd37c-0c41-4b8f-8072-c690aa729218-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-blvrz\" (UID: \"c2ecd37c-0c41-4b8f-8072-c690aa729218\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-blvrz" Oct 09 15:48:14 crc kubenswrapper[4719]: I1009 15:48:14.690846 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c2ecd37c-0c41-4b8f-8072-c690aa729218-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-blvrz\" (UID: \"c2ecd37c-0c41-4b8f-8072-c690aa729218\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-blvrz" Oct 09 15:48:14 crc kubenswrapper[4719]: I1009 15:48:14.690924 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c2ecd37c-0c41-4b8f-8072-c690aa729218-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-blvrz\" (UID: \"c2ecd37c-0c41-4b8f-8072-c690aa729218\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-blvrz" Oct 09 15:48:14 crc kubenswrapper[4719]: I1009 15:48:14.691201 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48bbc\" (UniqueName: \"kubernetes.io/projected/c2ecd37c-0c41-4b8f-8072-c690aa729218-kube-api-access-48bbc\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-blvrz\" (UID: \"c2ecd37c-0c41-4b8f-8072-c690aa729218\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-blvrz" Oct 09 15:48:14 crc kubenswrapper[4719]: I1009 15:48:14.696441 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c2ecd37c-0c41-4b8f-8072-c690aa729218-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-blvrz\" (UID: 
\"c2ecd37c-0c41-4b8f-8072-c690aa729218\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-blvrz" Oct 09 15:48:14 crc kubenswrapper[4719]: I1009 15:48:14.703975 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c2ecd37c-0c41-4b8f-8072-c690aa729218-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-blvrz\" (UID: \"c2ecd37c-0c41-4b8f-8072-c690aa729218\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-blvrz" Oct 09 15:48:14 crc kubenswrapper[4719]: I1009 15:48:14.709619 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48bbc\" (UniqueName: \"kubernetes.io/projected/c2ecd37c-0c41-4b8f-8072-c690aa729218-kube-api-access-48bbc\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-blvrz\" (UID: \"c2ecd37c-0c41-4b8f-8072-c690aa729218\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-blvrz" Oct 09 15:48:14 crc kubenswrapper[4719]: I1009 15:48:14.728962 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-blvrz" Oct 09 15:48:15 crc kubenswrapper[4719]: I1009 15:48:15.253949 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-blvrz"] Oct 09 15:48:15 crc kubenswrapper[4719]: I1009 15:48:15.284104 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-blvrz" event={"ID":"c2ecd37c-0c41-4b8f-8072-c690aa729218","Type":"ContainerStarted","Data":"20ec5cc1325bc615f8c3567e16ae8e729586f0e0007dbb818db8b6f2e262413d"} Oct 09 15:48:15 crc kubenswrapper[4719]: I1009 15:48:15.718561 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 15:48:16 crc kubenswrapper[4719]: I1009 15:48:16.295969 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-blvrz" event={"ID":"c2ecd37c-0c41-4b8f-8072-c690aa729218","Type":"ContainerStarted","Data":"aa85e44a0ccb6b4139e47b27865a4546f500363d4e31329a7c01fe93906d59da"} Oct 09 15:48:23 crc kubenswrapper[4719]: I1009 15:48:23.161230 4719 scope.go:117] "RemoveContainer" containerID="55848799feb0f83996cad9faea64b8bd81a5055bee1fd116f8ee1236dc974c4b" Oct 09 15:48:23 crc kubenswrapper[4719]: E1009 15:48:23.162339 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 15:48:26 crc kubenswrapper[4719]: I1009 15:48:26.323007 4719 scope.go:117] "RemoveContainer" containerID="690296cf308dd5fe94519d81b80bdbbd8feb6d565660c122d20de7a8f1fba837" Oct 09 
15:48:33 crc kubenswrapper[4719]: I1009 15:48:33.054368 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-blvrz" podStartSLOduration=18.599371722 podStartE2EDuration="19.054333232s" podCreationTimestamp="2025-10-09 15:48:14 +0000 UTC" firstStartedPulling="2025-10-09 15:48:15.260129566 +0000 UTC m=+1800.769840851" lastFinishedPulling="2025-10-09 15:48:15.715091076 +0000 UTC m=+1801.224802361" observedRunningTime="2025-10-09 15:48:16.315421788 +0000 UTC m=+1801.825133093" watchObservedRunningTime="2025-10-09 15:48:33.054333232 +0000 UTC m=+1818.564044517" Oct 09 15:48:33 crc kubenswrapper[4719]: I1009 15:48:33.062274 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-9pqdd"] Oct 09 15:48:33 crc kubenswrapper[4719]: I1009 15:48:33.070124 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-9pqdd"] Oct 09 15:48:33 crc kubenswrapper[4719]: I1009 15:48:33.179714 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04e99f7a-bb5e-41c0-a55a-02b671a69ad8" path="/var/lib/kubelet/pods/04e99f7a-bb5e-41c0-a55a-02b671a69ad8/volumes" Oct 09 15:48:34 crc kubenswrapper[4719]: I1009 15:48:34.033476 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bfqtw"] Oct 09 15:48:34 crc kubenswrapper[4719]: I1009 15:48:34.056500 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bfqtw"] Oct 09 15:48:34 crc kubenswrapper[4719]: I1009 15:48:34.160993 4719 scope.go:117] "RemoveContainer" containerID="55848799feb0f83996cad9faea64b8bd81a5055bee1fd116f8ee1236dc974c4b" Oct 09 15:48:34 crc kubenswrapper[4719]: E1009 15:48:34.161230 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 15:48:35 crc kubenswrapper[4719]: I1009 15:48:35.173202 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e7ae899-fa79-4024-a512-6a7648d7fd6a" path="/var/lib/kubelet/pods/2e7ae899-fa79-4024-a512-6a7648d7fd6a/volumes" Oct 09 15:48:48 crc kubenswrapper[4719]: I1009 15:48:48.162585 4719 scope.go:117] "RemoveContainer" containerID="55848799feb0f83996cad9faea64b8bd81a5055bee1fd116f8ee1236dc974c4b" Oct 09 15:48:48 crc kubenswrapper[4719]: E1009 15:48:48.163802 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 15:49:02 crc kubenswrapper[4719]: I1009 15:49:02.161994 4719 scope.go:117] "RemoveContainer" containerID="55848799feb0f83996cad9faea64b8bd81a5055bee1fd116f8ee1236dc974c4b" Oct 09 15:49:02 crc kubenswrapper[4719]: E1009 15:49:02.163079 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 15:49:10 crc kubenswrapper[4719]: I1009 15:49:10.776489 4719 generic.go:334] "Generic (PLEG): container finished" 
podID="c2ecd37c-0c41-4b8f-8072-c690aa729218" containerID="aa85e44a0ccb6b4139e47b27865a4546f500363d4e31329a7c01fe93906d59da" exitCode=2 Oct 09 15:49:10 crc kubenswrapper[4719]: I1009 15:49:10.776576 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-blvrz" event={"ID":"c2ecd37c-0c41-4b8f-8072-c690aa729218","Type":"ContainerDied","Data":"aa85e44a0ccb6b4139e47b27865a4546f500363d4e31329a7c01fe93906d59da"} Oct 09 15:49:12 crc kubenswrapper[4719]: I1009 15:49:12.208116 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-blvrz" Oct 09 15:49:12 crc kubenswrapper[4719]: I1009 15:49:12.344883 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48bbc\" (UniqueName: \"kubernetes.io/projected/c2ecd37c-0c41-4b8f-8072-c690aa729218-kube-api-access-48bbc\") pod \"c2ecd37c-0c41-4b8f-8072-c690aa729218\" (UID: \"c2ecd37c-0c41-4b8f-8072-c690aa729218\") " Oct 09 15:49:12 crc kubenswrapper[4719]: I1009 15:49:12.344962 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c2ecd37c-0c41-4b8f-8072-c690aa729218-ssh-key\") pod \"c2ecd37c-0c41-4b8f-8072-c690aa729218\" (UID: \"c2ecd37c-0c41-4b8f-8072-c690aa729218\") " Oct 09 15:49:12 crc kubenswrapper[4719]: I1009 15:49:12.344995 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c2ecd37c-0c41-4b8f-8072-c690aa729218-inventory\") pod \"c2ecd37c-0c41-4b8f-8072-c690aa729218\" (UID: \"c2ecd37c-0c41-4b8f-8072-c690aa729218\") " Oct 09 15:49:12 crc kubenswrapper[4719]: I1009 15:49:12.352641 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2ecd37c-0c41-4b8f-8072-c690aa729218-kube-api-access-48bbc" (OuterVolumeSpecName: 
"kube-api-access-48bbc") pod "c2ecd37c-0c41-4b8f-8072-c690aa729218" (UID: "c2ecd37c-0c41-4b8f-8072-c690aa729218"). InnerVolumeSpecName "kube-api-access-48bbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:49:12 crc kubenswrapper[4719]: I1009 15:49:12.375408 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2ecd37c-0c41-4b8f-8072-c690aa729218-inventory" (OuterVolumeSpecName: "inventory") pod "c2ecd37c-0c41-4b8f-8072-c690aa729218" (UID: "c2ecd37c-0c41-4b8f-8072-c690aa729218"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:49:12 crc kubenswrapper[4719]: I1009 15:49:12.378640 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2ecd37c-0c41-4b8f-8072-c690aa729218-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c2ecd37c-0c41-4b8f-8072-c690aa729218" (UID: "c2ecd37c-0c41-4b8f-8072-c690aa729218"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:49:12 crc kubenswrapper[4719]: I1009 15:49:12.448311 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48bbc\" (UniqueName: \"kubernetes.io/projected/c2ecd37c-0c41-4b8f-8072-c690aa729218-kube-api-access-48bbc\") on node \"crc\" DevicePath \"\"" Oct 09 15:49:12 crc kubenswrapper[4719]: I1009 15:49:12.448341 4719 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c2ecd37c-0c41-4b8f-8072-c690aa729218-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 15:49:12 crc kubenswrapper[4719]: I1009 15:49:12.448370 4719 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c2ecd37c-0c41-4b8f-8072-c690aa729218-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 15:49:12 crc kubenswrapper[4719]: I1009 15:49:12.817535 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-blvrz" event={"ID":"c2ecd37c-0c41-4b8f-8072-c690aa729218","Type":"ContainerDied","Data":"20ec5cc1325bc615f8c3567e16ae8e729586f0e0007dbb818db8b6f2e262413d"} Oct 09 15:49:12 crc kubenswrapper[4719]: I1009 15:49:12.817581 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20ec5cc1325bc615f8c3567e16ae8e729586f0e0007dbb818db8b6f2e262413d" Oct 09 15:49:12 crc kubenswrapper[4719]: I1009 15:49:12.817590 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-blvrz" Oct 09 15:49:14 crc kubenswrapper[4719]: I1009 15:49:14.161251 4719 scope.go:117] "RemoveContainer" containerID="55848799feb0f83996cad9faea64b8bd81a5055bee1fd116f8ee1236dc974c4b" Oct 09 15:49:14 crc kubenswrapper[4719]: E1009 15:49:14.162786 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 15:49:16 crc kubenswrapper[4719]: I1009 15:49:16.041606 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-6wvgx"] Oct 09 15:49:16 crc kubenswrapper[4719]: I1009 15:49:16.052913 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-6wvgx"] Oct 09 15:49:17 crc kubenswrapper[4719]: I1009 15:49:17.175103 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d393af38-b13f-4f11-bff8-c10fd25fe75d" path="/var/lib/kubelet/pods/d393af38-b13f-4f11-bff8-c10fd25fe75d/volumes" Oct 09 15:49:20 crc kubenswrapper[4719]: I1009 15:49:20.041545 4719 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b2llw"] Oct 09 15:49:20 crc kubenswrapper[4719]: E1009 15:49:20.042371 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2ecd37c-0c41-4b8f-8072-c690aa729218" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 09 15:49:20 crc kubenswrapper[4719]: I1009 15:49:20.042391 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2ecd37c-0c41-4b8f-8072-c690aa729218" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 09 15:49:20 crc kubenswrapper[4719]: I1009 15:49:20.042630 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2ecd37c-0c41-4b8f-8072-c690aa729218" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 09 15:49:20 crc kubenswrapper[4719]: I1009 15:49:20.043290 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b2llw" Oct 09 15:49:20 crc kubenswrapper[4719]: I1009 15:49:20.046250 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 15:49:20 crc kubenswrapper[4719]: I1009 15:49:20.046292 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ssvsw" Oct 09 15:49:20 crc kubenswrapper[4719]: I1009 15:49:20.046327 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 15:49:20 crc kubenswrapper[4719]: I1009 15:49:20.047239 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 15:49:20 crc kubenswrapper[4719]: I1009 15:49:20.050789 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b2llw"] Oct 09 15:49:20 crc kubenswrapper[4719]: I1009 15:49:20.107644 4719 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fa2621b-c679-4391-9058-cd2a871264df-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b2llw\" (UID: \"6fa2621b-c679-4391-9058-cd2a871264df\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b2llw" Oct 09 15:49:20 crc kubenswrapper[4719]: I1009 15:49:20.107841 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dfvl\" (UniqueName: \"kubernetes.io/projected/6fa2621b-c679-4391-9058-cd2a871264df-kube-api-access-7dfvl\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b2llw\" (UID: \"6fa2621b-c679-4391-9058-cd2a871264df\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b2llw" Oct 09 15:49:20 crc kubenswrapper[4719]: I1009 15:49:20.108562 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6fa2621b-c679-4391-9058-cd2a871264df-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b2llw\" (UID: \"6fa2621b-c679-4391-9058-cd2a871264df\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b2llw" Oct 09 15:49:20 crc kubenswrapper[4719]: I1009 15:49:20.210276 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fa2621b-c679-4391-9058-cd2a871264df-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b2llw\" (UID: \"6fa2621b-c679-4391-9058-cd2a871264df\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b2llw" Oct 09 15:49:20 crc kubenswrapper[4719]: I1009 15:49:20.210372 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dfvl\" (UniqueName: \"kubernetes.io/projected/6fa2621b-c679-4391-9058-cd2a871264df-kube-api-access-7dfvl\") pod 
\"configure-os-edpm-deployment-openstack-edpm-ipam-b2llw\" (UID: \"6fa2621b-c679-4391-9058-cd2a871264df\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b2llw" Oct 09 15:49:20 crc kubenswrapper[4719]: I1009 15:49:20.210525 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6fa2621b-c679-4391-9058-cd2a871264df-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b2llw\" (UID: \"6fa2621b-c679-4391-9058-cd2a871264df\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b2llw" Oct 09 15:49:20 crc kubenswrapper[4719]: I1009 15:49:20.219487 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fa2621b-c679-4391-9058-cd2a871264df-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b2llw\" (UID: \"6fa2621b-c679-4391-9058-cd2a871264df\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b2llw" Oct 09 15:49:20 crc kubenswrapper[4719]: I1009 15:49:20.224742 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6fa2621b-c679-4391-9058-cd2a871264df-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b2llw\" (UID: \"6fa2621b-c679-4391-9058-cd2a871264df\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b2llw" Oct 09 15:49:20 crc kubenswrapper[4719]: I1009 15:49:20.229033 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dfvl\" (UniqueName: \"kubernetes.io/projected/6fa2621b-c679-4391-9058-cd2a871264df-kube-api-access-7dfvl\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b2llw\" (UID: \"6fa2621b-c679-4391-9058-cd2a871264df\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b2llw" Oct 09 15:49:20 crc kubenswrapper[4719]: I1009 15:49:20.400679 4719 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b2llw" Oct 09 15:49:20 crc kubenswrapper[4719]: I1009 15:49:20.902889 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b2llw"] Oct 09 15:49:21 crc kubenswrapper[4719]: I1009 15:49:21.891603 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b2llw" event={"ID":"6fa2621b-c679-4391-9058-cd2a871264df","Type":"ContainerStarted","Data":"89d1214ff34162b35f3ab64606029262bc8a61322a0fd19f4505e109a858453f"} Oct 09 15:49:21 crc kubenswrapper[4719]: I1009 15:49:21.891683 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b2llw" event={"ID":"6fa2621b-c679-4391-9058-cd2a871264df","Type":"ContainerStarted","Data":"89afc76eaee63d4f035fe149d354dc6d2c3cb28fbe9597e56ae3c91c468bf181"} Oct 09 15:49:21 crc kubenswrapper[4719]: I1009 15:49:21.918205 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b2llw" podStartSLOduration=1.486723208 podStartE2EDuration="1.918178911s" podCreationTimestamp="2025-10-09 15:49:20 +0000 UTC" firstStartedPulling="2025-10-09 15:49:20.909227432 +0000 UTC m=+1866.418938717" lastFinishedPulling="2025-10-09 15:49:21.340683135 +0000 UTC m=+1866.850394420" observedRunningTime="2025-10-09 15:49:21.906983085 +0000 UTC m=+1867.416694370" watchObservedRunningTime="2025-10-09 15:49:21.918178911 +0000 UTC m=+1867.427890196" Oct 09 15:49:25 crc kubenswrapper[4719]: I1009 15:49:25.166680 4719 scope.go:117] "RemoveContainer" containerID="55848799feb0f83996cad9faea64b8bd81a5055bee1fd116f8ee1236dc974c4b" Oct 09 15:49:25 crc kubenswrapper[4719]: E1009 15:49:25.167344 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 15:49:26 crc kubenswrapper[4719]: I1009 15:49:26.391474 4719 scope.go:117] "RemoveContainer" containerID="5922ebc8350beb92166ff13c09d24c6d1827a56668004ecfe76dccc58e0473c4" Oct 09 15:49:26 crc kubenswrapper[4719]: I1009 15:49:26.438526 4719 scope.go:117] "RemoveContainer" containerID="53b22611344ea6f695a15115e5a926986cee107c3833356c129aded91a400f7f" Oct 09 15:49:26 crc kubenswrapper[4719]: I1009 15:49:26.476415 4719 scope.go:117] "RemoveContainer" containerID="e5560106bcb3aed30021975f8311fe1f4b36ab807bfa40286e680a4cef3b8000" Oct 09 15:49:36 crc kubenswrapper[4719]: I1009 15:49:36.161102 4719 scope.go:117] "RemoveContainer" containerID="55848799feb0f83996cad9faea64b8bd81a5055bee1fd116f8ee1236dc974c4b" Oct 09 15:49:36 crc kubenswrapper[4719]: E1009 15:49:36.161845 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 15:49:49 crc kubenswrapper[4719]: I1009 15:49:49.162679 4719 scope.go:117] "RemoveContainer" containerID="55848799feb0f83996cad9faea64b8bd81a5055bee1fd116f8ee1236dc974c4b" Oct 09 15:49:49 crc kubenswrapper[4719]: E1009 15:49:49.163625 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 15:50:00 crc kubenswrapper[4719]: I1009 15:50:00.162932 4719 scope.go:117] "RemoveContainer" containerID="55848799feb0f83996cad9faea64b8bd81a5055bee1fd116f8ee1236dc974c4b" Oct 09 15:50:00 crc kubenswrapper[4719]: E1009 15:50:00.164743 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 15:50:09 crc kubenswrapper[4719]: I1009 15:50:09.300810 4719 generic.go:334] "Generic (PLEG): container finished" podID="6fa2621b-c679-4391-9058-cd2a871264df" containerID="89d1214ff34162b35f3ab64606029262bc8a61322a0fd19f4505e109a858453f" exitCode=0 Oct 09 15:50:09 crc kubenswrapper[4719]: I1009 15:50:09.300899 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b2llw" event={"ID":"6fa2621b-c679-4391-9058-cd2a871264df","Type":"ContainerDied","Data":"89d1214ff34162b35f3ab64606029262bc8a61322a0fd19f4505e109a858453f"} Oct 09 15:50:10 crc kubenswrapper[4719]: I1009 15:50:10.703975 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b2llw" Oct 09 15:50:10 crc kubenswrapper[4719]: I1009 15:50:10.724066 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dfvl\" (UniqueName: \"kubernetes.io/projected/6fa2621b-c679-4391-9058-cd2a871264df-kube-api-access-7dfvl\") pod \"6fa2621b-c679-4391-9058-cd2a871264df\" (UID: \"6fa2621b-c679-4391-9058-cd2a871264df\") " Oct 09 15:50:10 crc kubenswrapper[4719]: I1009 15:50:10.724214 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6fa2621b-c679-4391-9058-cd2a871264df-ssh-key\") pod \"6fa2621b-c679-4391-9058-cd2a871264df\" (UID: \"6fa2621b-c679-4391-9058-cd2a871264df\") " Oct 09 15:50:10 crc kubenswrapper[4719]: I1009 15:50:10.724309 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fa2621b-c679-4391-9058-cd2a871264df-inventory\") pod \"6fa2621b-c679-4391-9058-cd2a871264df\" (UID: \"6fa2621b-c679-4391-9058-cd2a871264df\") " Oct 09 15:50:10 crc kubenswrapper[4719]: I1009 15:50:10.734606 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fa2621b-c679-4391-9058-cd2a871264df-kube-api-access-7dfvl" (OuterVolumeSpecName: "kube-api-access-7dfvl") pod "6fa2621b-c679-4391-9058-cd2a871264df" (UID: "6fa2621b-c679-4391-9058-cd2a871264df"). InnerVolumeSpecName "kube-api-access-7dfvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:50:10 crc kubenswrapper[4719]: I1009 15:50:10.755156 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fa2621b-c679-4391-9058-cd2a871264df-inventory" (OuterVolumeSpecName: "inventory") pod "6fa2621b-c679-4391-9058-cd2a871264df" (UID: "6fa2621b-c679-4391-9058-cd2a871264df"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:50:10 crc kubenswrapper[4719]: I1009 15:50:10.780511 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fa2621b-c679-4391-9058-cd2a871264df-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6fa2621b-c679-4391-9058-cd2a871264df" (UID: "6fa2621b-c679-4391-9058-cd2a871264df"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:50:10 crc kubenswrapper[4719]: I1009 15:50:10.827814 4719 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6fa2621b-c679-4391-9058-cd2a871264df-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 15:50:10 crc kubenswrapper[4719]: I1009 15:50:10.827856 4719 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fa2621b-c679-4391-9058-cd2a871264df-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 15:50:10 crc kubenswrapper[4719]: I1009 15:50:10.827871 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dfvl\" (UniqueName: \"kubernetes.io/projected/6fa2621b-c679-4391-9058-cd2a871264df-kube-api-access-7dfvl\") on node \"crc\" DevicePath \"\"" Oct 09 15:50:11 crc kubenswrapper[4719]: I1009 15:50:11.319076 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b2llw" event={"ID":"6fa2621b-c679-4391-9058-cd2a871264df","Type":"ContainerDied","Data":"89afc76eaee63d4f035fe149d354dc6d2c3cb28fbe9597e56ae3c91c468bf181"} Oct 09 15:50:11 crc kubenswrapper[4719]: I1009 15:50:11.319114 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89afc76eaee63d4f035fe149d354dc6d2c3cb28fbe9597e56ae3c91c468bf181" Oct 09 15:50:11 crc kubenswrapper[4719]: I1009 15:50:11.319398 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b2llw" Oct 09 15:50:11 crc kubenswrapper[4719]: I1009 15:50:11.414214 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-hfwbm"] Oct 09 15:50:11 crc kubenswrapper[4719]: E1009 15:50:11.415096 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fa2621b-c679-4391-9058-cd2a871264df" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 09 15:50:11 crc kubenswrapper[4719]: I1009 15:50:11.415118 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fa2621b-c679-4391-9058-cd2a871264df" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 09 15:50:11 crc kubenswrapper[4719]: I1009 15:50:11.415413 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fa2621b-c679-4391-9058-cd2a871264df" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 09 15:50:11 crc kubenswrapper[4719]: I1009 15:50:11.416233 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-hfwbm" Oct 09 15:50:11 crc kubenswrapper[4719]: I1009 15:50:11.420079 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 15:50:11 crc kubenswrapper[4719]: I1009 15:50:11.420296 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ssvsw" Oct 09 15:50:11 crc kubenswrapper[4719]: I1009 15:50:11.420313 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 15:50:11 crc kubenswrapper[4719]: I1009 15:50:11.420665 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 15:50:11 crc kubenswrapper[4719]: I1009 15:50:11.427738 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-hfwbm"] Oct 09 15:50:11 crc kubenswrapper[4719]: I1009 15:50:11.436970 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/7df6a296-587c-407c-b2b4-ec923cd05cda-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-hfwbm\" (UID: \"7df6a296-587c-407c-b2b4-ec923cd05cda\") " pod="openstack/ssh-known-hosts-edpm-deployment-hfwbm" Oct 09 15:50:11 crc kubenswrapper[4719]: I1009 15:50:11.437330 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78k52\" (UniqueName: \"kubernetes.io/projected/7df6a296-587c-407c-b2b4-ec923cd05cda-kube-api-access-78k52\") pod \"ssh-known-hosts-edpm-deployment-hfwbm\" (UID: \"7df6a296-587c-407c-b2b4-ec923cd05cda\") " pod="openstack/ssh-known-hosts-edpm-deployment-hfwbm" Oct 09 15:50:11 crc kubenswrapper[4719]: I1009 15:50:11.437456 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7df6a296-587c-407c-b2b4-ec923cd05cda-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-hfwbm\" (UID: \"7df6a296-587c-407c-b2b4-ec923cd05cda\") " pod="openstack/ssh-known-hosts-edpm-deployment-hfwbm" Oct 09 15:50:11 crc kubenswrapper[4719]: I1009 15:50:11.538563 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78k52\" (UniqueName: \"kubernetes.io/projected/7df6a296-587c-407c-b2b4-ec923cd05cda-kube-api-access-78k52\") pod \"ssh-known-hosts-edpm-deployment-hfwbm\" (UID: \"7df6a296-587c-407c-b2b4-ec923cd05cda\") " pod="openstack/ssh-known-hosts-edpm-deployment-hfwbm" Oct 09 15:50:11 crc kubenswrapper[4719]: I1009 15:50:11.538658 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7df6a296-587c-407c-b2b4-ec923cd05cda-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-hfwbm\" (UID: \"7df6a296-587c-407c-b2b4-ec923cd05cda\") " pod="openstack/ssh-known-hosts-edpm-deployment-hfwbm" Oct 09 15:50:11 crc kubenswrapper[4719]: I1009 15:50:11.538718 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/7df6a296-587c-407c-b2b4-ec923cd05cda-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-hfwbm\" (UID: \"7df6a296-587c-407c-b2b4-ec923cd05cda\") " pod="openstack/ssh-known-hosts-edpm-deployment-hfwbm" Oct 09 15:50:11 crc kubenswrapper[4719]: I1009 15:50:11.542355 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/7df6a296-587c-407c-b2b4-ec923cd05cda-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-hfwbm\" (UID: \"7df6a296-587c-407c-b2b4-ec923cd05cda\") " pod="openstack/ssh-known-hosts-edpm-deployment-hfwbm" Oct 09 15:50:11 crc kubenswrapper[4719]: I1009 15:50:11.542733 4719 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7df6a296-587c-407c-b2b4-ec923cd05cda-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-hfwbm\" (UID: \"7df6a296-587c-407c-b2b4-ec923cd05cda\") " pod="openstack/ssh-known-hosts-edpm-deployment-hfwbm" Oct 09 15:50:11 crc kubenswrapper[4719]: I1009 15:50:11.561429 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78k52\" (UniqueName: \"kubernetes.io/projected/7df6a296-587c-407c-b2b4-ec923cd05cda-kube-api-access-78k52\") pod \"ssh-known-hosts-edpm-deployment-hfwbm\" (UID: \"7df6a296-587c-407c-b2b4-ec923cd05cda\") " pod="openstack/ssh-known-hosts-edpm-deployment-hfwbm" Oct 09 15:50:11 crc kubenswrapper[4719]: I1009 15:50:11.732310 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-hfwbm" Oct 09 15:50:12 crc kubenswrapper[4719]: I1009 15:50:12.248872 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-hfwbm"] Oct 09 15:50:12 crc kubenswrapper[4719]: I1009 15:50:12.264508 4719 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 15:50:12 crc kubenswrapper[4719]: I1009 15:50:12.328890 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-hfwbm" event={"ID":"7df6a296-587c-407c-b2b4-ec923cd05cda","Type":"ContainerStarted","Data":"8ad8b1d860501d83ea36358dbb1324f7ebc2cb691d8ad44967acd2d167682732"} Oct 09 15:50:13 crc kubenswrapper[4719]: I1009 15:50:13.339958 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-hfwbm" event={"ID":"7df6a296-587c-407c-b2b4-ec923cd05cda","Type":"ContainerStarted","Data":"c82fbb22aa1dd8823fb11a73a6a5bfdd495ae6b8e6fda49af56e1cabacb2463b"} Oct 09 15:50:13 crc kubenswrapper[4719]: I1009 
15:50:13.355390 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-hfwbm" podStartSLOduration=1.808321355 podStartE2EDuration="2.355355493s" podCreationTimestamp="2025-10-09 15:50:11 +0000 UTC" firstStartedPulling="2025-10-09 15:50:12.264276843 +0000 UTC m=+1917.773988118" lastFinishedPulling="2025-10-09 15:50:12.811310971 +0000 UTC m=+1918.321022256" observedRunningTime="2025-10-09 15:50:13.352582545 +0000 UTC m=+1918.862293830" watchObservedRunningTime="2025-10-09 15:50:13.355355493 +0000 UTC m=+1918.865066778" Oct 09 15:50:15 crc kubenswrapper[4719]: I1009 15:50:15.169230 4719 scope.go:117] "RemoveContainer" containerID="55848799feb0f83996cad9faea64b8bd81a5055bee1fd116f8ee1236dc974c4b" Oct 09 15:50:15 crc kubenswrapper[4719]: E1009 15:50:15.169487 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 15:50:20 crc kubenswrapper[4719]: I1009 15:50:20.400059 4719 generic.go:334] "Generic (PLEG): container finished" podID="7df6a296-587c-407c-b2b4-ec923cd05cda" containerID="c82fbb22aa1dd8823fb11a73a6a5bfdd495ae6b8e6fda49af56e1cabacb2463b" exitCode=0 Oct 09 15:50:20 crc kubenswrapper[4719]: I1009 15:50:20.400152 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-hfwbm" event={"ID":"7df6a296-587c-407c-b2b4-ec923cd05cda","Type":"ContainerDied","Data":"c82fbb22aa1dd8823fb11a73a6a5bfdd495ae6b8e6fda49af56e1cabacb2463b"} Oct 09 15:50:21 crc kubenswrapper[4719]: I1009 15:50:21.800501 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-hfwbm" Oct 09 15:50:21 crc kubenswrapper[4719]: I1009 15:50:21.838246 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78k52\" (UniqueName: \"kubernetes.io/projected/7df6a296-587c-407c-b2b4-ec923cd05cda-kube-api-access-78k52\") pod \"7df6a296-587c-407c-b2b4-ec923cd05cda\" (UID: \"7df6a296-587c-407c-b2b4-ec923cd05cda\") " Oct 09 15:50:21 crc kubenswrapper[4719]: I1009 15:50:21.838506 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/7df6a296-587c-407c-b2b4-ec923cd05cda-inventory-0\") pod \"7df6a296-587c-407c-b2b4-ec923cd05cda\" (UID: \"7df6a296-587c-407c-b2b4-ec923cd05cda\") " Oct 09 15:50:21 crc kubenswrapper[4719]: I1009 15:50:21.838560 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7df6a296-587c-407c-b2b4-ec923cd05cda-ssh-key-openstack-edpm-ipam\") pod \"7df6a296-587c-407c-b2b4-ec923cd05cda\" (UID: \"7df6a296-587c-407c-b2b4-ec923cd05cda\") " Oct 09 15:50:21 crc kubenswrapper[4719]: I1009 15:50:21.846154 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7df6a296-587c-407c-b2b4-ec923cd05cda-kube-api-access-78k52" (OuterVolumeSpecName: "kube-api-access-78k52") pod "7df6a296-587c-407c-b2b4-ec923cd05cda" (UID: "7df6a296-587c-407c-b2b4-ec923cd05cda"). InnerVolumeSpecName "kube-api-access-78k52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:50:21 crc kubenswrapper[4719]: I1009 15:50:21.868195 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7df6a296-587c-407c-b2b4-ec923cd05cda-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "7df6a296-587c-407c-b2b4-ec923cd05cda" (UID: "7df6a296-587c-407c-b2b4-ec923cd05cda"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:50:21 crc kubenswrapper[4719]: I1009 15:50:21.872704 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7df6a296-587c-407c-b2b4-ec923cd05cda-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7df6a296-587c-407c-b2b4-ec923cd05cda" (UID: "7df6a296-587c-407c-b2b4-ec923cd05cda"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:50:21 crc kubenswrapper[4719]: I1009 15:50:21.940314 4719 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/7df6a296-587c-407c-b2b4-ec923cd05cda-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 09 15:50:21 crc kubenswrapper[4719]: I1009 15:50:21.940361 4719 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7df6a296-587c-407c-b2b4-ec923cd05cda-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 09 15:50:21 crc kubenswrapper[4719]: I1009 15:50:21.940374 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78k52\" (UniqueName: \"kubernetes.io/projected/7df6a296-587c-407c-b2b4-ec923cd05cda-kube-api-access-78k52\") on node \"crc\" DevicePath \"\"" Oct 09 15:50:22 crc kubenswrapper[4719]: I1009 15:50:22.417053 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-hfwbm" event={"ID":"7df6a296-587c-407c-b2b4-ec923cd05cda","Type":"ContainerDied","Data":"8ad8b1d860501d83ea36358dbb1324f7ebc2cb691d8ad44967acd2d167682732"} Oct 09 15:50:22 crc kubenswrapper[4719]: I1009 15:50:22.417096 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ad8b1d860501d83ea36358dbb1324f7ebc2cb691d8ad44967acd2d167682732" Oct 09 15:50:22 crc kubenswrapper[4719]: I1009 15:50:22.417104 
4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-hfwbm" Oct 09 15:50:22 crc kubenswrapper[4719]: I1009 15:50:22.488238 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-h4kn5"] Oct 09 15:50:22 crc kubenswrapper[4719]: E1009 15:50:22.490911 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7df6a296-587c-407c-b2b4-ec923cd05cda" containerName="ssh-known-hosts-edpm-deployment" Oct 09 15:50:22 crc kubenswrapper[4719]: I1009 15:50:22.490946 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="7df6a296-587c-407c-b2b4-ec923cd05cda" containerName="ssh-known-hosts-edpm-deployment" Oct 09 15:50:22 crc kubenswrapper[4719]: I1009 15:50:22.491182 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="7df6a296-587c-407c-b2b4-ec923cd05cda" containerName="ssh-known-hosts-edpm-deployment" Oct 09 15:50:22 crc kubenswrapper[4719]: I1009 15:50:22.491993 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-h4kn5" Oct 09 15:50:22 crc kubenswrapper[4719]: I1009 15:50:22.494943 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 15:50:22 crc kubenswrapper[4719]: I1009 15:50:22.494998 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ssvsw" Oct 09 15:50:22 crc kubenswrapper[4719]: I1009 15:50:22.495174 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 15:50:22 crc kubenswrapper[4719]: I1009 15:50:22.500952 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-h4kn5"] Oct 09 15:50:22 crc kubenswrapper[4719]: I1009 15:50:22.504116 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 15:50:22 crc kubenswrapper[4719]: I1009 15:50:22.548920 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/39fd920e-4d39-4926-b9d2-3c3c02ebb9ed-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-h4kn5\" (UID: \"39fd920e-4d39-4926-b9d2-3c3c02ebb9ed\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-h4kn5" Oct 09 15:50:22 crc kubenswrapper[4719]: I1009 15:50:22.549010 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39fd920e-4d39-4926-b9d2-3c3c02ebb9ed-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-h4kn5\" (UID: \"39fd920e-4d39-4926-b9d2-3c3c02ebb9ed\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-h4kn5" Oct 09 15:50:22 crc kubenswrapper[4719]: I1009 15:50:22.549039 4719 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phk9h\" (UniqueName: \"kubernetes.io/projected/39fd920e-4d39-4926-b9d2-3c3c02ebb9ed-kube-api-access-phk9h\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-h4kn5\" (UID: \"39fd920e-4d39-4926-b9d2-3c3c02ebb9ed\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-h4kn5" Oct 09 15:50:22 crc kubenswrapper[4719]: I1009 15:50:22.650865 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39fd920e-4d39-4926-b9d2-3c3c02ebb9ed-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-h4kn5\" (UID: \"39fd920e-4d39-4926-b9d2-3c3c02ebb9ed\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-h4kn5" Oct 09 15:50:22 crc kubenswrapper[4719]: I1009 15:50:22.650913 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phk9h\" (UniqueName: \"kubernetes.io/projected/39fd920e-4d39-4926-b9d2-3c3c02ebb9ed-kube-api-access-phk9h\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-h4kn5\" (UID: \"39fd920e-4d39-4926-b9d2-3c3c02ebb9ed\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-h4kn5" Oct 09 15:50:22 crc kubenswrapper[4719]: I1009 15:50:22.651072 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/39fd920e-4d39-4926-b9d2-3c3c02ebb9ed-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-h4kn5\" (UID: \"39fd920e-4d39-4926-b9d2-3c3c02ebb9ed\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-h4kn5" Oct 09 15:50:22 crc kubenswrapper[4719]: I1009 15:50:22.658525 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39fd920e-4d39-4926-b9d2-3c3c02ebb9ed-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-h4kn5\" (UID: \"39fd920e-4d39-4926-b9d2-3c3c02ebb9ed\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-h4kn5" Oct 09 15:50:22 crc kubenswrapper[4719]: I1009 15:50:22.659395 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/39fd920e-4d39-4926-b9d2-3c3c02ebb9ed-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-h4kn5\" (UID: \"39fd920e-4d39-4926-b9d2-3c3c02ebb9ed\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-h4kn5" Oct 09 15:50:22 crc kubenswrapper[4719]: I1009 15:50:22.675922 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phk9h\" (UniqueName: \"kubernetes.io/projected/39fd920e-4d39-4926-b9d2-3c3c02ebb9ed-kube-api-access-phk9h\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-h4kn5\" (UID: \"39fd920e-4d39-4926-b9d2-3c3c02ebb9ed\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-h4kn5" Oct 09 15:50:22 crc kubenswrapper[4719]: I1009 15:50:22.817093 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-h4kn5" Oct 09 15:50:23 crc kubenswrapper[4719]: I1009 15:50:23.409247 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-h4kn5"] Oct 09 15:50:23 crc kubenswrapper[4719]: W1009 15:50:23.411531 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39fd920e_4d39_4926_b9d2_3c3c02ebb9ed.slice/crio-d9b474f9df2979ecf761009ef031af3a08c120bec613d6962d01495972b70922 WatchSource:0}: Error finding container d9b474f9df2979ecf761009ef031af3a08c120bec613d6962d01495972b70922: Status 404 returned error can't find the container with id d9b474f9df2979ecf761009ef031af3a08c120bec613d6962d01495972b70922 Oct 09 15:50:23 crc kubenswrapper[4719]: I1009 15:50:23.447466 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-h4kn5" event={"ID":"39fd920e-4d39-4926-b9d2-3c3c02ebb9ed","Type":"ContainerStarted","Data":"d9b474f9df2979ecf761009ef031af3a08c120bec613d6962d01495972b70922"} Oct 09 15:50:24 crc kubenswrapper[4719]: I1009 15:50:24.458213 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-h4kn5" event={"ID":"39fd920e-4d39-4926-b9d2-3c3c02ebb9ed","Type":"ContainerStarted","Data":"790caf9b8f0b880df359f60b5e8cdcad54cd8b0b55e6707a0104fc31eb59fdc1"} Oct 09 15:50:24 crc kubenswrapper[4719]: I1009 15:50:24.481822 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-h4kn5" podStartSLOduration=2.039591341 podStartE2EDuration="2.481797586s" podCreationTimestamp="2025-10-09 15:50:22 +0000 UTC" firstStartedPulling="2025-10-09 15:50:23.415222465 +0000 UTC m=+1928.924933740" lastFinishedPulling="2025-10-09 15:50:23.8574287 +0000 UTC m=+1929.367139985" observedRunningTime="2025-10-09 
15:50:24.472564863 +0000 UTC m=+1929.982276168" watchObservedRunningTime="2025-10-09 15:50:24.481797586 +0000 UTC m=+1929.991508871" Oct 09 15:50:27 crc kubenswrapper[4719]: I1009 15:50:27.162076 4719 scope.go:117] "RemoveContainer" containerID="55848799feb0f83996cad9faea64b8bd81a5055bee1fd116f8ee1236dc974c4b" Oct 09 15:50:27 crc kubenswrapper[4719]: E1009 15:50:27.162835 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 15:50:32 crc kubenswrapper[4719]: I1009 15:50:32.530165 4719 generic.go:334] "Generic (PLEG): container finished" podID="39fd920e-4d39-4926-b9d2-3c3c02ebb9ed" containerID="790caf9b8f0b880df359f60b5e8cdcad54cd8b0b55e6707a0104fc31eb59fdc1" exitCode=0 Oct 09 15:50:32 crc kubenswrapper[4719]: I1009 15:50:32.530270 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-h4kn5" event={"ID":"39fd920e-4d39-4926-b9d2-3c3c02ebb9ed","Type":"ContainerDied","Data":"790caf9b8f0b880df359f60b5e8cdcad54cd8b0b55e6707a0104fc31eb59fdc1"} Oct 09 15:50:33 crc kubenswrapper[4719]: I1009 15:50:33.902812 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-h4kn5" Oct 09 15:50:34 crc kubenswrapper[4719]: I1009 15:50:34.087183 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phk9h\" (UniqueName: \"kubernetes.io/projected/39fd920e-4d39-4926-b9d2-3c3c02ebb9ed-kube-api-access-phk9h\") pod \"39fd920e-4d39-4926-b9d2-3c3c02ebb9ed\" (UID: \"39fd920e-4d39-4926-b9d2-3c3c02ebb9ed\") " Oct 09 15:50:34 crc kubenswrapper[4719]: I1009 15:50:34.087597 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/39fd920e-4d39-4926-b9d2-3c3c02ebb9ed-ssh-key\") pod \"39fd920e-4d39-4926-b9d2-3c3c02ebb9ed\" (UID: \"39fd920e-4d39-4926-b9d2-3c3c02ebb9ed\") " Oct 09 15:50:34 crc kubenswrapper[4719]: I1009 15:50:34.087647 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39fd920e-4d39-4926-b9d2-3c3c02ebb9ed-inventory\") pod \"39fd920e-4d39-4926-b9d2-3c3c02ebb9ed\" (UID: \"39fd920e-4d39-4926-b9d2-3c3c02ebb9ed\") " Oct 09 15:50:34 crc kubenswrapper[4719]: I1009 15:50:34.092568 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39fd920e-4d39-4926-b9d2-3c3c02ebb9ed-kube-api-access-phk9h" (OuterVolumeSpecName: "kube-api-access-phk9h") pod "39fd920e-4d39-4926-b9d2-3c3c02ebb9ed" (UID: "39fd920e-4d39-4926-b9d2-3c3c02ebb9ed"). InnerVolumeSpecName "kube-api-access-phk9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:50:34 crc kubenswrapper[4719]: I1009 15:50:34.116251 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39fd920e-4d39-4926-b9d2-3c3c02ebb9ed-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "39fd920e-4d39-4926-b9d2-3c3c02ebb9ed" (UID: "39fd920e-4d39-4926-b9d2-3c3c02ebb9ed"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:50:34 crc kubenswrapper[4719]: I1009 15:50:34.130372 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39fd920e-4d39-4926-b9d2-3c3c02ebb9ed-inventory" (OuterVolumeSpecName: "inventory") pod "39fd920e-4d39-4926-b9d2-3c3c02ebb9ed" (UID: "39fd920e-4d39-4926-b9d2-3c3c02ebb9ed"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:50:34 crc kubenswrapper[4719]: I1009 15:50:34.190467 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phk9h\" (UniqueName: \"kubernetes.io/projected/39fd920e-4d39-4926-b9d2-3c3c02ebb9ed-kube-api-access-phk9h\") on node \"crc\" DevicePath \"\"" Oct 09 15:50:34 crc kubenswrapper[4719]: I1009 15:50:34.190496 4719 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/39fd920e-4d39-4926-b9d2-3c3c02ebb9ed-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 15:50:34 crc kubenswrapper[4719]: I1009 15:50:34.190545 4719 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39fd920e-4d39-4926-b9d2-3c3c02ebb9ed-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 15:50:34 crc kubenswrapper[4719]: I1009 15:50:34.548517 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-h4kn5" event={"ID":"39fd920e-4d39-4926-b9d2-3c3c02ebb9ed","Type":"ContainerDied","Data":"d9b474f9df2979ecf761009ef031af3a08c120bec613d6962d01495972b70922"} Oct 09 15:50:34 crc kubenswrapper[4719]: I1009 15:50:34.548571 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9b474f9df2979ecf761009ef031af3a08c120bec613d6962d01495972b70922" Oct 09 15:50:34 crc kubenswrapper[4719]: I1009 15:50:34.548590 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-h4kn5" Oct 09 15:50:34 crc kubenswrapper[4719]: I1009 15:50:34.630958 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s5j4v"] Oct 09 15:50:34 crc kubenswrapper[4719]: E1009 15:50:34.631456 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39fd920e-4d39-4926-b9d2-3c3c02ebb9ed" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 09 15:50:34 crc kubenswrapper[4719]: I1009 15:50:34.631481 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="39fd920e-4d39-4926-b9d2-3c3c02ebb9ed" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 09 15:50:34 crc kubenswrapper[4719]: I1009 15:50:34.631765 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="39fd920e-4d39-4926-b9d2-3c3c02ebb9ed" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 09 15:50:34 crc kubenswrapper[4719]: I1009 15:50:34.632611 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s5j4v" Oct 09 15:50:34 crc kubenswrapper[4719]: I1009 15:50:34.634730 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 15:50:34 crc kubenswrapper[4719]: I1009 15:50:34.634964 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ssvsw" Oct 09 15:50:34 crc kubenswrapper[4719]: I1009 15:50:34.635089 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 15:50:34 crc kubenswrapper[4719]: I1009 15:50:34.641731 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 15:50:34 crc kubenswrapper[4719]: I1009 15:50:34.653584 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s5j4v"] Oct 09 15:50:34 crc kubenswrapper[4719]: I1009 15:50:34.800026 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39e39eb0-02e7-46b7-82be-38cbb9e1bf19-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-s5j4v\" (UID: \"39e39eb0-02e7-46b7-82be-38cbb9e1bf19\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s5j4v" Oct 09 15:50:34 crc kubenswrapper[4719]: I1009 15:50:34.800125 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbrx7\" (UniqueName: \"kubernetes.io/projected/39e39eb0-02e7-46b7-82be-38cbb9e1bf19-kube-api-access-sbrx7\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-s5j4v\" (UID: \"39e39eb0-02e7-46b7-82be-38cbb9e1bf19\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s5j4v" Oct 09 15:50:34 crc kubenswrapper[4719]: I1009 15:50:34.800192 4719 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/39e39eb0-02e7-46b7-82be-38cbb9e1bf19-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-s5j4v\" (UID: \"39e39eb0-02e7-46b7-82be-38cbb9e1bf19\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s5j4v" Oct 09 15:50:34 crc kubenswrapper[4719]: I1009 15:50:34.902015 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39e39eb0-02e7-46b7-82be-38cbb9e1bf19-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-s5j4v\" (UID: \"39e39eb0-02e7-46b7-82be-38cbb9e1bf19\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s5j4v" Oct 09 15:50:34 crc kubenswrapper[4719]: I1009 15:50:34.902114 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbrx7\" (UniqueName: \"kubernetes.io/projected/39e39eb0-02e7-46b7-82be-38cbb9e1bf19-kube-api-access-sbrx7\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-s5j4v\" (UID: \"39e39eb0-02e7-46b7-82be-38cbb9e1bf19\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s5j4v" Oct 09 15:50:34 crc kubenswrapper[4719]: I1009 15:50:34.902187 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/39e39eb0-02e7-46b7-82be-38cbb9e1bf19-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-s5j4v\" (UID: \"39e39eb0-02e7-46b7-82be-38cbb9e1bf19\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s5j4v" Oct 09 15:50:34 crc kubenswrapper[4719]: I1009 15:50:34.905930 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39e39eb0-02e7-46b7-82be-38cbb9e1bf19-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-s5j4v\" (UID: 
\"39e39eb0-02e7-46b7-82be-38cbb9e1bf19\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s5j4v" Oct 09 15:50:34 crc kubenswrapper[4719]: I1009 15:50:34.916030 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/39e39eb0-02e7-46b7-82be-38cbb9e1bf19-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-s5j4v\" (UID: \"39e39eb0-02e7-46b7-82be-38cbb9e1bf19\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s5j4v" Oct 09 15:50:34 crc kubenswrapper[4719]: I1009 15:50:34.918119 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbrx7\" (UniqueName: \"kubernetes.io/projected/39e39eb0-02e7-46b7-82be-38cbb9e1bf19-kube-api-access-sbrx7\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-s5j4v\" (UID: \"39e39eb0-02e7-46b7-82be-38cbb9e1bf19\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s5j4v" Oct 09 15:50:34 crc kubenswrapper[4719]: I1009 15:50:34.963966 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s5j4v" Oct 09 15:50:35 crc kubenswrapper[4719]: I1009 15:50:35.470306 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s5j4v"] Oct 09 15:50:35 crc kubenswrapper[4719]: I1009 15:50:35.557516 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s5j4v" event={"ID":"39e39eb0-02e7-46b7-82be-38cbb9e1bf19","Type":"ContainerStarted","Data":"8999ca856f69444ca6b85d1ceda6fd6ccbc7dcd9bd93a974822a03ba764be98d"} Oct 09 15:50:36 crc kubenswrapper[4719]: I1009 15:50:36.567828 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s5j4v" event={"ID":"39e39eb0-02e7-46b7-82be-38cbb9e1bf19","Type":"ContainerStarted","Data":"5c48d8a76c2cc8217177944b84074c9cb0303297a1dd9f679a9a0a3b587139af"} Oct 09 15:50:36 crc kubenswrapper[4719]: I1009 15:50:36.592814 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s5j4v" podStartSLOduration=2.178882024 podStartE2EDuration="2.59279452s" podCreationTimestamp="2025-10-09 15:50:34 +0000 UTC" firstStartedPulling="2025-10-09 15:50:35.476559211 +0000 UTC m=+1940.986270496" lastFinishedPulling="2025-10-09 15:50:35.890471707 +0000 UTC m=+1941.400182992" observedRunningTime="2025-10-09 15:50:36.583586138 +0000 UTC m=+1942.093297443" watchObservedRunningTime="2025-10-09 15:50:36.59279452 +0000 UTC m=+1942.102505815" Oct 09 15:50:42 crc kubenswrapper[4719]: I1009 15:50:42.161882 4719 scope.go:117] "RemoveContainer" containerID="55848799feb0f83996cad9faea64b8bd81a5055bee1fd116f8ee1236dc974c4b" Oct 09 15:50:42 crc kubenswrapper[4719]: E1009 15:50:42.162362 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 15:50:57 crc kubenswrapper[4719]: I1009 15:50:57.162099 4719 scope.go:117] "RemoveContainer" containerID="55848799feb0f83996cad9faea64b8bd81a5055bee1fd116f8ee1236dc974c4b" Oct 09 15:50:57 crc kubenswrapper[4719]: E1009 15:50:57.163061 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 15:51:10 crc kubenswrapper[4719]: I1009 15:51:10.161940 4719 scope.go:117] "RemoveContainer" containerID="55848799feb0f83996cad9faea64b8bd81a5055bee1fd116f8ee1236dc974c4b" Oct 09 15:51:10 crc kubenswrapper[4719]: I1009 15:51:10.885860 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerStarted","Data":"abad957eacff8118d320311536ec10e8847725a5f2366aab4422a5954663f1fc"} Oct 09 15:51:27 crc kubenswrapper[4719]: I1009 15:51:27.555887 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kbtrr"] Oct 09 15:51:27 crc kubenswrapper[4719]: I1009 15:51:27.558201 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kbtrr" Oct 09 15:51:27 crc kubenswrapper[4719]: I1009 15:51:27.572378 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kbtrr"] Oct 09 15:51:27 crc kubenswrapper[4719]: I1009 15:51:27.640263 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43f569f7-d920-40f4-9ae2-e1201295c852-catalog-content\") pod \"certified-operators-kbtrr\" (UID: \"43f569f7-d920-40f4-9ae2-e1201295c852\") " pod="openshift-marketplace/certified-operators-kbtrr" Oct 09 15:51:27 crc kubenswrapper[4719]: I1009 15:51:27.640329 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43f569f7-d920-40f4-9ae2-e1201295c852-utilities\") pod \"certified-operators-kbtrr\" (UID: \"43f569f7-d920-40f4-9ae2-e1201295c852\") " pod="openshift-marketplace/certified-operators-kbtrr" Oct 09 15:51:27 crc kubenswrapper[4719]: I1009 15:51:27.640845 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpjvz\" (UniqueName: \"kubernetes.io/projected/43f569f7-d920-40f4-9ae2-e1201295c852-kube-api-access-kpjvz\") pod \"certified-operators-kbtrr\" (UID: \"43f569f7-d920-40f4-9ae2-e1201295c852\") " pod="openshift-marketplace/certified-operators-kbtrr" Oct 09 15:51:27 crc kubenswrapper[4719]: I1009 15:51:27.743561 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpjvz\" (UniqueName: \"kubernetes.io/projected/43f569f7-d920-40f4-9ae2-e1201295c852-kube-api-access-kpjvz\") pod \"certified-operators-kbtrr\" (UID: \"43f569f7-d920-40f4-9ae2-e1201295c852\") " pod="openshift-marketplace/certified-operators-kbtrr" Oct 09 15:51:27 crc kubenswrapper[4719]: I1009 15:51:27.743647 4719 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43f569f7-d920-40f4-9ae2-e1201295c852-catalog-content\") pod \"certified-operators-kbtrr\" (UID: \"43f569f7-d920-40f4-9ae2-e1201295c852\") " pod="openshift-marketplace/certified-operators-kbtrr" Oct 09 15:51:27 crc kubenswrapper[4719]: I1009 15:51:27.743682 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43f569f7-d920-40f4-9ae2-e1201295c852-utilities\") pod \"certified-operators-kbtrr\" (UID: \"43f569f7-d920-40f4-9ae2-e1201295c852\") " pod="openshift-marketplace/certified-operators-kbtrr" Oct 09 15:51:27 crc kubenswrapper[4719]: I1009 15:51:27.744425 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43f569f7-d920-40f4-9ae2-e1201295c852-utilities\") pod \"certified-operators-kbtrr\" (UID: \"43f569f7-d920-40f4-9ae2-e1201295c852\") " pod="openshift-marketplace/certified-operators-kbtrr" Oct 09 15:51:27 crc kubenswrapper[4719]: I1009 15:51:27.744434 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43f569f7-d920-40f4-9ae2-e1201295c852-catalog-content\") pod \"certified-operators-kbtrr\" (UID: \"43f569f7-d920-40f4-9ae2-e1201295c852\") " pod="openshift-marketplace/certified-operators-kbtrr" Oct 09 15:51:27 crc kubenswrapper[4719]: I1009 15:51:27.772264 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpjvz\" (UniqueName: \"kubernetes.io/projected/43f569f7-d920-40f4-9ae2-e1201295c852-kube-api-access-kpjvz\") pod \"certified-operators-kbtrr\" (UID: \"43f569f7-d920-40f4-9ae2-e1201295c852\") " pod="openshift-marketplace/certified-operators-kbtrr" Oct 09 15:51:27 crc kubenswrapper[4719]: I1009 15:51:27.881763 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kbtrr" Oct 09 15:51:28 crc kubenswrapper[4719]: I1009 15:51:28.502372 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kbtrr"] Oct 09 15:51:29 crc kubenswrapper[4719]: I1009 15:51:29.063768 4719 generic.go:334] "Generic (PLEG): container finished" podID="43f569f7-d920-40f4-9ae2-e1201295c852" containerID="df7107ec215c27cd78ee4171314329199248e4845b2442378c32738657db91ef" exitCode=0 Oct 09 15:51:29 crc kubenswrapper[4719]: I1009 15:51:29.063844 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kbtrr" event={"ID":"43f569f7-d920-40f4-9ae2-e1201295c852","Type":"ContainerDied","Data":"df7107ec215c27cd78ee4171314329199248e4845b2442378c32738657db91ef"} Oct 09 15:51:29 crc kubenswrapper[4719]: I1009 15:51:29.064039 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kbtrr" event={"ID":"43f569f7-d920-40f4-9ae2-e1201295c852","Type":"ContainerStarted","Data":"d8c1cc147c03f7668d0c13eaa30185727a3c9da7fc4f5b9bb3de416cf7b6657b"} Oct 09 15:51:31 crc kubenswrapper[4719]: I1009 15:51:31.087169 4719 generic.go:334] "Generic (PLEG): container finished" podID="43f569f7-d920-40f4-9ae2-e1201295c852" containerID="d2d53eb32a68413a789adaca6c7cf7fe460ca3cb2b16a0ae5693867fb5a1de2b" exitCode=0 Oct 09 15:51:31 crc kubenswrapper[4719]: I1009 15:51:31.087243 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kbtrr" event={"ID":"43f569f7-d920-40f4-9ae2-e1201295c852","Type":"ContainerDied","Data":"d2d53eb32a68413a789adaca6c7cf7fe460ca3cb2b16a0ae5693867fb5a1de2b"} Oct 09 15:51:32 crc kubenswrapper[4719]: I1009 15:51:32.098215 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kbtrr" 
event={"ID":"43f569f7-d920-40f4-9ae2-e1201295c852","Type":"ContainerStarted","Data":"5c8ee7d7da57f1490b6de0038465cce9a88e538a692676fa1a46efd8004f6b5f"} Oct 09 15:51:37 crc kubenswrapper[4719]: I1009 15:51:37.882575 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kbtrr" Oct 09 15:51:37 crc kubenswrapper[4719]: I1009 15:51:37.883271 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kbtrr" Oct 09 15:51:37 crc kubenswrapper[4719]: I1009 15:51:37.936385 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kbtrr" Oct 09 15:51:37 crc kubenswrapper[4719]: I1009 15:51:37.955010 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kbtrr" podStartSLOduration=8.274668677 podStartE2EDuration="10.954990701s" podCreationTimestamp="2025-10-09 15:51:27 +0000 UTC" firstStartedPulling="2025-10-09 15:51:29.065698414 +0000 UTC m=+1994.575409699" lastFinishedPulling="2025-10-09 15:51:31.746020438 +0000 UTC m=+1997.255731723" observedRunningTime="2025-10-09 15:51:32.11981967 +0000 UTC m=+1997.629530955" watchObservedRunningTime="2025-10-09 15:51:37.954990701 +0000 UTC m=+2003.464701986" Oct 09 15:51:38 crc kubenswrapper[4719]: I1009 15:51:38.232223 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kbtrr" Oct 09 15:51:38 crc kubenswrapper[4719]: I1009 15:51:38.277172 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kbtrr"] Oct 09 15:51:40 crc kubenswrapper[4719]: I1009 15:51:40.175515 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kbtrr" podUID="43f569f7-d920-40f4-9ae2-e1201295c852" containerName="registry-server" 
containerID="cri-o://5c8ee7d7da57f1490b6de0038465cce9a88e538a692676fa1a46efd8004f6b5f" gracePeriod=2 Oct 09 15:51:40 crc kubenswrapper[4719]: I1009 15:51:40.632144 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kbtrr" Oct 09 15:51:40 crc kubenswrapper[4719]: I1009 15:51:40.705851 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43f569f7-d920-40f4-9ae2-e1201295c852-catalog-content\") pod \"43f569f7-d920-40f4-9ae2-e1201295c852\" (UID: \"43f569f7-d920-40f4-9ae2-e1201295c852\") " Oct 09 15:51:40 crc kubenswrapper[4719]: I1009 15:51:40.706067 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43f569f7-d920-40f4-9ae2-e1201295c852-utilities\") pod \"43f569f7-d920-40f4-9ae2-e1201295c852\" (UID: \"43f569f7-d920-40f4-9ae2-e1201295c852\") " Oct 09 15:51:40 crc kubenswrapper[4719]: I1009 15:51:40.706092 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpjvz\" (UniqueName: \"kubernetes.io/projected/43f569f7-d920-40f4-9ae2-e1201295c852-kube-api-access-kpjvz\") pod \"43f569f7-d920-40f4-9ae2-e1201295c852\" (UID: \"43f569f7-d920-40f4-9ae2-e1201295c852\") " Oct 09 15:51:40 crc kubenswrapper[4719]: I1009 15:51:40.707226 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43f569f7-d920-40f4-9ae2-e1201295c852-utilities" (OuterVolumeSpecName: "utilities") pod "43f569f7-d920-40f4-9ae2-e1201295c852" (UID: "43f569f7-d920-40f4-9ae2-e1201295c852"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:51:40 crc kubenswrapper[4719]: I1009 15:51:40.715183 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43f569f7-d920-40f4-9ae2-e1201295c852-kube-api-access-kpjvz" (OuterVolumeSpecName: "kube-api-access-kpjvz") pod "43f569f7-d920-40f4-9ae2-e1201295c852" (UID: "43f569f7-d920-40f4-9ae2-e1201295c852"). InnerVolumeSpecName "kube-api-access-kpjvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:51:40 crc kubenswrapper[4719]: I1009 15:51:40.748743 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43f569f7-d920-40f4-9ae2-e1201295c852-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "43f569f7-d920-40f4-9ae2-e1201295c852" (UID: "43f569f7-d920-40f4-9ae2-e1201295c852"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:51:40 crc kubenswrapper[4719]: I1009 15:51:40.809021 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43f569f7-d920-40f4-9ae2-e1201295c852-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 15:51:40 crc kubenswrapper[4719]: I1009 15:51:40.809063 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpjvz\" (UniqueName: \"kubernetes.io/projected/43f569f7-d920-40f4-9ae2-e1201295c852-kube-api-access-kpjvz\") on node \"crc\" DevicePath \"\"" Oct 09 15:51:40 crc kubenswrapper[4719]: I1009 15:51:40.809074 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43f569f7-d920-40f4-9ae2-e1201295c852-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 15:51:41 crc kubenswrapper[4719]: I1009 15:51:41.185414 4719 generic.go:334] "Generic (PLEG): container finished" podID="43f569f7-d920-40f4-9ae2-e1201295c852" 
containerID="5c8ee7d7da57f1490b6de0038465cce9a88e538a692676fa1a46efd8004f6b5f" exitCode=0 Oct 09 15:51:41 crc kubenswrapper[4719]: I1009 15:51:41.185459 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kbtrr" event={"ID":"43f569f7-d920-40f4-9ae2-e1201295c852","Type":"ContainerDied","Data":"5c8ee7d7da57f1490b6de0038465cce9a88e538a692676fa1a46efd8004f6b5f"} Oct 09 15:51:41 crc kubenswrapper[4719]: I1009 15:51:41.185498 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kbtrr" event={"ID":"43f569f7-d920-40f4-9ae2-e1201295c852","Type":"ContainerDied","Data":"d8c1cc147c03f7668d0c13eaa30185727a3c9da7fc4f5b9bb3de416cf7b6657b"} Oct 09 15:51:41 crc kubenswrapper[4719]: I1009 15:51:41.185523 4719 scope.go:117] "RemoveContainer" containerID="5c8ee7d7da57f1490b6de0038465cce9a88e538a692676fa1a46efd8004f6b5f" Oct 09 15:51:41 crc kubenswrapper[4719]: I1009 15:51:41.185515 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kbtrr" Oct 09 15:51:41 crc kubenswrapper[4719]: I1009 15:51:41.207224 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kbtrr"] Oct 09 15:51:41 crc kubenswrapper[4719]: I1009 15:51:41.210661 4719 scope.go:117] "RemoveContainer" containerID="d2d53eb32a68413a789adaca6c7cf7fe460ca3cb2b16a0ae5693867fb5a1de2b" Oct 09 15:51:41 crc kubenswrapper[4719]: I1009 15:51:41.217939 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kbtrr"] Oct 09 15:51:41 crc kubenswrapper[4719]: I1009 15:51:41.240223 4719 scope.go:117] "RemoveContainer" containerID="df7107ec215c27cd78ee4171314329199248e4845b2442378c32738657db91ef" Oct 09 15:51:41 crc kubenswrapper[4719]: I1009 15:51:41.295375 4719 scope.go:117] "RemoveContainer" containerID="5c8ee7d7da57f1490b6de0038465cce9a88e538a692676fa1a46efd8004f6b5f" Oct 09 15:51:41 crc kubenswrapper[4719]: E1009 15:51:41.296101 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c8ee7d7da57f1490b6de0038465cce9a88e538a692676fa1a46efd8004f6b5f\": container with ID starting with 5c8ee7d7da57f1490b6de0038465cce9a88e538a692676fa1a46efd8004f6b5f not found: ID does not exist" containerID="5c8ee7d7da57f1490b6de0038465cce9a88e538a692676fa1a46efd8004f6b5f" Oct 09 15:51:41 crc kubenswrapper[4719]: I1009 15:51:41.296130 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c8ee7d7da57f1490b6de0038465cce9a88e538a692676fa1a46efd8004f6b5f"} err="failed to get container status \"5c8ee7d7da57f1490b6de0038465cce9a88e538a692676fa1a46efd8004f6b5f\": rpc error: code = NotFound desc = could not find container \"5c8ee7d7da57f1490b6de0038465cce9a88e538a692676fa1a46efd8004f6b5f\": container with ID starting with 5c8ee7d7da57f1490b6de0038465cce9a88e538a692676fa1a46efd8004f6b5f not 
found: ID does not exist" Oct 09 15:51:41 crc kubenswrapper[4719]: I1009 15:51:41.296150 4719 scope.go:117] "RemoveContainer" containerID="d2d53eb32a68413a789adaca6c7cf7fe460ca3cb2b16a0ae5693867fb5a1de2b" Oct 09 15:51:41 crc kubenswrapper[4719]: E1009 15:51:41.296661 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2d53eb32a68413a789adaca6c7cf7fe460ca3cb2b16a0ae5693867fb5a1de2b\": container with ID starting with d2d53eb32a68413a789adaca6c7cf7fe460ca3cb2b16a0ae5693867fb5a1de2b not found: ID does not exist" containerID="d2d53eb32a68413a789adaca6c7cf7fe460ca3cb2b16a0ae5693867fb5a1de2b" Oct 09 15:51:41 crc kubenswrapper[4719]: I1009 15:51:41.296733 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2d53eb32a68413a789adaca6c7cf7fe460ca3cb2b16a0ae5693867fb5a1de2b"} err="failed to get container status \"d2d53eb32a68413a789adaca6c7cf7fe460ca3cb2b16a0ae5693867fb5a1de2b\": rpc error: code = NotFound desc = could not find container \"d2d53eb32a68413a789adaca6c7cf7fe460ca3cb2b16a0ae5693867fb5a1de2b\": container with ID starting with d2d53eb32a68413a789adaca6c7cf7fe460ca3cb2b16a0ae5693867fb5a1de2b not found: ID does not exist" Oct 09 15:51:41 crc kubenswrapper[4719]: I1009 15:51:41.296763 4719 scope.go:117] "RemoveContainer" containerID="df7107ec215c27cd78ee4171314329199248e4845b2442378c32738657db91ef" Oct 09 15:51:41 crc kubenswrapper[4719]: E1009 15:51:41.297169 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df7107ec215c27cd78ee4171314329199248e4845b2442378c32738657db91ef\": container with ID starting with df7107ec215c27cd78ee4171314329199248e4845b2442378c32738657db91ef not found: ID does not exist" containerID="df7107ec215c27cd78ee4171314329199248e4845b2442378c32738657db91ef" Oct 09 15:51:41 crc kubenswrapper[4719]: I1009 15:51:41.297198 4719 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df7107ec215c27cd78ee4171314329199248e4845b2442378c32738657db91ef"} err="failed to get container status \"df7107ec215c27cd78ee4171314329199248e4845b2442378c32738657db91ef\": rpc error: code = NotFound desc = could not find container \"df7107ec215c27cd78ee4171314329199248e4845b2442378c32738657db91ef\": container with ID starting with df7107ec215c27cd78ee4171314329199248e4845b2442378c32738657db91ef not found: ID does not exist" Oct 09 15:51:43 crc kubenswrapper[4719]: I1009 15:51:43.178123 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43f569f7-d920-40f4-9ae2-e1201295c852" path="/var/lib/kubelet/pods/43f569f7-d920-40f4-9ae2-e1201295c852/volumes" Oct 09 15:51:51 crc kubenswrapper[4719]: I1009 15:51:51.274887 4719 generic.go:334] "Generic (PLEG): container finished" podID="39e39eb0-02e7-46b7-82be-38cbb9e1bf19" containerID="5c48d8a76c2cc8217177944b84074c9cb0303297a1dd9f679a9a0a3b587139af" exitCode=0 Oct 09 15:51:51 crc kubenswrapper[4719]: I1009 15:51:51.275021 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s5j4v" event={"ID":"39e39eb0-02e7-46b7-82be-38cbb9e1bf19","Type":"ContainerDied","Data":"5c48d8a76c2cc8217177944b84074c9cb0303297a1dd9f679a9a0a3b587139af"} Oct 09 15:51:52 crc kubenswrapper[4719]: I1009 15:51:52.679635 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s5j4v" Oct 09 15:51:52 crc kubenswrapper[4719]: I1009 15:51:52.745539 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39e39eb0-02e7-46b7-82be-38cbb9e1bf19-inventory\") pod \"39e39eb0-02e7-46b7-82be-38cbb9e1bf19\" (UID: \"39e39eb0-02e7-46b7-82be-38cbb9e1bf19\") " Oct 09 15:51:52 crc kubenswrapper[4719]: I1009 15:51:52.745601 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbrx7\" (UniqueName: \"kubernetes.io/projected/39e39eb0-02e7-46b7-82be-38cbb9e1bf19-kube-api-access-sbrx7\") pod \"39e39eb0-02e7-46b7-82be-38cbb9e1bf19\" (UID: \"39e39eb0-02e7-46b7-82be-38cbb9e1bf19\") " Oct 09 15:51:52 crc kubenswrapper[4719]: I1009 15:51:52.745743 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/39e39eb0-02e7-46b7-82be-38cbb9e1bf19-ssh-key\") pod \"39e39eb0-02e7-46b7-82be-38cbb9e1bf19\" (UID: \"39e39eb0-02e7-46b7-82be-38cbb9e1bf19\") " Oct 09 15:51:52 crc kubenswrapper[4719]: I1009 15:51:52.750687 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39e39eb0-02e7-46b7-82be-38cbb9e1bf19-kube-api-access-sbrx7" (OuterVolumeSpecName: "kube-api-access-sbrx7") pod "39e39eb0-02e7-46b7-82be-38cbb9e1bf19" (UID: "39e39eb0-02e7-46b7-82be-38cbb9e1bf19"). InnerVolumeSpecName "kube-api-access-sbrx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:51:52 crc kubenswrapper[4719]: I1009 15:51:52.777214 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39e39eb0-02e7-46b7-82be-38cbb9e1bf19-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "39e39eb0-02e7-46b7-82be-38cbb9e1bf19" (UID: "39e39eb0-02e7-46b7-82be-38cbb9e1bf19"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:51:52 crc kubenswrapper[4719]: I1009 15:51:52.777255 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39e39eb0-02e7-46b7-82be-38cbb9e1bf19-inventory" (OuterVolumeSpecName: "inventory") pod "39e39eb0-02e7-46b7-82be-38cbb9e1bf19" (UID: "39e39eb0-02e7-46b7-82be-38cbb9e1bf19"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:51:52 crc kubenswrapper[4719]: I1009 15:51:52.847740 4719 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39e39eb0-02e7-46b7-82be-38cbb9e1bf19-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 15:51:52 crc kubenswrapper[4719]: I1009 15:51:52.847783 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbrx7\" (UniqueName: \"kubernetes.io/projected/39e39eb0-02e7-46b7-82be-38cbb9e1bf19-kube-api-access-sbrx7\") on node \"crc\" DevicePath \"\"" Oct 09 15:51:52 crc kubenswrapper[4719]: I1009 15:51:52.847795 4719 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/39e39eb0-02e7-46b7-82be-38cbb9e1bf19-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.294957 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s5j4v" event={"ID":"39e39eb0-02e7-46b7-82be-38cbb9e1bf19","Type":"ContainerDied","Data":"8999ca856f69444ca6b85d1ceda6fd6ccbc7dcd9bd93a974822a03ba764be98d"} Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.294990 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s5j4v" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.294993 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8999ca856f69444ca6b85d1ceda6fd6ccbc7dcd9bd93a974822a03ba764be98d" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.473490 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw"] Oct 09 15:51:53 crc kubenswrapper[4719]: E1009 15:51:53.474266 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43f569f7-d920-40f4-9ae2-e1201295c852" containerName="extract-content" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.474288 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="43f569f7-d920-40f4-9ae2-e1201295c852" containerName="extract-content" Oct 09 15:51:53 crc kubenswrapper[4719]: E1009 15:51:53.474303 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39e39eb0-02e7-46b7-82be-38cbb9e1bf19" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.474312 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="39e39eb0-02e7-46b7-82be-38cbb9e1bf19" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 09 15:51:53 crc kubenswrapper[4719]: E1009 15:51:53.474340 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43f569f7-d920-40f4-9ae2-e1201295c852" containerName="registry-server" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.474353 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="43f569f7-d920-40f4-9ae2-e1201295c852" containerName="registry-server" Oct 09 15:51:53 crc kubenswrapper[4719]: E1009 15:51:53.474382 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43f569f7-d920-40f4-9ae2-e1201295c852" containerName="extract-utilities" Oct 09 15:51:53 crc kubenswrapper[4719]: 
I1009 15:51:53.474391 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="43f569f7-d920-40f4-9ae2-e1201295c852" containerName="extract-utilities" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.474677 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="39e39eb0-02e7-46b7-82be-38cbb9e1bf19" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.474712 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="43f569f7-d920-40f4-9ae2-e1201295c852" containerName="registry-server" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.475658 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.479475 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ssvsw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.479689 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.479805 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.479925 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.480007 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.480039 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.481409 4719 reflector.go:368] Caches 
populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.484126 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.506837 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw"] Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.566842 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.566911 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.566948 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 
15:51:53.566974 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/976acf87-d11d-47a4-ad0d-2119fc70504c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.567021 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.567053 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.567073 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.567181 4719 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.567209 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2znns\" (UniqueName: \"kubernetes.io/projected/976acf87-d11d-47a4-ad0d-2119fc70504c-kube-api-access-2znns\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.567243 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/976acf87-d11d-47a4-ad0d-2119fc70504c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.567275 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/976acf87-d11d-47a4-ad0d-2119fc70504c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.567318 4719 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.567349 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.567389 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/976acf87-d11d-47a4-ad0d-2119fc70504c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.669198 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.669269 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2znns\" (UniqueName: \"kubernetes.io/projected/976acf87-d11d-47a4-ad0d-2119fc70504c-kube-api-access-2znns\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.669307 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/976acf87-d11d-47a4-ad0d-2119fc70504c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.669349 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/976acf87-d11d-47a4-ad0d-2119fc70504c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.669428 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.669457 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.669486 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/976acf87-d11d-47a4-ad0d-2119fc70504c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.669529 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.669559 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.669590 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-telemetry-combined-ca-bundle\") 
pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.669618 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/976acf87-d11d-47a4-ad0d-2119fc70504c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.669656 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.669690 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.669711 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.673936 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.675153 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/976acf87-d11d-47a4-ad0d-2119fc70504c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.675851 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/976acf87-d11d-47a4-ad0d-2119fc70504c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.676027 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/976acf87-d11d-47a4-ad0d-2119fc70504c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.677047 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/976acf87-d11d-47a4-ad0d-2119fc70504c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.677181 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.677495 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.678762 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.680035 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.681874 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.687592 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.687615 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.688885 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-bootstrap-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.690912 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2znns\" (UniqueName: \"kubernetes.io/projected/976acf87-d11d-47a4-ad0d-2119fc70504c-kube-api-access-2znns\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:53 crc kubenswrapper[4719]: I1009 15:51:53.796157 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:51:54 crc kubenswrapper[4719]: I1009 15:51:54.307412 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw"] Oct 09 15:51:55 crc kubenswrapper[4719]: I1009 15:51:55.316133 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" event={"ID":"976acf87-d11d-47a4-ad0d-2119fc70504c","Type":"ContainerStarted","Data":"0b947bddd75ad03ddca06eaf632efc5a1b6fa346c8c9c914dd691a3890664aeb"} Oct 09 15:51:55 crc kubenswrapper[4719]: I1009 15:51:55.316505 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" event={"ID":"976acf87-d11d-47a4-ad0d-2119fc70504c","Type":"ContainerStarted","Data":"5315a06c2a5d26ecab034483c17f4eb72d307d6ad30bd6fea79b09a91021bdb8"} Oct 09 15:51:55 crc kubenswrapper[4719]: I1009 15:51:55.344577 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" podStartSLOduration=1.9155831760000002 podStartE2EDuration="2.344559435s" 
podCreationTimestamp="2025-10-09 15:51:53 +0000 UTC" firstStartedPulling="2025-10-09 15:51:54.319523031 +0000 UTC m=+2019.829234316" lastFinishedPulling="2025-10-09 15:51:54.74849929 +0000 UTC m=+2020.258210575" observedRunningTime="2025-10-09 15:51:55.336123816 +0000 UTC m=+2020.845835101" watchObservedRunningTime="2025-10-09 15:51:55.344559435 +0000 UTC m=+2020.854270720" Oct 09 15:52:16 crc kubenswrapper[4719]: I1009 15:52:16.074557 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-27nrv"] Oct 09 15:52:16 crc kubenswrapper[4719]: I1009 15:52:16.077658 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-27nrv" Oct 09 15:52:16 crc kubenswrapper[4719]: I1009 15:52:16.082933 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-27nrv"] Oct 09 15:52:16 crc kubenswrapper[4719]: I1009 15:52:16.149991 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7vmn\" (UniqueName: \"kubernetes.io/projected/faa399bc-af9e-414a-9924-4d9a160f05e8-kube-api-access-s7vmn\") pod \"community-operators-27nrv\" (UID: \"faa399bc-af9e-414a-9924-4d9a160f05e8\") " pod="openshift-marketplace/community-operators-27nrv" Oct 09 15:52:16 crc kubenswrapper[4719]: I1009 15:52:16.150464 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faa399bc-af9e-414a-9924-4d9a160f05e8-catalog-content\") pod \"community-operators-27nrv\" (UID: \"faa399bc-af9e-414a-9924-4d9a160f05e8\") " pod="openshift-marketplace/community-operators-27nrv" Oct 09 15:52:16 crc kubenswrapper[4719]: I1009 15:52:16.150508 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/faa399bc-af9e-414a-9924-4d9a160f05e8-utilities\") pod \"community-operators-27nrv\" (UID: \"faa399bc-af9e-414a-9924-4d9a160f05e8\") " pod="openshift-marketplace/community-operators-27nrv" Oct 09 15:52:16 crc kubenswrapper[4719]: I1009 15:52:16.252612 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faa399bc-af9e-414a-9924-4d9a160f05e8-catalog-content\") pod \"community-operators-27nrv\" (UID: \"faa399bc-af9e-414a-9924-4d9a160f05e8\") " pod="openshift-marketplace/community-operators-27nrv" Oct 09 15:52:16 crc kubenswrapper[4719]: I1009 15:52:16.252674 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faa399bc-af9e-414a-9924-4d9a160f05e8-utilities\") pod \"community-operators-27nrv\" (UID: \"faa399bc-af9e-414a-9924-4d9a160f05e8\") " pod="openshift-marketplace/community-operators-27nrv" Oct 09 15:52:16 crc kubenswrapper[4719]: I1009 15:52:16.252758 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7vmn\" (UniqueName: \"kubernetes.io/projected/faa399bc-af9e-414a-9924-4d9a160f05e8-kube-api-access-s7vmn\") pod \"community-operators-27nrv\" (UID: \"faa399bc-af9e-414a-9924-4d9a160f05e8\") " pod="openshift-marketplace/community-operators-27nrv" Oct 09 15:52:16 crc kubenswrapper[4719]: I1009 15:52:16.254727 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faa399bc-af9e-414a-9924-4d9a160f05e8-catalog-content\") pod \"community-operators-27nrv\" (UID: \"faa399bc-af9e-414a-9924-4d9a160f05e8\") " pod="openshift-marketplace/community-operators-27nrv" Oct 09 15:52:16 crc kubenswrapper[4719]: I1009 15:52:16.255039 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/faa399bc-af9e-414a-9924-4d9a160f05e8-utilities\") pod \"community-operators-27nrv\" (UID: \"faa399bc-af9e-414a-9924-4d9a160f05e8\") " pod="openshift-marketplace/community-operators-27nrv" Oct 09 15:52:16 crc kubenswrapper[4719]: I1009 15:52:16.280337 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7vmn\" (UniqueName: \"kubernetes.io/projected/faa399bc-af9e-414a-9924-4d9a160f05e8-kube-api-access-s7vmn\") pod \"community-operators-27nrv\" (UID: \"faa399bc-af9e-414a-9924-4d9a160f05e8\") " pod="openshift-marketplace/community-operators-27nrv" Oct 09 15:52:16 crc kubenswrapper[4719]: I1009 15:52:16.400204 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-27nrv" Oct 09 15:52:16 crc kubenswrapper[4719]: I1009 15:52:16.973225 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-27nrv"] Oct 09 15:52:17 crc kubenswrapper[4719]: I1009 15:52:17.473371 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tvnzr"] Oct 09 15:52:17 crc kubenswrapper[4719]: I1009 15:52:17.476109 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tvnzr" Oct 09 15:52:17 crc kubenswrapper[4719]: I1009 15:52:17.499320 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tvnzr"] Oct 09 15:52:17 crc kubenswrapper[4719]: I1009 15:52:17.529927 4719 generic.go:334] "Generic (PLEG): container finished" podID="faa399bc-af9e-414a-9924-4d9a160f05e8" containerID="3f0264b4fb8c1e4e010c20b20c1fc33cc288c8b0c8bd0d171ed63f0e8b0889bb" exitCode=0 Oct 09 15:52:17 crc kubenswrapper[4719]: I1009 15:52:17.530005 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-27nrv" event={"ID":"faa399bc-af9e-414a-9924-4d9a160f05e8","Type":"ContainerDied","Data":"3f0264b4fb8c1e4e010c20b20c1fc33cc288c8b0c8bd0d171ed63f0e8b0889bb"} Oct 09 15:52:17 crc kubenswrapper[4719]: I1009 15:52:17.530038 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-27nrv" event={"ID":"faa399bc-af9e-414a-9924-4d9a160f05e8","Type":"ContainerStarted","Data":"19148ab3f3602dc699fe25a66988292ca2650108e98ae1ce0de3f599437ce700"} Oct 09 15:52:17 crc kubenswrapper[4719]: I1009 15:52:17.609383 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1dc13ea-0601-4c3d-8105-22dd84ae3d6b-utilities\") pod \"redhat-marketplace-tvnzr\" (UID: \"d1dc13ea-0601-4c3d-8105-22dd84ae3d6b\") " pod="openshift-marketplace/redhat-marketplace-tvnzr" Oct 09 15:52:17 crc kubenswrapper[4719]: I1009 15:52:17.609523 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9cqg\" (UniqueName: \"kubernetes.io/projected/d1dc13ea-0601-4c3d-8105-22dd84ae3d6b-kube-api-access-x9cqg\") pod \"redhat-marketplace-tvnzr\" (UID: \"d1dc13ea-0601-4c3d-8105-22dd84ae3d6b\") " pod="openshift-marketplace/redhat-marketplace-tvnzr" Oct 09 15:52:17 crc 
kubenswrapper[4719]: I1009 15:52:17.609556 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1dc13ea-0601-4c3d-8105-22dd84ae3d6b-catalog-content\") pod \"redhat-marketplace-tvnzr\" (UID: \"d1dc13ea-0601-4c3d-8105-22dd84ae3d6b\") " pod="openshift-marketplace/redhat-marketplace-tvnzr" Oct 09 15:52:17 crc kubenswrapper[4719]: I1009 15:52:17.711797 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1dc13ea-0601-4c3d-8105-22dd84ae3d6b-utilities\") pod \"redhat-marketplace-tvnzr\" (UID: \"d1dc13ea-0601-4c3d-8105-22dd84ae3d6b\") " pod="openshift-marketplace/redhat-marketplace-tvnzr" Oct 09 15:52:17 crc kubenswrapper[4719]: I1009 15:52:17.711939 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9cqg\" (UniqueName: \"kubernetes.io/projected/d1dc13ea-0601-4c3d-8105-22dd84ae3d6b-kube-api-access-x9cqg\") pod \"redhat-marketplace-tvnzr\" (UID: \"d1dc13ea-0601-4c3d-8105-22dd84ae3d6b\") " pod="openshift-marketplace/redhat-marketplace-tvnzr" Oct 09 15:52:17 crc kubenswrapper[4719]: I1009 15:52:17.711964 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1dc13ea-0601-4c3d-8105-22dd84ae3d6b-catalog-content\") pod \"redhat-marketplace-tvnzr\" (UID: \"d1dc13ea-0601-4c3d-8105-22dd84ae3d6b\") " pod="openshift-marketplace/redhat-marketplace-tvnzr" Oct 09 15:52:17 crc kubenswrapper[4719]: I1009 15:52:17.712677 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1dc13ea-0601-4c3d-8105-22dd84ae3d6b-catalog-content\") pod \"redhat-marketplace-tvnzr\" (UID: \"d1dc13ea-0601-4c3d-8105-22dd84ae3d6b\") " pod="openshift-marketplace/redhat-marketplace-tvnzr" Oct 09 15:52:17 crc 
kubenswrapper[4719]: I1009 15:52:17.712701 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1dc13ea-0601-4c3d-8105-22dd84ae3d6b-utilities\") pod \"redhat-marketplace-tvnzr\" (UID: \"d1dc13ea-0601-4c3d-8105-22dd84ae3d6b\") " pod="openshift-marketplace/redhat-marketplace-tvnzr" Oct 09 15:52:17 crc kubenswrapper[4719]: I1009 15:52:17.735481 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9cqg\" (UniqueName: \"kubernetes.io/projected/d1dc13ea-0601-4c3d-8105-22dd84ae3d6b-kube-api-access-x9cqg\") pod \"redhat-marketplace-tvnzr\" (UID: \"d1dc13ea-0601-4c3d-8105-22dd84ae3d6b\") " pod="openshift-marketplace/redhat-marketplace-tvnzr" Oct 09 15:52:17 crc kubenswrapper[4719]: I1009 15:52:17.815286 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tvnzr" Oct 09 15:52:18 crc kubenswrapper[4719]: I1009 15:52:18.329258 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tvnzr"] Oct 09 15:52:18 crc kubenswrapper[4719]: I1009 15:52:18.542585 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tvnzr" event={"ID":"d1dc13ea-0601-4c3d-8105-22dd84ae3d6b","Type":"ContainerStarted","Data":"77fbe73612506a91a102fbefc095a567ef9ea6a8bfaa132c675ae5fc75b3387f"} Oct 09 15:52:18 crc kubenswrapper[4719]: I1009 15:52:18.542642 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tvnzr" event={"ID":"d1dc13ea-0601-4c3d-8105-22dd84ae3d6b","Type":"ContainerStarted","Data":"d445acdbf38b485afb93bb63aee87d75eb4331702d4ca625a0ba80666e6e4750"} Oct 09 15:52:18 crc kubenswrapper[4719]: I1009 15:52:18.546548 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-27nrv" 
event={"ID":"faa399bc-af9e-414a-9924-4d9a160f05e8","Type":"ContainerStarted","Data":"80ce0f012efb200768a254d20e2a39959bb2807a1b7a9bbf276b65509c8acd1e"} Oct 09 15:52:19 crc kubenswrapper[4719]: I1009 15:52:19.565082 4719 generic.go:334] "Generic (PLEG): container finished" podID="d1dc13ea-0601-4c3d-8105-22dd84ae3d6b" containerID="77fbe73612506a91a102fbefc095a567ef9ea6a8bfaa132c675ae5fc75b3387f" exitCode=0 Oct 09 15:52:19 crc kubenswrapper[4719]: I1009 15:52:19.565236 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tvnzr" event={"ID":"d1dc13ea-0601-4c3d-8105-22dd84ae3d6b","Type":"ContainerDied","Data":"77fbe73612506a91a102fbefc095a567ef9ea6a8bfaa132c675ae5fc75b3387f"} Oct 09 15:52:19 crc kubenswrapper[4719]: I1009 15:52:19.568902 4719 generic.go:334] "Generic (PLEG): container finished" podID="faa399bc-af9e-414a-9924-4d9a160f05e8" containerID="80ce0f012efb200768a254d20e2a39959bb2807a1b7a9bbf276b65509c8acd1e" exitCode=0 Oct 09 15:52:19 crc kubenswrapper[4719]: I1009 15:52:19.568940 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-27nrv" event={"ID":"faa399bc-af9e-414a-9924-4d9a160f05e8","Type":"ContainerDied","Data":"80ce0f012efb200768a254d20e2a39959bb2807a1b7a9bbf276b65509c8acd1e"} Oct 09 15:52:20 crc kubenswrapper[4719]: I1009 15:52:20.583054 4719 generic.go:334] "Generic (PLEG): container finished" podID="d1dc13ea-0601-4c3d-8105-22dd84ae3d6b" containerID="015f5024cbdb04a7018dff9f9e50e48bb8eddb3f6a0c70c0f751d11c9840d236" exitCode=0 Oct 09 15:52:20 crc kubenswrapper[4719]: I1009 15:52:20.583122 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tvnzr" event={"ID":"d1dc13ea-0601-4c3d-8105-22dd84ae3d6b","Type":"ContainerDied","Data":"015f5024cbdb04a7018dff9f9e50e48bb8eddb3f6a0c70c0f751d11c9840d236"} Oct 09 15:52:20 crc kubenswrapper[4719]: I1009 15:52:20.587174 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-27nrv" event={"ID":"faa399bc-af9e-414a-9924-4d9a160f05e8","Type":"ContainerStarted","Data":"13ad3dd6a1596d64f080cfe12a62fff5dd20c6ef6ce0a927bacbe43289ec6e86"} Oct 09 15:52:20 crc kubenswrapper[4719]: I1009 15:52:20.633198 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-27nrv" podStartSLOduration=2.142680294 podStartE2EDuration="4.633176268s" podCreationTimestamp="2025-10-09 15:52:16 +0000 UTC" firstStartedPulling="2025-10-09 15:52:17.535320289 +0000 UTC m=+2043.045031574" lastFinishedPulling="2025-10-09 15:52:20.025816253 +0000 UTC m=+2045.535527548" observedRunningTime="2025-10-09 15:52:20.624320966 +0000 UTC m=+2046.134032261" watchObservedRunningTime="2025-10-09 15:52:20.633176268 +0000 UTC m=+2046.142887553" Oct 09 15:52:21 crc kubenswrapper[4719]: I1009 15:52:21.603996 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tvnzr" event={"ID":"d1dc13ea-0601-4c3d-8105-22dd84ae3d6b","Type":"ContainerStarted","Data":"1ad4eaa82ca0f28b3e93679f0a4bb6430ac25c18dae88b3d49f7832b06f40209"} Oct 09 15:52:21 crc kubenswrapper[4719]: I1009 15:52:21.648502 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tvnzr" podStartSLOduration=2.20322221 podStartE2EDuration="4.648468093s" podCreationTimestamp="2025-10-09 15:52:17 +0000 UTC" firstStartedPulling="2025-10-09 15:52:18.544721255 +0000 UTC m=+2044.054432540" lastFinishedPulling="2025-10-09 15:52:20.989967138 +0000 UTC m=+2046.499678423" observedRunningTime="2025-10-09 15:52:21.641647305 +0000 UTC m=+2047.151358600" watchObservedRunningTime="2025-10-09 15:52:21.648468093 +0000 UTC m=+2047.158179428" Oct 09 15:52:26 crc kubenswrapper[4719]: I1009 15:52:26.400994 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-27nrv" Oct 09 
15:52:26 crc kubenswrapper[4719]: I1009 15:52:26.401561 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-27nrv" Oct 09 15:52:26 crc kubenswrapper[4719]: I1009 15:52:26.453752 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-27nrv" Oct 09 15:52:26 crc kubenswrapper[4719]: I1009 15:52:26.693662 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-27nrv" Oct 09 15:52:26 crc kubenswrapper[4719]: I1009 15:52:26.751113 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-27nrv"] Oct 09 15:52:27 crc kubenswrapper[4719]: I1009 15:52:27.817328 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tvnzr" Oct 09 15:52:27 crc kubenswrapper[4719]: I1009 15:52:27.817707 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tvnzr" Oct 09 15:52:27 crc kubenswrapper[4719]: I1009 15:52:27.865961 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tvnzr" Oct 09 15:52:28 crc kubenswrapper[4719]: I1009 15:52:28.667016 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-27nrv" podUID="faa399bc-af9e-414a-9924-4d9a160f05e8" containerName="registry-server" containerID="cri-o://13ad3dd6a1596d64f080cfe12a62fff5dd20c6ef6ce0a927bacbe43289ec6e86" gracePeriod=2 Oct 09 15:52:28 crc kubenswrapper[4719]: I1009 15:52:28.750954 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tvnzr" Oct 09 15:52:29 crc kubenswrapper[4719]: I1009 15:52:29.093000 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-tvnzr"] Oct 09 15:52:29 crc kubenswrapper[4719]: I1009 15:52:29.150029 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-27nrv" Oct 09 15:52:29 crc kubenswrapper[4719]: I1009 15:52:29.280662 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faa399bc-af9e-414a-9924-4d9a160f05e8-utilities\") pod \"faa399bc-af9e-414a-9924-4d9a160f05e8\" (UID: \"faa399bc-af9e-414a-9924-4d9a160f05e8\") " Oct 09 15:52:29 crc kubenswrapper[4719]: I1009 15:52:29.280739 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faa399bc-af9e-414a-9924-4d9a160f05e8-catalog-content\") pod \"faa399bc-af9e-414a-9924-4d9a160f05e8\" (UID: \"faa399bc-af9e-414a-9924-4d9a160f05e8\") " Oct 09 15:52:29 crc kubenswrapper[4719]: I1009 15:52:29.280984 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7vmn\" (UniqueName: \"kubernetes.io/projected/faa399bc-af9e-414a-9924-4d9a160f05e8-kube-api-access-s7vmn\") pod \"faa399bc-af9e-414a-9924-4d9a160f05e8\" (UID: \"faa399bc-af9e-414a-9924-4d9a160f05e8\") " Oct 09 15:52:29 crc kubenswrapper[4719]: I1009 15:52:29.282183 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/faa399bc-af9e-414a-9924-4d9a160f05e8-utilities" (OuterVolumeSpecName: "utilities") pod "faa399bc-af9e-414a-9924-4d9a160f05e8" (UID: "faa399bc-af9e-414a-9924-4d9a160f05e8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:52:29 crc kubenswrapper[4719]: I1009 15:52:29.293598 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faa399bc-af9e-414a-9924-4d9a160f05e8-kube-api-access-s7vmn" (OuterVolumeSpecName: "kube-api-access-s7vmn") pod "faa399bc-af9e-414a-9924-4d9a160f05e8" (UID: "faa399bc-af9e-414a-9924-4d9a160f05e8"). InnerVolumeSpecName "kube-api-access-s7vmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:52:29 crc kubenswrapper[4719]: I1009 15:52:29.334626 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/faa399bc-af9e-414a-9924-4d9a160f05e8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "faa399bc-af9e-414a-9924-4d9a160f05e8" (UID: "faa399bc-af9e-414a-9924-4d9a160f05e8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:52:29 crc kubenswrapper[4719]: I1009 15:52:29.383445 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7vmn\" (UniqueName: \"kubernetes.io/projected/faa399bc-af9e-414a-9924-4d9a160f05e8-kube-api-access-s7vmn\") on node \"crc\" DevicePath \"\"" Oct 09 15:52:29 crc kubenswrapper[4719]: I1009 15:52:29.383486 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faa399bc-af9e-414a-9924-4d9a160f05e8-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 15:52:29 crc kubenswrapper[4719]: I1009 15:52:29.383496 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faa399bc-af9e-414a-9924-4d9a160f05e8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 15:52:29 crc kubenswrapper[4719]: I1009 15:52:29.682759 4719 generic.go:334] "Generic (PLEG): container finished" podID="faa399bc-af9e-414a-9924-4d9a160f05e8" 
containerID="13ad3dd6a1596d64f080cfe12a62fff5dd20c6ef6ce0a927bacbe43289ec6e86" exitCode=0 Oct 09 15:52:29 crc kubenswrapper[4719]: I1009 15:52:29.682834 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-27nrv" Oct 09 15:52:29 crc kubenswrapper[4719]: I1009 15:52:29.682921 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-27nrv" event={"ID":"faa399bc-af9e-414a-9924-4d9a160f05e8","Type":"ContainerDied","Data":"13ad3dd6a1596d64f080cfe12a62fff5dd20c6ef6ce0a927bacbe43289ec6e86"} Oct 09 15:52:29 crc kubenswrapper[4719]: I1009 15:52:29.682986 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-27nrv" event={"ID":"faa399bc-af9e-414a-9924-4d9a160f05e8","Type":"ContainerDied","Data":"19148ab3f3602dc699fe25a66988292ca2650108e98ae1ce0de3f599437ce700"} Oct 09 15:52:29 crc kubenswrapper[4719]: I1009 15:52:29.683009 4719 scope.go:117] "RemoveContainer" containerID="13ad3dd6a1596d64f080cfe12a62fff5dd20c6ef6ce0a927bacbe43289ec6e86" Oct 09 15:52:29 crc kubenswrapper[4719]: I1009 15:52:29.748826 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-27nrv"] Oct 09 15:52:29 crc kubenswrapper[4719]: I1009 15:52:29.756591 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-27nrv"] Oct 09 15:52:29 crc kubenswrapper[4719]: I1009 15:52:29.767999 4719 scope.go:117] "RemoveContainer" containerID="80ce0f012efb200768a254d20e2a39959bb2807a1b7a9bbf276b65509c8acd1e" Oct 09 15:52:29 crc kubenswrapper[4719]: I1009 15:52:29.850635 4719 scope.go:117] "RemoveContainer" containerID="3f0264b4fb8c1e4e010c20b20c1fc33cc288c8b0c8bd0d171ed63f0e8b0889bb" Oct 09 15:52:29 crc kubenswrapper[4719]: I1009 15:52:29.898536 4719 scope.go:117] "RemoveContainer" containerID="13ad3dd6a1596d64f080cfe12a62fff5dd20c6ef6ce0a927bacbe43289ec6e86" Oct 09 
15:52:29 crc kubenswrapper[4719]: E1009 15:52:29.899024 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13ad3dd6a1596d64f080cfe12a62fff5dd20c6ef6ce0a927bacbe43289ec6e86\": container with ID starting with 13ad3dd6a1596d64f080cfe12a62fff5dd20c6ef6ce0a927bacbe43289ec6e86 not found: ID does not exist" containerID="13ad3dd6a1596d64f080cfe12a62fff5dd20c6ef6ce0a927bacbe43289ec6e86" Oct 09 15:52:29 crc kubenswrapper[4719]: I1009 15:52:29.899057 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13ad3dd6a1596d64f080cfe12a62fff5dd20c6ef6ce0a927bacbe43289ec6e86"} err="failed to get container status \"13ad3dd6a1596d64f080cfe12a62fff5dd20c6ef6ce0a927bacbe43289ec6e86\": rpc error: code = NotFound desc = could not find container \"13ad3dd6a1596d64f080cfe12a62fff5dd20c6ef6ce0a927bacbe43289ec6e86\": container with ID starting with 13ad3dd6a1596d64f080cfe12a62fff5dd20c6ef6ce0a927bacbe43289ec6e86 not found: ID does not exist" Oct 09 15:52:29 crc kubenswrapper[4719]: I1009 15:52:29.899077 4719 scope.go:117] "RemoveContainer" containerID="80ce0f012efb200768a254d20e2a39959bb2807a1b7a9bbf276b65509c8acd1e" Oct 09 15:52:29 crc kubenswrapper[4719]: E1009 15:52:29.899249 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80ce0f012efb200768a254d20e2a39959bb2807a1b7a9bbf276b65509c8acd1e\": container with ID starting with 80ce0f012efb200768a254d20e2a39959bb2807a1b7a9bbf276b65509c8acd1e not found: ID does not exist" containerID="80ce0f012efb200768a254d20e2a39959bb2807a1b7a9bbf276b65509c8acd1e" Oct 09 15:52:29 crc kubenswrapper[4719]: I1009 15:52:29.899265 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80ce0f012efb200768a254d20e2a39959bb2807a1b7a9bbf276b65509c8acd1e"} err="failed to get container status 
\"80ce0f012efb200768a254d20e2a39959bb2807a1b7a9bbf276b65509c8acd1e\": rpc error: code = NotFound desc = could not find container \"80ce0f012efb200768a254d20e2a39959bb2807a1b7a9bbf276b65509c8acd1e\": container with ID starting with 80ce0f012efb200768a254d20e2a39959bb2807a1b7a9bbf276b65509c8acd1e not found: ID does not exist" Oct 09 15:52:29 crc kubenswrapper[4719]: I1009 15:52:29.899277 4719 scope.go:117] "RemoveContainer" containerID="3f0264b4fb8c1e4e010c20b20c1fc33cc288c8b0c8bd0d171ed63f0e8b0889bb" Oct 09 15:52:29 crc kubenswrapper[4719]: E1009 15:52:29.903447 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f0264b4fb8c1e4e010c20b20c1fc33cc288c8b0c8bd0d171ed63f0e8b0889bb\": container with ID starting with 3f0264b4fb8c1e4e010c20b20c1fc33cc288c8b0c8bd0d171ed63f0e8b0889bb not found: ID does not exist" containerID="3f0264b4fb8c1e4e010c20b20c1fc33cc288c8b0c8bd0d171ed63f0e8b0889bb" Oct 09 15:52:29 crc kubenswrapper[4719]: I1009 15:52:29.903484 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f0264b4fb8c1e4e010c20b20c1fc33cc288c8b0c8bd0d171ed63f0e8b0889bb"} err="failed to get container status \"3f0264b4fb8c1e4e010c20b20c1fc33cc288c8b0c8bd0d171ed63f0e8b0889bb\": rpc error: code = NotFound desc = could not find container \"3f0264b4fb8c1e4e010c20b20c1fc33cc288c8b0c8bd0d171ed63f0e8b0889bb\": container with ID starting with 3f0264b4fb8c1e4e010c20b20c1fc33cc288c8b0c8bd0d171ed63f0e8b0889bb not found: ID does not exist" Oct 09 15:52:30 crc kubenswrapper[4719]: I1009 15:52:30.692558 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tvnzr" podUID="d1dc13ea-0601-4c3d-8105-22dd84ae3d6b" containerName="registry-server" containerID="cri-o://1ad4eaa82ca0f28b3e93679f0a4bb6430ac25c18dae88b3d49f7832b06f40209" gracePeriod=2 Oct 09 15:52:31 crc kubenswrapper[4719]: I1009 15:52:31.143418 
4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tvnzr" Oct 09 15:52:31 crc kubenswrapper[4719]: I1009 15:52:31.174162 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faa399bc-af9e-414a-9924-4d9a160f05e8" path="/var/lib/kubelet/pods/faa399bc-af9e-414a-9924-4d9a160f05e8/volumes" Oct 09 15:52:31 crc kubenswrapper[4719]: I1009 15:52:31.229995 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1dc13ea-0601-4c3d-8105-22dd84ae3d6b-catalog-content\") pod \"d1dc13ea-0601-4c3d-8105-22dd84ae3d6b\" (UID: \"d1dc13ea-0601-4c3d-8105-22dd84ae3d6b\") " Oct 09 15:52:31 crc kubenswrapper[4719]: I1009 15:52:31.230133 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1dc13ea-0601-4c3d-8105-22dd84ae3d6b-utilities\") pod \"d1dc13ea-0601-4c3d-8105-22dd84ae3d6b\" (UID: \"d1dc13ea-0601-4c3d-8105-22dd84ae3d6b\") " Oct 09 15:52:31 crc kubenswrapper[4719]: I1009 15:52:31.230179 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9cqg\" (UniqueName: \"kubernetes.io/projected/d1dc13ea-0601-4c3d-8105-22dd84ae3d6b-kube-api-access-x9cqg\") pod \"d1dc13ea-0601-4c3d-8105-22dd84ae3d6b\" (UID: \"d1dc13ea-0601-4c3d-8105-22dd84ae3d6b\") " Oct 09 15:52:31 crc kubenswrapper[4719]: I1009 15:52:31.231247 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1dc13ea-0601-4c3d-8105-22dd84ae3d6b-utilities" (OuterVolumeSpecName: "utilities") pod "d1dc13ea-0601-4c3d-8105-22dd84ae3d6b" (UID: "d1dc13ea-0601-4c3d-8105-22dd84ae3d6b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:52:31 crc kubenswrapper[4719]: I1009 15:52:31.231695 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1dc13ea-0601-4c3d-8105-22dd84ae3d6b-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 15:52:31 crc kubenswrapper[4719]: I1009 15:52:31.236539 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1dc13ea-0601-4c3d-8105-22dd84ae3d6b-kube-api-access-x9cqg" (OuterVolumeSpecName: "kube-api-access-x9cqg") pod "d1dc13ea-0601-4c3d-8105-22dd84ae3d6b" (UID: "d1dc13ea-0601-4c3d-8105-22dd84ae3d6b"). InnerVolumeSpecName "kube-api-access-x9cqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:52:31 crc kubenswrapper[4719]: I1009 15:52:31.248041 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1dc13ea-0601-4c3d-8105-22dd84ae3d6b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d1dc13ea-0601-4c3d-8105-22dd84ae3d6b" (UID: "d1dc13ea-0601-4c3d-8105-22dd84ae3d6b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:52:31 crc kubenswrapper[4719]: I1009 15:52:31.333117 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1dc13ea-0601-4c3d-8105-22dd84ae3d6b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 15:52:31 crc kubenswrapper[4719]: I1009 15:52:31.333151 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9cqg\" (UniqueName: \"kubernetes.io/projected/d1dc13ea-0601-4c3d-8105-22dd84ae3d6b-kube-api-access-x9cqg\") on node \"crc\" DevicePath \"\"" Oct 09 15:52:31 crc kubenswrapper[4719]: I1009 15:52:31.702605 4719 generic.go:334] "Generic (PLEG): container finished" podID="d1dc13ea-0601-4c3d-8105-22dd84ae3d6b" containerID="1ad4eaa82ca0f28b3e93679f0a4bb6430ac25c18dae88b3d49f7832b06f40209" exitCode=0 Oct 09 15:52:31 crc kubenswrapper[4719]: I1009 15:52:31.702655 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tvnzr" event={"ID":"d1dc13ea-0601-4c3d-8105-22dd84ae3d6b","Type":"ContainerDied","Data":"1ad4eaa82ca0f28b3e93679f0a4bb6430ac25c18dae88b3d49f7832b06f40209"} Oct 09 15:52:31 crc kubenswrapper[4719]: I1009 15:52:31.702661 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tvnzr" Oct 09 15:52:31 crc kubenswrapper[4719]: I1009 15:52:31.702688 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tvnzr" event={"ID":"d1dc13ea-0601-4c3d-8105-22dd84ae3d6b","Type":"ContainerDied","Data":"d445acdbf38b485afb93bb63aee87d75eb4331702d4ca625a0ba80666e6e4750"} Oct 09 15:52:31 crc kubenswrapper[4719]: I1009 15:52:31.702708 4719 scope.go:117] "RemoveContainer" containerID="1ad4eaa82ca0f28b3e93679f0a4bb6430ac25c18dae88b3d49f7832b06f40209" Oct 09 15:52:31 crc kubenswrapper[4719]: I1009 15:52:31.730778 4719 scope.go:117] "RemoveContainer" containerID="015f5024cbdb04a7018dff9f9e50e48bb8eddb3f6a0c70c0f751d11c9840d236" Oct 09 15:52:31 crc kubenswrapper[4719]: I1009 15:52:31.736315 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tvnzr"] Oct 09 15:52:31 crc kubenswrapper[4719]: I1009 15:52:31.746269 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tvnzr"] Oct 09 15:52:31 crc kubenswrapper[4719]: I1009 15:52:31.764742 4719 scope.go:117] "RemoveContainer" containerID="77fbe73612506a91a102fbefc095a567ef9ea6a8bfaa132c675ae5fc75b3387f" Oct 09 15:52:31 crc kubenswrapper[4719]: I1009 15:52:31.798279 4719 scope.go:117] "RemoveContainer" containerID="1ad4eaa82ca0f28b3e93679f0a4bb6430ac25c18dae88b3d49f7832b06f40209" Oct 09 15:52:31 crc kubenswrapper[4719]: E1009 15:52:31.798807 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ad4eaa82ca0f28b3e93679f0a4bb6430ac25c18dae88b3d49f7832b06f40209\": container with ID starting with 1ad4eaa82ca0f28b3e93679f0a4bb6430ac25c18dae88b3d49f7832b06f40209 not found: ID does not exist" containerID="1ad4eaa82ca0f28b3e93679f0a4bb6430ac25c18dae88b3d49f7832b06f40209" Oct 09 15:52:31 crc kubenswrapper[4719]: I1009 15:52:31.798854 4719 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ad4eaa82ca0f28b3e93679f0a4bb6430ac25c18dae88b3d49f7832b06f40209"} err="failed to get container status \"1ad4eaa82ca0f28b3e93679f0a4bb6430ac25c18dae88b3d49f7832b06f40209\": rpc error: code = NotFound desc = could not find container \"1ad4eaa82ca0f28b3e93679f0a4bb6430ac25c18dae88b3d49f7832b06f40209\": container with ID starting with 1ad4eaa82ca0f28b3e93679f0a4bb6430ac25c18dae88b3d49f7832b06f40209 not found: ID does not exist" Oct 09 15:52:31 crc kubenswrapper[4719]: I1009 15:52:31.798882 4719 scope.go:117] "RemoveContainer" containerID="015f5024cbdb04a7018dff9f9e50e48bb8eddb3f6a0c70c0f751d11c9840d236" Oct 09 15:52:31 crc kubenswrapper[4719]: E1009 15:52:31.799213 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"015f5024cbdb04a7018dff9f9e50e48bb8eddb3f6a0c70c0f751d11c9840d236\": container with ID starting with 015f5024cbdb04a7018dff9f9e50e48bb8eddb3f6a0c70c0f751d11c9840d236 not found: ID does not exist" containerID="015f5024cbdb04a7018dff9f9e50e48bb8eddb3f6a0c70c0f751d11c9840d236" Oct 09 15:52:31 crc kubenswrapper[4719]: I1009 15:52:31.799340 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"015f5024cbdb04a7018dff9f9e50e48bb8eddb3f6a0c70c0f751d11c9840d236"} err="failed to get container status \"015f5024cbdb04a7018dff9f9e50e48bb8eddb3f6a0c70c0f751d11c9840d236\": rpc error: code = NotFound desc = could not find container \"015f5024cbdb04a7018dff9f9e50e48bb8eddb3f6a0c70c0f751d11c9840d236\": container with ID starting with 015f5024cbdb04a7018dff9f9e50e48bb8eddb3f6a0c70c0f751d11c9840d236 not found: ID does not exist" Oct 09 15:52:31 crc kubenswrapper[4719]: I1009 15:52:31.799451 4719 scope.go:117] "RemoveContainer" containerID="77fbe73612506a91a102fbefc095a567ef9ea6a8bfaa132c675ae5fc75b3387f" Oct 09 15:52:31 crc kubenswrapper[4719]: E1009 
15:52:31.799744 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77fbe73612506a91a102fbefc095a567ef9ea6a8bfaa132c675ae5fc75b3387f\": container with ID starting with 77fbe73612506a91a102fbefc095a567ef9ea6a8bfaa132c675ae5fc75b3387f not found: ID does not exist" containerID="77fbe73612506a91a102fbefc095a567ef9ea6a8bfaa132c675ae5fc75b3387f" Oct 09 15:52:31 crc kubenswrapper[4719]: I1009 15:52:31.799768 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77fbe73612506a91a102fbefc095a567ef9ea6a8bfaa132c675ae5fc75b3387f"} err="failed to get container status \"77fbe73612506a91a102fbefc095a567ef9ea6a8bfaa132c675ae5fc75b3387f\": rpc error: code = NotFound desc = could not find container \"77fbe73612506a91a102fbefc095a567ef9ea6a8bfaa132c675ae5fc75b3387f\": container with ID starting with 77fbe73612506a91a102fbefc095a567ef9ea6a8bfaa132c675ae5fc75b3387f not found: ID does not exist" Oct 09 15:52:33 crc kubenswrapper[4719]: I1009 15:52:33.176651 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1dc13ea-0601-4c3d-8105-22dd84ae3d6b" path="/var/lib/kubelet/pods/d1dc13ea-0601-4c3d-8105-22dd84ae3d6b/volumes" Oct 09 15:52:33 crc kubenswrapper[4719]: I1009 15:52:33.724915 4719 generic.go:334] "Generic (PLEG): container finished" podID="976acf87-d11d-47a4-ad0d-2119fc70504c" containerID="0b947bddd75ad03ddca06eaf632efc5a1b6fa346c8c9c914dd691a3890664aeb" exitCode=0 Oct 09 15:52:33 crc kubenswrapper[4719]: I1009 15:52:33.724960 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" event={"ID":"976acf87-d11d-47a4-ad0d-2119fc70504c","Type":"ContainerDied","Data":"0b947bddd75ad03ddca06eaf632efc5a1b6fa346c8c9c914dd691a3890664aeb"} Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.153621 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.217946 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2znns\" (UniqueName: \"kubernetes.io/projected/976acf87-d11d-47a4-ad0d-2119fc70504c-kube-api-access-2znns\") pod \"976acf87-d11d-47a4-ad0d-2119fc70504c\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.218001 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/976acf87-d11d-47a4-ad0d-2119fc70504c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"976acf87-d11d-47a4-ad0d-2119fc70504c\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.218042 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-nova-combined-ca-bundle\") pod \"976acf87-d11d-47a4-ad0d-2119fc70504c\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.218244 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-ovn-combined-ca-bundle\") pod \"976acf87-d11d-47a4-ad0d-2119fc70504c\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.218321 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-libvirt-combined-ca-bundle\") pod \"976acf87-d11d-47a4-ad0d-2119fc70504c\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " Oct 09 
15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.218342 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/976acf87-d11d-47a4-ad0d-2119fc70504c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"976acf87-d11d-47a4-ad0d-2119fc70504c\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.218397 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/976acf87-d11d-47a4-ad0d-2119fc70504c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"976acf87-d11d-47a4-ad0d-2119fc70504c\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.218425 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/976acf87-d11d-47a4-ad0d-2119fc70504c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"976acf87-d11d-47a4-ad0d-2119fc70504c\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.218445 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-repo-setup-combined-ca-bundle\") pod \"976acf87-d11d-47a4-ad0d-2119fc70504c\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.218460 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-telemetry-combined-ca-bundle\") pod \"976acf87-d11d-47a4-ad0d-2119fc70504c\" (UID: 
\"976acf87-d11d-47a4-ad0d-2119fc70504c\") " Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.218481 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-ssh-key\") pod \"976acf87-d11d-47a4-ad0d-2119fc70504c\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.218517 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-bootstrap-combined-ca-bundle\") pod \"976acf87-d11d-47a4-ad0d-2119fc70504c\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.218585 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-neutron-metadata-combined-ca-bundle\") pod \"976acf87-d11d-47a4-ad0d-2119fc70504c\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.218604 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-inventory\") pod \"976acf87-d11d-47a4-ad0d-2119fc70504c\" (UID: \"976acf87-d11d-47a4-ad0d-2119fc70504c\") " Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.227708 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "976acf87-d11d-47a4-ad0d-2119fc70504c" (UID: "976acf87-d11d-47a4-ad0d-2119fc70504c"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.228084 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "976acf87-d11d-47a4-ad0d-2119fc70504c" (UID: "976acf87-d11d-47a4-ad0d-2119fc70504c"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.229615 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "976acf87-d11d-47a4-ad0d-2119fc70504c" (UID: "976acf87-d11d-47a4-ad0d-2119fc70504c"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.232148 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "976acf87-d11d-47a4-ad0d-2119fc70504c" (UID: "976acf87-d11d-47a4-ad0d-2119fc70504c"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.232268 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/976acf87-d11d-47a4-ad0d-2119fc70504c-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "976acf87-d11d-47a4-ad0d-2119fc70504c" (UID: "976acf87-d11d-47a4-ad0d-2119fc70504c"). 
InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.232394 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "976acf87-d11d-47a4-ad0d-2119fc70504c" (UID: "976acf87-d11d-47a4-ad0d-2119fc70504c"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.241939 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "976acf87-d11d-47a4-ad0d-2119fc70504c" (UID: "976acf87-d11d-47a4-ad0d-2119fc70504c"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.241953 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/976acf87-d11d-47a4-ad0d-2119fc70504c-kube-api-access-2znns" (OuterVolumeSpecName: "kube-api-access-2znns") pod "976acf87-d11d-47a4-ad0d-2119fc70504c" (UID: "976acf87-d11d-47a4-ad0d-2119fc70504c"). InnerVolumeSpecName "kube-api-access-2znns". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.242110 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/976acf87-d11d-47a4-ad0d-2119fc70504c-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "976acf87-d11d-47a4-ad0d-2119fc70504c" (UID: "976acf87-d11d-47a4-ad0d-2119fc70504c"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.242234 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "976acf87-d11d-47a4-ad0d-2119fc70504c" (UID: "976acf87-d11d-47a4-ad0d-2119fc70504c"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.243091 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/976acf87-d11d-47a4-ad0d-2119fc70504c-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "976acf87-d11d-47a4-ad0d-2119fc70504c" (UID: "976acf87-d11d-47a4-ad0d-2119fc70504c"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.244624 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/976acf87-d11d-47a4-ad0d-2119fc70504c-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "976acf87-d11d-47a4-ad0d-2119fc70504c" (UID: "976acf87-d11d-47a4-ad0d-2119fc70504c"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.254604 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "976acf87-d11d-47a4-ad0d-2119fc70504c" (UID: "976acf87-d11d-47a4-ad0d-2119fc70504c"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.271191 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-inventory" (OuterVolumeSpecName: "inventory") pod "976acf87-d11d-47a4-ad0d-2119fc70504c" (UID: "976acf87-d11d-47a4-ad0d-2119fc70504c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.322894 4719 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.322945 4719 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/976acf87-d11d-47a4-ad0d-2119fc70504c-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.322956 4719 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.322970 4719 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/976acf87-d11d-47a4-ad0d-2119fc70504c-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.322999 4719 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/976acf87-d11d-47a4-ad0d-2119fc70504c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.323014 4719 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.323023 4719 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.323033 4719 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.323044 4719 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.323072 4719 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.323083 4719 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.323093 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2znns\" 
(UniqueName: \"kubernetes.io/projected/976acf87-d11d-47a4-ad0d-2119fc70504c-kube-api-access-2znns\") on node \"crc\" DevicePath \"\"" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.323103 4719 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/976acf87-d11d-47a4-ad0d-2119fc70504c-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.323112 4719 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976acf87-d11d-47a4-ad0d-2119fc70504c-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.745864 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" event={"ID":"976acf87-d11d-47a4-ad0d-2119fc70504c","Type":"ContainerDied","Data":"5315a06c2a5d26ecab034483c17f4eb72d307d6ad30bd6fea79b09a91021bdb8"} Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.745925 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5315a06c2a5d26ecab034483c17f4eb72d307d6ad30bd6fea79b09a91021bdb8" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.746019 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.839907 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-7f2xg"] Oct 09 15:52:35 crc kubenswrapper[4719]: E1009 15:52:35.840267 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faa399bc-af9e-414a-9924-4d9a160f05e8" containerName="extract-utilities" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.840283 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="faa399bc-af9e-414a-9924-4d9a160f05e8" containerName="extract-utilities" Oct 09 15:52:35 crc kubenswrapper[4719]: E1009 15:52:35.840309 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1dc13ea-0601-4c3d-8105-22dd84ae3d6b" containerName="extract-utilities" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.840315 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1dc13ea-0601-4c3d-8105-22dd84ae3d6b" containerName="extract-utilities" Oct 09 15:52:35 crc kubenswrapper[4719]: E1009 15:52:35.840326 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1dc13ea-0601-4c3d-8105-22dd84ae3d6b" containerName="registry-server" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.840332 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1dc13ea-0601-4c3d-8105-22dd84ae3d6b" containerName="registry-server" Oct 09 15:52:35 crc kubenswrapper[4719]: E1009 15:52:35.840396 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faa399bc-af9e-414a-9924-4d9a160f05e8" containerName="extract-content" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.840403 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="faa399bc-af9e-414a-9924-4d9a160f05e8" containerName="extract-content" Oct 09 15:52:35 crc kubenswrapper[4719]: E1009 15:52:35.840414 4719 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="faa399bc-af9e-414a-9924-4d9a160f05e8" containerName="registry-server" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.840420 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="faa399bc-af9e-414a-9924-4d9a160f05e8" containerName="registry-server" Oct 09 15:52:35 crc kubenswrapper[4719]: E1009 15:52:35.840434 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1dc13ea-0601-4c3d-8105-22dd84ae3d6b" containerName="extract-content" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.840441 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1dc13ea-0601-4c3d-8105-22dd84ae3d6b" containerName="extract-content" Oct 09 15:52:35 crc kubenswrapper[4719]: E1009 15:52:35.840455 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="976acf87-d11d-47a4-ad0d-2119fc70504c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.840463 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="976acf87-d11d-47a4-ad0d-2119fc70504c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.840631 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1dc13ea-0601-4c3d-8105-22dd84ae3d6b" containerName="registry-server" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.840647 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="976acf87-d11d-47a4-ad0d-2119fc70504c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.840657 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="faa399bc-af9e-414a-9924-4d9a160f05e8" containerName="registry-server" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.841385 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7f2xg" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.843202 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.843697 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.844273 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.844314 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ssvsw" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.844854 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.855858 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-7f2xg"] Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.934441 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a768f51e-2990-40f5-84df-13c410d05385-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7f2xg\" (UID: \"a768f51e-2990-40f5-84df-13c410d05385\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7f2xg" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.934946 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mppl\" (UniqueName: \"kubernetes.io/projected/a768f51e-2990-40f5-84df-13c410d05385-kube-api-access-8mppl\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7f2xg\" (UID: \"a768f51e-2990-40f5-84df-13c410d05385\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7f2xg" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.934988 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a768f51e-2990-40f5-84df-13c410d05385-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7f2xg\" (UID: \"a768f51e-2990-40f5-84df-13c410d05385\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7f2xg" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.935072 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a768f51e-2990-40f5-84df-13c410d05385-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7f2xg\" (UID: \"a768f51e-2990-40f5-84df-13c410d05385\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7f2xg" Oct 09 15:52:35 crc kubenswrapper[4719]: I1009 15:52:35.935164 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a768f51e-2990-40f5-84df-13c410d05385-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7f2xg\" (UID: \"a768f51e-2990-40f5-84df-13c410d05385\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7f2xg" Oct 09 15:52:36 crc kubenswrapper[4719]: I1009 15:52:36.037373 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a768f51e-2990-40f5-84df-13c410d05385-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7f2xg\" (UID: \"a768f51e-2990-40f5-84df-13c410d05385\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7f2xg" Oct 09 15:52:36 crc kubenswrapper[4719]: I1009 15:52:36.037865 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a768f51e-2990-40f5-84df-13c410d05385-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7f2xg\" (UID: \"a768f51e-2990-40f5-84df-13c410d05385\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7f2xg" Oct 09 15:52:36 crc kubenswrapper[4719]: I1009 15:52:36.038030 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a768f51e-2990-40f5-84df-13c410d05385-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7f2xg\" (UID: \"a768f51e-2990-40f5-84df-13c410d05385\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7f2xg" Oct 09 15:52:36 crc kubenswrapper[4719]: I1009 15:52:36.038167 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mppl\" (UniqueName: \"kubernetes.io/projected/a768f51e-2990-40f5-84df-13c410d05385-kube-api-access-8mppl\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7f2xg\" (UID: \"a768f51e-2990-40f5-84df-13c410d05385\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7f2xg" Oct 09 15:52:36 crc kubenswrapper[4719]: I1009 15:52:36.038269 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a768f51e-2990-40f5-84df-13c410d05385-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7f2xg\" (UID: \"a768f51e-2990-40f5-84df-13c410d05385\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7f2xg" Oct 09 15:52:36 crc kubenswrapper[4719]: I1009 15:52:36.039082 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a768f51e-2990-40f5-84df-13c410d05385-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7f2xg\" (UID: \"a768f51e-2990-40f5-84df-13c410d05385\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7f2xg" Oct 09 15:52:36 crc 
kubenswrapper[4719]: I1009 15:52:36.041806 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a768f51e-2990-40f5-84df-13c410d05385-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7f2xg\" (UID: \"a768f51e-2990-40f5-84df-13c410d05385\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7f2xg" Oct 09 15:52:36 crc kubenswrapper[4719]: I1009 15:52:36.042115 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a768f51e-2990-40f5-84df-13c410d05385-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7f2xg\" (UID: \"a768f51e-2990-40f5-84df-13c410d05385\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7f2xg" Oct 09 15:52:36 crc kubenswrapper[4719]: I1009 15:52:36.042238 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a768f51e-2990-40f5-84df-13c410d05385-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7f2xg\" (UID: \"a768f51e-2990-40f5-84df-13c410d05385\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7f2xg" Oct 09 15:52:36 crc kubenswrapper[4719]: I1009 15:52:36.059244 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mppl\" (UniqueName: \"kubernetes.io/projected/a768f51e-2990-40f5-84df-13c410d05385-kube-api-access-8mppl\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7f2xg\" (UID: \"a768f51e-2990-40f5-84df-13c410d05385\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7f2xg" Oct 09 15:52:36 crc kubenswrapper[4719]: I1009 15:52:36.176821 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7f2xg" Oct 09 15:52:36 crc kubenswrapper[4719]: I1009 15:52:36.678033 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-7f2xg"] Oct 09 15:52:36 crc kubenswrapper[4719]: I1009 15:52:36.755818 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7f2xg" event={"ID":"a768f51e-2990-40f5-84df-13c410d05385","Type":"ContainerStarted","Data":"398d081c9232323ad849ef12d259dc408f90f28c409f3d093aa0c5048ad88db8"} Oct 09 15:52:37 crc kubenswrapper[4719]: I1009 15:52:37.767055 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7f2xg" event={"ID":"a768f51e-2990-40f5-84df-13c410d05385","Type":"ContainerStarted","Data":"1f4db0342f3f7ec5720807d5d07e6e4bf4fc986e82e49ca98dddc631a93baa6e"} Oct 09 15:52:37 crc kubenswrapper[4719]: I1009 15:52:37.792840 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7f2xg" podStartSLOduration=2.3600948649999998 podStartE2EDuration="2.792811875s" podCreationTimestamp="2025-10-09 15:52:35 +0000 UTC" firstStartedPulling="2025-10-09 15:52:36.681049246 +0000 UTC m=+2062.190760531" lastFinishedPulling="2025-10-09 15:52:37.113766246 +0000 UTC m=+2062.623477541" observedRunningTime="2025-10-09 15:52:37.782754125 +0000 UTC m=+2063.292465430" watchObservedRunningTime="2025-10-09 15:52:37.792811875 +0000 UTC m=+2063.302523190" Oct 09 15:53:19 crc kubenswrapper[4719]: I1009 15:53:19.538945 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ql4l8"] Oct 09 15:53:19 crc kubenswrapper[4719]: I1009 15:53:19.542261 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ql4l8" Oct 09 15:53:19 crc kubenswrapper[4719]: I1009 15:53:19.559461 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ql4l8"] Oct 09 15:53:19 crc kubenswrapper[4719]: I1009 15:53:19.663135 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvpnc\" (UniqueName: \"kubernetes.io/projected/fa29c574-7174-40b2-92d2-fe2ddbc04be0-kube-api-access-xvpnc\") pod \"redhat-operators-ql4l8\" (UID: \"fa29c574-7174-40b2-92d2-fe2ddbc04be0\") " pod="openshift-marketplace/redhat-operators-ql4l8" Oct 09 15:53:19 crc kubenswrapper[4719]: I1009 15:53:19.663488 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa29c574-7174-40b2-92d2-fe2ddbc04be0-catalog-content\") pod \"redhat-operators-ql4l8\" (UID: \"fa29c574-7174-40b2-92d2-fe2ddbc04be0\") " pod="openshift-marketplace/redhat-operators-ql4l8" Oct 09 15:53:19 crc kubenswrapper[4719]: I1009 15:53:19.663733 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa29c574-7174-40b2-92d2-fe2ddbc04be0-utilities\") pod \"redhat-operators-ql4l8\" (UID: \"fa29c574-7174-40b2-92d2-fe2ddbc04be0\") " pod="openshift-marketplace/redhat-operators-ql4l8" Oct 09 15:53:19 crc kubenswrapper[4719]: I1009 15:53:19.765735 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvpnc\" (UniqueName: \"kubernetes.io/projected/fa29c574-7174-40b2-92d2-fe2ddbc04be0-kube-api-access-xvpnc\") pod \"redhat-operators-ql4l8\" (UID: \"fa29c574-7174-40b2-92d2-fe2ddbc04be0\") " pod="openshift-marketplace/redhat-operators-ql4l8" Oct 09 15:53:19 crc kubenswrapper[4719]: I1009 15:53:19.765802 4719 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa29c574-7174-40b2-92d2-fe2ddbc04be0-catalog-content\") pod \"redhat-operators-ql4l8\" (UID: \"fa29c574-7174-40b2-92d2-fe2ddbc04be0\") " pod="openshift-marketplace/redhat-operators-ql4l8" Oct 09 15:53:19 crc kubenswrapper[4719]: I1009 15:53:19.765832 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa29c574-7174-40b2-92d2-fe2ddbc04be0-utilities\") pod \"redhat-operators-ql4l8\" (UID: \"fa29c574-7174-40b2-92d2-fe2ddbc04be0\") " pod="openshift-marketplace/redhat-operators-ql4l8" Oct 09 15:53:19 crc kubenswrapper[4719]: I1009 15:53:19.766436 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa29c574-7174-40b2-92d2-fe2ddbc04be0-utilities\") pod \"redhat-operators-ql4l8\" (UID: \"fa29c574-7174-40b2-92d2-fe2ddbc04be0\") " pod="openshift-marketplace/redhat-operators-ql4l8" Oct 09 15:53:19 crc kubenswrapper[4719]: I1009 15:53:19.766540 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa29c574-7174-40b2-92d2-fe2ddbc04be0-catalog-content\") pod \"redhat-operators-ql4l8\" (UID: \"fa29c574-7174-40b2-92d2-fe2ddbc04be0\") " pod="openshift-marketplace/redhat-operators-ql4l8" Oct 09 15:53:19 crc kubenswrapper[4719]: I1009 15:53:19.793374 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvpnc\" (UniqueName: \"kubernetes.io/projected/fa29c574-7174-40b2-92d2-fe2ddbc04be0-kube-api-access-xvpnc\") pod \"redhat-operators-ql4l8\" (UID: \"fa29c574-7174-40b2-92d2-fe2ddbc04be0\") " pod="openshift-marketplace/redhat-operators-ql4l8" Oct 09 15:53:19 crc kubenswrapper[4719]: I1009 15:53:19.874502 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ql4l8" Oct 09 15:53:20 crc kubenswrapper[4719]: I1009 15:53:20.312800 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ql4l8"] Oct 09 15:53:21 crc kubenswrapper[4719]: I1009 15:53:21.250373 4719 generic.go:334] "Generic (PLEG): container finished" podID="fa29c574-7174-40b2-92d2-fe2ddbc04be0" containerID="d79bab7de5f49c8f4439db9acfc8ce24d9083e4af987e098e5e4cfd7516b8fe5" exitCode=0 Oct 09 15:53:21 crc kubenswrapper[4719]: I1009 15:53:21.250418 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ql4l8" event={"ID":"fa29c574-7174-40b2-92d2-fe2ddbc04be0","Type":"ContainerDied","Data":"d79bab7de5f49c8f4439db9acfc8ce24d9083e4af987e098e5e4cfd7516b8fe5"} Oct 09 15:53:21 crc kubenswrapper[4719]: I1009 15:53:21.250443 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ql4l8" event={"ID":"fa29c574-7174-40b2-92d2-fe2ddbc04be0","Type":"ContainerStarted","Data":"dc33577165987db797e964673a288cb5985192f3b78be18e8b32b550657f7a2d"} Oct 09 15:53:22 crc kubenswrapper[4719]: I1009 15:53:22.259928 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ql4l8" event={"ID":"fa29c574-7174-40b2-92d2-fe2ddbc04be0","Type":"ContainerStarted","Data":"712ea37ef12fbff0036956c91f77f47f99dcee4b71ce91610a70856fdaa9991a"} Oct 09 15:53:23 crc kubenswrapper[4719]: I1009 15:53:23.269646 4719 generic.go:334] "Generic (PLEG): container finished" podID="fa29c574-7174-40b2-92d2-fe2ddbc04be0" containerID="712ea37ef12fbff0036956c91f77f47f99dcee4b71ce91610a70856fdaa9991a" exitCode=0 Oct 09 15:53:23 crc kubenswrapper[4719]: I1009 15:53:23.269697 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ql4l8" 
event={"ID":"fa29c574-7174-40b2-92d2-fe2ddbc04be0","Type":"ContainerDied","Data":"712ea37ef12fbff0036956c91f77f47f99dcee4b71ce91610a70856fdaa9991a"} Oct 09 15:53:24 crc kubenswrapper[4719]: I1009 15:53:24.280533 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ql4l8" event={"ID":"fa29c574-7174-40b2-92d2-fe2ddbc04be0","Type":"ContainerStarted","Data":"f92e142662b00a4fefb9cd4e02917b2cf7238a7cb185b2919e859f20cae525f2"} Oct 09 15:53:24 crc kubenswrapper[4719]: I1009 15:53:24.298801 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ql4l8" podStartSLOduration=2.832828171 podStartE2EDuration="5.298780913s" podCreationTimestamp="2025-10-09 15:53:19 +0000 UTC" firstStartedPulling="2025-10-09 15:53:21.252770286 +0000 UTC m=+2106.762481571" lastFinishedPulling="2025-10-09 15:53:23.718723028 +0000 UTC m=+2109.228434313" observedRunningTime="2025-10-09 15:53:24.296756799 +0000 UTC m=+2109.806468104" watchObservedRunningTime="2025-10-09 15:53:24.298780913 +0000 UTC m=+2109.808492208" Oct 09 15:53:29 crc kubenswrapper[4719]: I1009 15:53:29.875051 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ql4l8" Oct 09 15:53:29 crc kubenswrapper[4719]: I1009 15:53:29.875970 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ql4l8" Oct 09 15:53:29 crc kubenswrapper[4719]: I1009 15:53:29.923506 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ql4l8" Oct 09 15:53:30 crc kubenswrapper[4719]: I1009 15:53:30.431768 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ql4l8" Oct 09 15:53:30 crc kubenswrapper[4719]: I1009 15:53:30.511365 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-ql4l8"] Oct 09 15:53:32 crc kubenswrapper[4719]: I1009 15:53:32.362498 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ql4l8" podUID="fa29c574-7174-40b2-92d2-fe2ddbc04be0" containerName="registry-server" containerID="cri-o://f92e142662b00a4fefb9cd4e02917b2cf7238a7cb185b2919e859f20cae525f2" gracePeriod=2 Oct 09 15:53:32 crc kubenswrapper[4719]: I1009 15:53:32.826985 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ql4l8" Oct 09 15:53:32 crc kubenswrapper[4719]: I1009 15:53:32.928824 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa29c574-7174-40b2-92d2-fe2ddbc04be0-catalog-content\") pod \"fa29c574-7174-40b2-92d2-fe2ddbc04be0\" (UID: \"fa29c574-7174-40b2-92d2-fe2ddbc04be0\") " Oct 09 15:53:32 crc kubenswrapper[4719]: I1009 15:53:32.929042 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvpnc\" (UniqueName: \"kubernetes.io/projected/fa29c574-7174-40b2-92d2-fe2ddbc04be0-kube-api-access-xvpnc\") pod \"fa29c574-7174-40b2-92d2-fe2ddbc04be0\" (UID: \"fa29c574-7174-40b2-92d2-fe2ddbc04be0\") " Oct 09 15:53:32 crc kubenswrapper[4719]: I1009 15:53:32.929107 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa29c574-7174-40b2-92d2-fe2ddbc04be0-utilities\") pod \"fa29c574-7174-40b2-92d2-fe2ddbc04be0\" (UID: \"fa29c574-7174-40b2-92d2-fe2ddbc04be0\") " Oct 09 15:53:32 crc kubenswrapper[4719]: I1009 15:53:32.930217 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa29c574-7174-40b2-92d2-fe2ddbc04be0-utilities" (OuterVolumeSpecName: "utilities") pod "fa29c574-7174-40b2-92d2-fe2ddbc04be0" (UID: 
"fa29c574-7174-40b2-92d2-fe2ddbc04be0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:53:32 crc kubenswrapper[4719]: I1009 15:53:32.938642 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa29c574-7174-40b2-92d2-fe2ddbc04be0-kube-api-access-xvpnc" (OuterVolumeSpecName: "kube-api-access-xvpnc") pod "fa29c574-7174-40b2-92d2-fe2ddbc04be0" (UID: "fa29c574-7174-40b2-92d2-fe2ddbc04be0"). InnerVolumeSpecName "kube-api-access-xvpnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:53:33 crc kubenswrapper[4719]: I1009 15:53:33.030783 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvpnc\" (UniqueName: \"kubernetes.io/projected/fa29c574-7174-40b2-92d2-fe2ddbc04be0-kube-api-access-xvpnc\") on node \"crc\" DevicePath \"\"" Oct 09 15:53:33 crc kubenswrapper[4719]: I1009 15:53:33.030827 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa29c574-7174-40b2-92d2-fe2ddbc04be0-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 15:53:33 crc kubenswrapper[4719]: I1009 15:53:33.375701 4719 generic.go:334] "Generic (PLEG): container finished" podID="fa29c574-7174-40b2-92d2-fe2ddbc04be0" containerID="f92e142662b00a4fefb9cd4e02917b2cf7238a7cb185b2919e859f20cae525f2" exitCode=0 Oct 09 15:53:33 crc kubenswrapper[4719]: I1009 15:53:33.375749 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ql4l8" event={"ID":"fa29c574-7174-40b2-92d2-fe2ddbc04be0","Type":"ContainerDied","Data":"f92e142662b00a4fefb9cd4e02917b2cf7238a7cb185b2919e859f20cae525f2"} Oct 09 15:53:33 crc kubenswrapper[4719]: I1009 15:53:33.375778 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ql4l8" 
event={"ID":"fa29c574-7174-40b2-92d2-fe2ddbc04be0","Type":"ContainerDied","Data":"dc33577165987db797e964673a288cb5985192f3b78be18e8b32b550657f7a2d"} Oct 09 15:53:33 crc kubenswrapper[4719]: I1009 15:53:33.375797 4719 scope.go:117] "RemoveContainer" containerID="f92e142662b00a4fefb9cd4e02917b2cf7238a7cb185b2919e859f20cae525f2" Oct 09 15:53:33 crc kubenswrapper[4719]: I1009 15:53:33.375925 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ql4l8" Oct 09 15:53:33 crc kubenswrapper[4719]: I1009 15:53:33.396210 4719 scope.go:117] "RemoveContainer" containerID="712ea37ef12fbff0036956c91f77f47f99dcee4b71ce91610a70856fdaa9991a" Oct 09 15:53:33 crc kubenswrapper[4719]: I1009 15:53:33.415194 4719 scope.go:117] "RemoveContainer" containerID="d79bab7de5f49c8f4439db9acfc8ce24d9083e4af987e098e5e4cfd7516b8fe5" Oct 09 15:53:33 crc kubenswrapper[4719]: I1009 15:53:33.470070 4719 scope.go:117] "RemoveContainer" containerID="f92e142662b00a4fefb9cd4e02917b2cf7238a7cb185b2919e859f20cae525f2" Oct 09 15:53:33 crc kubenswrapper[4719]: E1009 15:53:33.473873 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f92e142662b00a4fefb9cd4e02917b2cf7238a7cb185b2919e859f20cae525f2\": container with ID starting with f92e142662b00a4fefb9cd4e02917b2cf7238a7cb185b2919e859f20cae525f2 not found: ID does not exist" containerID="f92e142662b00a4fefb9cd4e02917b2cf7238a7cb185b2919e859f20cae525f2" Oct 09 15:53:33 crc kubenswrapper[4719]: I1009 15:53:33.473918 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f92e142662b00a4fefb9cd4e02917b2cf7238a7cb185b2919e859f20cae525f2"} err="failed to get container status \"f92e142662b00a4fefb9cd4e02917b2cf7238a7cb185b2919e859f20cae525f2\": rpc error: code = NotFound desc = could not find container \"f92e142662b00a4fefb9cd4e02917b2cf7238a7cb185b2919e859f20cae525f2\": 
container with ID starting with f92e142662b00a4fefb9cd4e02917b2cf7238a7cb185b2919e859f20cae525f2 not found: ID does not exist" Oct 09 15:53:33 crc kubenswrapper[4719]: I1009 15:53:33.473945 4719 scope.go:117] "RemoveContainer" containerID="712ea37ef12fbff0036956c91f77f47f99dcee4b71ce91610a70856fdaa9991a" Oct 09 15:53:33 crc kubenswrapper[4719]: E1009 15:53:33.474499 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"712ea37ef12fbff0036956c91f77f47f99dcee4b71ce91610a70856fdaa9991a\": container with ID starting with 712ea37ef12fbff0036956c91f77f47f99dcee4b71ce91610a70856fdaa9991a not found: ID does not exist" containerID="712ea37ef12fbff0036956c91f77f47f99dcee4b71ce91610a70856fdaa9991a" Oct 09 15:53:33 crc kubenswrapper[4719]: I1009 15:53:33.474567 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"712ea37ef12fbff0036956c91f77f47f99dcee4b71ce91610a70856fdaa9991a"} err="failed to get container status \"712ea37ef12fbff0036956c91f77f47f99dcee4b71ce91610a70856fdaa9991a\": rpc error: code = NotFound desc = could not find container \"712ea37ef12fbff0036956c91f77f47f99dcee4b71ce91610a70856fdaa9991a\": container with ID starting with 712ea37ef12fbff0036956c91f77f47f99dcee4b71ce91610a70856fdaa9991a not found: ID does not exist" Oct 09 15:53:33 crc kubenswrapper[4719]: I1009 15:53:33.474593 4719 scope.go:117] "RemoveContainer" containerID="d79bab7de5f49c8f4439db9acfc8ce24d9083e4af987e098e5e4cfd7516b8fe5" Oct 09 15:53:33 crc kubenswrapper[4719]: E1009 15:53:33.479912 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d79bab7de5f49c8f4439db9acfc8ce24d9083e4af987e098e5e4cfd7516b8fe5\": container with ID starting with d79bab7de5f49c8f4439db9acfc8ce24d9083e4af987e098e5e4cfd7516b8fe5 not found: ID does not exist" 
containerID="d79bab7de5f49c8f4439db9acfc8ce24d9083e4af987e098e5e4cfd7516b8fe5" Oct 09 15:53:33 crc kubenswrapper[4719]: I1009 15:53:33.479955 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d79bab7de5f49c8f4439db9acfc8ce24d9083e4af987e098e5e4cfd7516b8fe5"} err="failed to get container status \"d79bab7de5f49c8f4439db9acfc8ce24d9083e4af987e098e5e4cfd7516b8fe5\": rpc error: code = NotFound desc = could not find container \"d79bab7de5f49c8f4439db9acfc8ce24d9083e4af987e098e5e4cfd7516b8fe5\": container with ID starting with d79bab7de5f49c8f4439db9acfc8ce24d9083e4af987e098e5e4cfd7516b8fe5 not found: ID does not exist" Oct 09 15:53:34 crc kubenswrapper[4719]: I1009 15:53:34.201769 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa29c574-7174-40b2-92d2-fe2ddbc04be0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa29c574-7174-40b2-92d2-fe2ddbc04be0" (UID: "fa29c574-7174-40b2-92d2-fe2ddbc04be0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 15:53:34 crc kubenswrapper[4719]: I1009 15:53:34.256026 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa29c574-7174-40b2-92d2-fe2ddbc04be0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 15:53:34 crc kubenswrapper[4719]: I1009 15:53:34.309310 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ql4l8"] Oct 09 15:53:34 crc kubenswrapper[4719]: I1009 15:53:34.317648 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ql4l8"] Oct 09 15:53:35 crc kubenswrapper[4719]: I1009 15:53:35.172793 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa29c574-7174-40b2-92d2-fe2ddbc04be0" path="/var/lib/kubelet/pods/fa29c574-7174-40b2-92d2-fe2ddbc04be0/volumes" Oct 09 15:53:36 crc kubenswrapper[4719]: I1009 15:53:36.976651 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 15:53:36 crc kubenswrapper[4719]: I1009 15:53:36.977262 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 15:53:41 crc kubenswrapper[4719]: I1009 15:53:41.443015 4719 generic.go:334] "Generic (PLEG): container finished" podID="a768f51e-2990-40f5-84df-13c410d05385" containerID="1f4db0342f3f7ec5720807d5d07e6e4bf4fc986e82e49ca98dddc631a93baa6e" exitCode=0 Oct 09 15:53:41 crc kubenswrapper[4719]: I1009 15:53:41.443104 4719 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7f2xg" event={"ID":"a768f51e-2990-40f5-84df-13c410d05385","Type":"ContainerDied","Data":"1f4db0342f3f7ec5720807d5d07e6e4bf4fc986e82e49ca98dddc631a93baa6e"} Oct 09 15:53:42 crc kubenswrapper[4719]: I1009 15:53:42.871477 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7f2xg" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.022713 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mppl\" (UniqueName: \"kubernetes.io/projected/a768f51e-2990-40f5-84df-13c410d05385-kube-api-access-8mppl\") pod \"a768f51e-2990-40f5-84df-13c410d05385\" (UID: \"a768f51e-2990-40f5-84df-13c410d05385\") " Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.022869 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a768f51e-2990-40f5-84df-13c410d05385-inventory\") pod \"a768f51e-2990-40f5-84df-13c410d05385\" (UID: \"a768f51e-2990-40f5-84df-13c410d05385\") " Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.022945 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a768f51e-2990-40f5-84df-13c410d05385-ovn-combined-ca-bundle\") pod \"a768f51e-2990-40f5-84df-13c410d05385\" (UID: \"a768f51e-2990-40f5-84df-13c410d05385\") " Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.022990 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a768f51e-2990-40f5-84df-13c410d05385-ssh-key\") pod \"a768f51e-2990-40f5-84df-13c410d05385\" (UID: \"a768f51e-2990-40f5-84df-13c410d05385\") " Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.023036 4719 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a768f51e-2990-40f5-84df-13c410d05385-ovncontroller-config-0\") pod \"a768f51e-2990-40f5-84df-13c410d05385\" (UID: \"a768f51e-2990-40f5-84df-13c410d05385\") " Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.029373 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a768f51e-2990-40f5-84df-13c410d05385-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "a768f51e-2990-40f5-84df-13c410d05385" (UID: "a768f51e-2990-40f5-84df-13c410d05385"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.030006 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a768f51e-2990-40f5-84df-13c410d05385-kube-api-access-8mppl" (OuterVolumeSpecName: "kube-api-access-8mppl") pod "a768f51e-2990-40f5-84df-13c410d05385" (UID: "a768f51e-2990-40f5-84df-13c410d05385"). InnerVolumeSpecName "kube-api-access-8mppl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.051586 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a768f51e-2990-40f5-84df-13c410d05385-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "a768f51e-2990-40f5-84df-13c410d05385" (UID: "a768f51e-2990-40f5-84df-13c410d05385"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.058567 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a768f51e-2990-40f5-84df-13c410d05385-inventory" (OuterVolumeSpecName: "inventory") pod "a768f51e-2990-40f5-84df-13c410d05385" (UID: "a768f51e-2990-40f5-84df-13c410d05385"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.059804 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a768f51e-2990-40f5-84df-13c410d05385-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a768f51e-2990-40f5-84df-13c410d05385" (UID: "a768f51e-2990-40f5-84df-13c410d05385"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.125995 4719 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a768f51e-2990-40f5-84df-13c410d05385-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.126024 4719 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a768f51e-2990-40f5-84df-13c410d05385-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.126034 4719 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a768f51e-2990-40f5-84df-13c410d05385-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.126042 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mppl\" (UniqueName: \"kubernetes.io/projected/a768f51e-2990-40f5-84df-13c410d05385-kube-api-access-8mppl\") on node \"crc\" DevicePath \"\"" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.126050 4719 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a768f51e-2990-40f5-84df-13c410d05385-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.466031 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7f2xg" event={"ID":"a768f51e-2990-40f5-84df-13c410d05385","Type":"ContainerDied","Data":"398d081c9232323ad849ef12d259dc408f90f28c409f3d093aa0c5048ad88db8"} Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.466075 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="398d081c9232323ad849ef12d259dc408f90f28c409f3d093aa0c5048ad88db8" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.466122 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7f2xg" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.638452 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg"] Oct 09 15:53:43 crc kubenswrapper[4719]: E1009 15:53:43.638882 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a768f51e-2990-40f5-84df-13c410d05385" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.638896 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="a768f51e-2990-40f5-84df-13c410d05385" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 09 15:53:43 crc kubenswrapper[4719]: E1009 15:53:43.638912 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa29c574-7174-40b2-92d2-fe2ddbc04be0" containerName="extract-utilities" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.638918 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa29c574-7174-40b2-92d2-fe2ddbc04be0" containerName="extract-utilities" Oct 09 15:53:43 crc kubenswrapper[4719]: E1009 15:53:43.638937 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa29c574-7174-40b2-92d2-fe2ddbc04be0" containerName="extract-content" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.638944 4719 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fa29c574-7174-40b2-92d2-fe2ddbc04be0" containerName="extract-content" Oct 09 15:53:43 crc kubenswrapper[4719]: E1009 15:53:43.638958 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa29c574-7174-40b2-92d2-fe2ddbc04be0" containerName="registry-server" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.638964 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa29c574-7174-40b2-92d2-fe2ddbc04be0" containerName="registry-server" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.639127 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="a768f51e-2990-40f5-84df-13c410d05385" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.639167 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa29c574-7174-40b2-92d2-fe2ddbc04be0" containerName="registry-server" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.639852 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.641808 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.641874 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.641938 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ssvsw" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.643253 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.643920 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.646921 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.653231 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg"] Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.839065 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d36d0870-b55a-4791-9554-11d38e304e92-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg\" (UID: \"d36d0870-b55a-4791-9554-11d38e304e92\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.839165 4719 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkc26\" (UniqueName: \"kubernetes.io/projected/d36d0870-b55a-4791-9554-11d38e304e92-kube-api-access-hkc26\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg\" (UID: \"d36d0870-b55a-4791-9554-11d38e304e92\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.839198 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d36d0870-b55a-4791-9554-11d38e304e92-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg\" (UID: \"d36d0870-b55a-4791-9554-11d38e304e92\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.839258 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d36d0870-b55a-4791-9554-11d38e304e92-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg\" (UID: \"d36d0870-b55a-4791-9554-11d38e304e92\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.839293 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d36d0870-b55a-4791-9554-11d38e304e92-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg\" (UID: \"d36d0870-b55a-4791-9554-11d38e304e92\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.839519 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/d36d0870-b55a-4791-9554-11d38e304e92-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg\" (UID: \"d36d0870-b55a-4791-9554-11d38e304e92\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.941007 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d36d0870-b55a-4791-9554-11d38e304e92-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg\" (UID: \"d36d0870-b55a-4791-9554-11d38e304e92\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.941073 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d36d0870-b55a-4791-9554-11d38e304e92-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg\" (UID: \"d36d0870-b55a-4791-9554-11d38e304e92\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.941119 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36d0870-b55a-4791-9554-11d38e304e92-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg\" (UID: \"d36d0870-b55a-4791-9554-11d38e304e92\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.941183 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/d36d0870-b55a-4791-9554-11d38e304e92-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg\" (UID: \"d36d0870-b55a-4791-9554-11d38e304e92\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.941240 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkc26\" (UniqueName: \"kubernetes.io/projected/d36d0870-b55a-4791-9554-11d38e304e92-kube-api-access-hkc26\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg\" (UID: \"d36d0870-b55a-4791-9554-11d38e304e92\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.941276 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d36d0870-b55a-4791-9554-11d38e304e92-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg\" (UID: \"d36d0870-b55a-4791-9554-11d38e304e92\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.944674 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d36d0870-b55a-4791-9554-11d38e304e92-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg\" (UID: \"d36d0870-b55a-4791-9554-11d38e304e92\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.947563 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d36d0870-b55a-4791-9554-11d38e304e92-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg\" (UID: 
\"d36d0870-b55a-4791-9554-11d38e304e92\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.948084 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d36d0870-b55a-4791-9554-11d38e304e92-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg\" (UID: \"d36d0870-b55a-4791-9554-11d38e304e92\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.949348 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d36d0870-b55a-4791-9554-11d38e304e92-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg\" (UID: \"d36d0870-b55a-4791-9554-11d38e304e92\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.953421 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36d0870-b55a-4791-9554-11d38e304e92-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg\" (UID: \"d36d0870-b55a-4791-9554-11d38e304e92\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg" Oct 09 15:53:43 crc kubenswrapper[4719]: I1009 15:53:43.962295 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkc26\" (UniqueName: \"kubernetes.io/projected/d36d0870-b55a-4791-9554-11d38e304e92-kube-api-access-hkc26\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg\" (UID: \"d36d0870-b55a-4791-9554-11d38e304e92\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg" Oct 09 15:53:44 crc 
kubenswrapper[4719]: I1009 15:53:44.258624 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg" Oct 09 15:53:44 crc kubenswrapper[4719]: I1009 15:53:44.756156 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg"] Oct 09 15:53:45 crc kubenswrapper[4719]: I1009 15:53:45.486024 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg" event={"ID":"d36d0870-b55a-4791-9554-11d38e304e92","Type":"ContainerStarted","Data":"c65c8e067c94d5e67975daa4af84be71ff9824f184baa6062e7e6328dfee23f9"} Oct 09 15:53:46 crc kubenswrapper[4719]: I1009 15:53:46.496913 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg" event={"ID":"d36d0870-b55a-4791-9554-11d38e304e92","Type":"ContainerStarted","Data":"29e6bba3fc2ddbf1ca2c1859707e8f17aff1da03752630198ce9a72d24cf92df"} Oct 09 15:53:46 crc kubenswrapper[4719]: I1009 15:53:46.516080 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg" podStartSLOduration=2.914764341 podStartE2EDuration="3.516063403s" podCreationTimestamp="2025-10-09 15:53:43 +0000 UTC" firstStartedPulling="2025-10-09 15:53:44.756720878 +0000 UTC m=+2130.266432163" lastFinishedPulling="2025-10-09 15:53:45.35801991 +0000 UTC m=+2130.867731225" observedRunningTime="2025-10-09 15:53:46.513652596 +0000 UTC m=+2132.023363911" watchObservedRunningTime="2025-10-09 15:53:46.516063403 +0000 UTC m=+2132.025774688" Oct 09 15:54:06 crc kubenswrapper[4719]: I1009 15:54:06.976984 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 15:54:06 crc kubenswrapper[4719]: I1009 15:54:06.977601 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 15:54:32 crc kubenswrapper[4719]: I1009 15:54:32.926203 4719 generic.go:334] "Generic (PLEG): container finished" podID="d36d0870-b55a-4791-9554-11d38e304e92" containerID="29e6bba3fc2ddbf1ca2c1859707e8f17aff1da03752630198ce9a72d24cf92df" exitCode=0 Oct 09 15:54:32 crc kubenswrapper[4719]: I1009 15:54:32.926910 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg" event={"ID":"d36d0870-b55a-4791-9554-11d38e304e92","Type":"ContainerDied","Data":"29e6bba3fc2ddbf1ca2c1859707e8f17aff1da03752630198ce9a72d24cf92df"} Oct 09 15:54:34 crc kubenswrapper[4719]: I1009 15:54:34.291782 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg" Oct 09 15:54:34 crc kubenswrapper[4719]: I1009 15:54:34.387178 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d36d0870-b55a-4791-9554-11d38e304e92-inventory\") pod \"d36d0870-b55a-4791-9554-11d38e304e92\" (UID: \"d36d0870-b55a-4791-9554-11d38e304e92\") " Oct 09 15:54:34 crc kubenswrapper[4719]: I1009 15:54:34.387243 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkc26\" (UniqueName: \"kubernetes.io/projected/d36d0870-b55a-4791-9554-11d38e304e92-kube-api-access-hkc26\") pod \"d36d0870-b55a-4791-9554-11d38e304e92\" (UID: \"d36d0870-b55a-4791-9554-11d38e304e92\") " Oct 09 15:54:34 crc kubenswrapper[4719]: I1009 15:54:34.387282 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d36d0870-b55a-4791-9554-11d38e304e92-neutron-ovn-metadata-agent-neutron-config-0\") pod \"d36d0870-b55a-4791-9554-11d38e304e92\" (UID: \"d36d0870-b55a-4791-9554-11d38e304e92\") " Oct 09 15:54:34 crc kubenswrapper[4719]: I1009 15:54:34.387331 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d36d0870-b55a-4791-9554-11d38e304e92-ssh-key\") pod \"d36d0870-b55a-4791-9554-11d38e304e92\" (UID: \"d36d0870-b55a-4791-9554-11d38e304e92\") " Oct 09 15:54:34 crc kubenswrapper[4719]: I1009 15:54:34.387502 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36d0870-b55a-4791-9554-11d38e304e92-neutron-metadata-combined-ca-bundle\") pod \"d36d0870-b55a-4791-9554-11d38e304e92\" (UID: \"d36d0870-b55a-4791-9554-11d38e304e92\") " Oct 09 15:54:34 crc kubenswrapper[4719]: I1009 
15:54:34.387684 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d36d0870-b55a-4791-9554-11d38e304e92-nova-metadata-neutron-config-0\") pod \"d36d0870-b55a-4791-9554-11d38e304e92\" (UID: \"d36d0870-b55a-4791-9554-11d38e304e92\") " Oct 09 15:54:34 crc kubenswrapper[4719]: I1009 15:54:34.394455 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d36d0870-b55a-4791-9554-11d38e304e92-kube-api-access-hkc26" (OuterVolumeSpecName: "kube-api-access-hkc26") pod "d36d0870-b55a-4791-9554-11d38e304e92" (UID: "d36d0870-b55a-4791-9554-11d38e304e92"). InnerVolumeSpecName "kube-api-access-hkc26". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:54:34 crc kubenswrapper[4719]: I1009 15:54:34.401480 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d36d0870-b55a-4791-9554-11d38e304e92-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "d36d0870-b55a-4791-9554-11d38e304e92" (UID: "d36d0870-b55a-4791-9554-11d38e304e92"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:54:34 crc kubenswrapper[4719]: I1009 15:54:34.420498 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d36d0870-b55a-4791-9554-11d38e304e92-inventory" (OuterVolumeSpecName: "inventory") pod "d36d0870-b55a-4791-9554-11d38e304e92" (UID: "d36d0870-b55a-4791-9554-11d38e304e92"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:54:34 crc kubenswrapper[4719]: I1009 15:54:34.425361 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d36d0870-b55a-4791-9554-11d38e304e92-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "d36d0870-b55a-4791-9554-11d38e304e92" (UID: "d36d0870-b55a-4791-9554-11d38e304e92"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:54:34 crc kubenswrapper[4719]: I1009 15:54:34.426615 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d36d0870-b55a-4791-9554-11d38e304e92-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "d36d0870-b55a-4791-9554-11d38e304e92" (UID: "d36d0870-b55a-4791-9554-11d38e304e92"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:54:34 crc kubenswrapper[4719]: I1009 15:54:34.428535 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d36d0870-b55a-4791-9554-11d38e304e92-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d36d0870-b55a-4791-9554-11d38e304e92" (UID: "d36d0870-b55a-4791-9554-11d38e304e92"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:54:34 crc kubenswrapper[4719]: I1009 15:54:34.489404 4719 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36d0870-b55a-4791-9554-11d38e304e92-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:54:34 crc kubenswrapper[4719]: I1009 15:54:34.489445 4719 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d36d0870-b55a-4791-9554-11d38e304e92-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 09 15:54:34 crc kubenswrapper[4719]: I1009 15:54:34.489458 4719 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d36d0870-b55a-4791-9554-11d38e304e92-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 15:54:34 crc kubenswrapper[4719]: I1009 15:54:34.489473 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkc26\" (UniqueName: \"kubernetes.io/projected/d36d0870-b55a-4791-9554-11d38e304e92-kube-api-access-hkc26\") on node \"crc\" DevicePath \"\"" Oct 09 15:54:34 crc kubenswrapper[4719]: I1009 15:54:34.489485 4719 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d36d0870-b55a-4791-9554-11d38e304e92-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 09 15:54:34 crc kubenswrapper[4719]: I1009 15:54:34.489496 4719 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d36d0870-b55a-4791-9554-11d38e304e92-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 15:54:34 crc kubenswrapper[4719]: I1009 15:54:34.942554 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg" 
event={"ID":"d36d0870-b55a-4791-9554-11d38e304e92","Type":"ContainerDied","Data":"c65c8e067c94d5e67975daa4af84be71ff9824f184baa6062e7e6328dfee23f9"} Oct 09 15:54:34 crc kubenswrapper[4719]: I1009 15:54:34.942594 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg" Oct 09 15:54:34 crc kubenswrapper[4719]: I1009 15:54:34.942605 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c65c8e067c94d5e67975daa4af84be71ff9824f184baa6062e7e6328dfee23f9" Oct 09 15:54:35 crc kubenswrapper[4719]: I1009 15:54:35.042722 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7mv72"] Oct 09 15:54:35 crc kubenswrapper[4719]: E1009 15:54:35.043148 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d36d0870-b55a-4791-9554-11d38e304e92" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 09 15:54:35 crc kubenswrapper[4719]: I1009 15:54:35.043175 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="d36d0870-b55a-4791-9554-11d38e304e92" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 09 15:54:35 crc kubenswrapper[4719]: I1009 15:54:35.043476 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="d36d0870-b55a-4791-9554-11d38e304e92" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 09 15:54:35 crc kubenswrapper[4719]: I1009 15:54:35.044145 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7mv72" Oct 09 15:54:35 crc kubenswrapper[4719]: I1009 15:54:35.047014 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 15:54:35 crc kubenswrapper[4719]: I1009 15:54:35.047190 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 15:54:35 crc kubenswrapper[4719]: I1009 15:54:35.047364 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 09 15:54:35 crc kubenswrapper[4719]: I1009 15:54:35.047581 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 15:54:35 crc kubenswrapper[4719]: I1009 15:54:35.052730 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ssvsw" Oct 09 15:54:35 crc kubenswrapper[4719]: I1009 15:54:35.053004 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7mv72"] Oct 09 15:54:35 crc kubenswrapper[4719]: I1009 15:54:35.101064 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7mv72\" (UID: \"2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7mv72" Oct 09 15:54:35 crc kubenswrapper[4719]: I1009 15:54:35.101206 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7mv72\" (UID: 
\"2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7mv72" Oct 09 15:54:35 crc kubenswrapper[4719]: I1009 15:54:35.101281 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7mv72\" (UID: \"2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7mv72" Oct 09 15:54:35 crc kubenswrapper[4719]: I1009 15:54:35.101360 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf6dh\" (UniqueName: \"kubernetes.io/projected/2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f-kube-api-access-zf6dh\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7mv72\" (UID: \"2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7mv72" Oct 09 15:54:35 crc kubenswrapper[4719]: I1009 15:54:35.101457 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7mv72\" (UID: \"2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7mv72" Oct 09 15:54:35 crc kubenswrapper[4719]: I1009 15:54:35.203273 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7mv72\" (UID: \"2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7mv72" Oct 09 15:54:35 crc kubenswrapper[4719]: I1009 15:54:35.203332 4719 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7mv72\" (UID: \"2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7mv72" Oct 09 15:54:35 crc kubenswrapper[4719]: I1009 15:54:35.203707 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7mv72\" (UID: \"2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7mv72" Oct 09 15:54:35 crc kubenswrapper[4719]: I1009 15:54:35.203753 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf6dh\" (UniqueName: \"kubernetes.io/projected/2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f-kube-api-access-zf6dh\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7mv72\" (UID: \"2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7mv72" Oct 09 15:54:35 crc kubenswrapper[4719]: I1009 15:54:35.203794 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7mv72\" (UID: \"2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7mv72" Oct 09 15:54:35 crc kubenswrapper[4719]: I1009 15:54:35.210877 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7mv72\" (UID: \"2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7mv72" Oct 09 15:54:35 crc kubenswrapper[4719]: I1009 15:54:35.211642 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7mv72\" (UID: \"2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7mv72" Oct 09 15:54:35 crc kubenswrapper[4719]: I1009 15:54:35.212810 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7mv72\" (UID: \"2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7mv72" Oct 09 15:54:35 crc kubenswrapper[4719]: I1009 15:54:35.214068 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7mv72\" (UID: \"2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7mv72" Oct 09 15:54:35 crc kubenswrapper[4719]: I1009 15:54:35.222452 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf6dh\" (UniqueName: \"kubernetes.io/projected/2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f-kube-api-access-zf6dh\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7mv72\" (UID: \"2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7mv72" Oct 09 15:54:35 crc kubenswrapper[4719]: I1009 15:54:35.361965 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7mv72" Oct 09 15:54:35 crc kubenswrapper[4719]: I1009 15:54:35.906727 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7mv72"] Oct 09 15:54:35 crc kubenswrapper[4719]: I1009 15:54:35.957889 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7mv72" event={"ID":"2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f","Type":"ContainerStarted","Data":"c170c0054ca1a1ef954bc01f2c047ff58f471e522fb7a427046394785b3b123b"} Oct 09 15:54:36 crc kubenswrapper[4719]: I1009 15:54:36.976169 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 15:54:36 crc kubenswrapper[4719]: I1009 15:54:36.976897 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 15:54:36 crc kubenswrapper[4719]: I1009 15:54:36.977028 4719 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" Oct 09 15:54:36 crc kubenswrapper[4719]: I1009 15:54:36.977846 4719 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"abad957eacff8118d320311536ec10e8847725a5f2366aab4422a5954663f1fc"} pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" 
Oct 09 15:54:36 crc kubenswrapper[4719]: I1009 15:54:36.977894 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" containerID="cri-o://abad957eacff8118d320311536ec10e8847725a5f2366aab4422a5954663f1fc" gracePeriod=600 Oct 09 15:54:36 crc kubenswrapper[4719]: I1009 15:54:36.979460 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7mv72" event={"ID":"2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f","Type":"ContainerStarted","Data":"f362aace1401396d1c1abf0241b8f4e85385f57573d419d05eec6918b602401f"} Oct 09 15:54:37 crc kubenswrapper[4719]: I1009 15:54:37.008257 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7mv72" podStartSLOduration=1.483281431 podStartE2EDuration="2.008243829s" podCreationTimestamp="2025-10-09 15:54:35 +0000 UTC" firstStartedPulling="2025-10-09 15:54:35.913222294 +0000 UTC m=+2181.422933579" lastFinishedPulling="2025-10-09 15:54:36.438184682 +0000 UTC m=+2181.947895977" observedRunningTime="2025-10-09 15:54:37.007548356 +0000 UTC m=+2182.517259641" watchObservedRunningTime="2025-10-09 15:54:37.008243829 +0000 UTC m=+2182.517955114" Oct 09 15:54:37 crc kubenswrapper[4719]: I1009 15:54:37.990388 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerDied","Data":"abad957eacff8118d320311536ec10e8847725a5f2366aab4422a5954663f1fc"} Oct 09 15:54:37 crc kubenswrapper[4719]: I1009 15:54:37.991016 4719 scope.go:117] "RemoveContainer" containerID="55848799feb0f83996cad9faea64b8bd81a5055bee1fd116f8ee1236dc974c4b" Oct 09 15:54:37 crc kubenswrapper[4719]: I1009 15:54:37.990453 4719 generic.go:334] "Generic (PLEG): 
container finished" podID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerID="abad957eacff8118d320311536ec10e8847725a5f2366aab4422a5954663f1fc" exitCode=0 Oct 09 15:54:37 crc kubenswrapper[4719]: I1009 15:54:37.991165 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerStarted","Data":"ea4fd9c18f02a0999586973814878f55184063f0958e1fb25fd19302f9bb81f9"} Oct 09 15:57:06 crc kubenswrapper[4719]: I1009 15:57:06.976255 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 15:57:06 crc kubenswrapper[4719]: I1009 15:57:06.976906 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 15:57:36 crc kubenswrapper[4719]: I1009 15:57:36.976848 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 15:57:36 crc kubenswrapper[4719]: I1009 15:57:36.977443 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 
15:58:06 crc kubenswrapper[4719]: I1009 15:58:06.976987 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 15:58:06 crc kubenswrapper[4719]: I1009 15:58:06.977626 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 15:58:06 crc kubenswrapper[4719]: I1009 15:58:06.977683 4719 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" Oct 09 15:58:06 crc kubenswrapper[4719]: I1009 15:58:06.978659 4719 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ea4fd9c18f02a0999586973814878f55184063f0958e1fb25fd19302f9bb81f9"} pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 15:58:06 crc kubenswrapper[4719]: I1009 15:58:06.978766 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" containerID="cri-o://ea4fd9c18f02a0999586973814878f55184063f0958e1fb25fd19302f9bb81f9" gracePeriod=600 Oct 09 15:58:07 crc kubenswrapper[4719]: E1009 15:58:07.123662 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 15:58:07 crc kubenswrapper[4719]: I1009 15:58:07.961736 4719 generic.go:334] "Generic (PLEG): container finished" podID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerID="ea4fd9c18f02a0999586973814878f55184063f0958e1fb25fd19302f9bb81f9" exitCode=0 Oct 09 15:58:07 crc kubenswrapper[4719]: I1009 15:58:07.961773 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerDied","Data":"ea4fd9c18f02a0999586973814878f55184063f0958e1fb25fd19302f9bb81f9"} Oct 09 15:58:07 crc kubenswrapper[4719]: I1009 15:58:07.962035 4719 scope.go:117] "RemoveContainer" containerID="abad957eacff8118d320311536ec10e8847725a5f2366aab4422a5954663f1fc" Oct 09 15:58:07 crc kubenswrapper[4719]: I1009 15:58:07.962754 4719 scope.go:117] "RemoveContainer" containerID="ea4fd9c18f02a0999586973814878f55184063f0958e1fb25fd19302f9bb81f9" Oct 09 15:58:07 crc kubenswrapper[4719]: E1009 15:58:07.962979 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 15:58:22 crc kubenswrapper[4719]: I1009 15:58:22.162037 4719 scope.go:117] "RemoveContainer" containerID="ea4fd9c18f02a0999586973814878f55184063f0958e1fb25fd19302f9bb81f9" Oct 09 15:58:22 crc kubenswrapper[4719]: E1009 15:58:22.163312 4719 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 15:58:36 crc kubenswrapper[4719]: I1009 15:58:36.160917 4719 scope.go:117] "RemoveContainer" containerID="ea4fd9c18f02a0999586973814878f55184063f0958e1fb25fd19302f9bb81f9" Oct 09 15:58:36 crc kubenswrapper[4719]: E1009 15:58:36.161962 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 15:58:43 crc kubenswrapper[4719]: I1009 15:58:43.306712 4719 generic.go:334] "Generic (PLEG): container finished" podID="2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f" containerID="f362aace1401396d1c1abf0241b8f4e85385f57573d419d05eec6918b602401f" exitCode=0 Oct 09 15:58:43 crc kubenswrapper[4719]: I1009 15:58:43.306880 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7mv72" event={"ID":"2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f","Type":"ContainerDied","Data":"f362aace1401396d1c1abf0241b8f4e85385f57573d419d05eec6918b602401f"} Oct 09 15:58:44 crc kubenswrapper[4719]: I1009 15:58:44.764418 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7mv72" Oct 09 15:58:44 crc kubenswrapper[4719]: I1009 15:58:44.955558 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f-libvirt-combined-ca-bundle\") pod \"2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f\" (UID: \"2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f\") " Oct 09 15:58:44 crc kubenswrapper[4719]: I1009 15:58:44.955983 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f-inventory\") pod \"2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f\" (UID: \"2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f\") " Oct 09 15:58:44 crc kubenswrapper[4719]: I1009 15:58:44.956110 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f-ssh-key\") pod \"2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f\" (UID: \"2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f\") " Oct 09 15:58:44 crc kubenswrapper[4719]: I1009 15:58:44.956155 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zf6dh\" (UniqueName: \"kubernetes.io/projected/2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f-kube-api-access-zf6dh\") pod \"2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f\" (UID: \"2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f\") " Oct 09 15:58:44 crc kubenswrapper[4719]: I1009 15:58:44.956273 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f-libvirt-secret-0\") pod \"2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f\" (UID: \"2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f\") " Oct 09 15:58:44 crc kubenswrapper[4719]: I1009 15:58:44.962327 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f" (UID: "2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:58:44 crc kubenswrapper[4719]: I1009 15:58:44.962451 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f-kube-api-access-zf6dh" (OuterVolumeSpecName: "kube-api-access-zf6dh") pod "2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f" (UID: "2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f"). InnerVolumeSpecName "kube-api-access-zf6dh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:58:44 crc kubenswrapper[4719]: I1009 15:58:44.985671 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f" (UID: "2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:58:44 crc kubenswrapper[4719]: I1009 15:58:44.995380 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f" (UID: "2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:58:44 crc kubenswrapper[4719]: I1009 15:58:44.996291 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f-inventory" (OuterVolumeSpecName: "inventory") pod "2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f" (UID: "2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.060113 4719 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.060147 4719 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.060157 4719 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.060166 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zf6dh\" (UniqueName: \"kubernetes.io/projected/2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f-kube-api-access-zf6dh\") on node \"crc\" DevicePath \"\"" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.060175 4719 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.355177 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7mv72" event={"ID":"2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f","Type":"ContainerDied","Data":"c170c0054ca1a1ef954bc01f2c047ff58f471e522fb7a427046394785b3b123b"} Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.355223 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c170c0054ca1a1ef954bc01f2c047ff58f471e522fb7a427046394785b3b123b" Oct 09 15:58:45 
crc kubenswrapper[4719]: I1009 15:58:45.355295 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7mv72" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.436094 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-vc5ss"] Oct 09 15:58:45 crc kubenswrapper[4719]: E1009 15:58:45.436665 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.436685 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.436964 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.437792 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vc5ss" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.440655 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.440822 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.441102 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.441385 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.441530 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.441693 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.442049 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ssvsw" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.457704 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-vc5ss"] Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.574410 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vc5ss\" (UID: \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vc5ss" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.574466 4719 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vc5ss\" (UID: \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vc5ss" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.574513 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vc5ss\" (UID: \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vc5ss" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.574808 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfx8s\" (UniqueName: \"kubernetes.io/projected/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-kube-api-access-wfx8s\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vc5ss\" (UID: \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vc5ss" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.574910 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vc5ss\" (UID: \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vc5ss" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.575048 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vc5ss\" (UID: \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vc5ss" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.575181 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vc5ss\" (UID: \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vc5ss" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.575238 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vc5ss\" (UID: \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vc5ss" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.575736 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vc5ss\" (UID: \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vc5ss" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.677523 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfx8s\" (UniqueName: \"kubernetes.io/projected/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-kube-api-access-wfx8s\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vc5ss\" (UID: 
\"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vc5ss" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.677623 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vc5ss\" (UID: \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vc5ss" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.677686 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vc5ss\" (UID: \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vc5ss" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.677745 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vc5ss\" (UID: \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vc5ss" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.677787 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vc5ss\" (UID: \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vc5ss" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.677864 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vc5ss\" (UID: \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vc5ss" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.677910 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vc5ss\" (UID: \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vc5ss" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.677936 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vc5ss\" (UID: \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vc5ss" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.677990 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vc5ss\" (UID: \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vc5ss" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.679132 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vc5ss\" (UID: \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vc5ss" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.683875 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vc5ss\" (UID: \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vc5ss" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.684280 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vc5ss\" (UID: \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vc5ss" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.684506 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vc5ss\" (UID: \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vc5ss" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.684574 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vc5ss\" (UID: \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vc5ss" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.684750 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vc5ss\" (UID: \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vc5ss" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.684948 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vc5ss\" (UID: \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vc5ss" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.685294 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vc5ss\" (UID: \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vc5ss" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.695411 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfx8s\" (UniqueName: \"kubernetes.io/projected/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-kube-api-access-wfx8s\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vc5ss\" (UID: \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vc5ss" Oct 09 15:58:45 crc kubenswrapper[4719]: I1009 15:58:45.757364 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vc5ss" Oct 09 15:58:46 crc kubenswrapper[4719]: I1009 15:58:46.270173 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-vc5ss"] Oct 09 15:58:46 crc kubenswrapper[4719]: W1009 15:58:46.274833 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23a47423_b3ad_4ba3_b0ab_9a452d485f2b.slice/crio-7bf75817b4e267461a55b75d6345f00b740722f02f60c65bcab65ec4e237f1a8 WatchSource:0}: Error finding container 7bf75817b4e267461a55b75d6345f00b740722f02f60c65bcab65ec4e237f1a8: Status 404 returned error can't find the container with id 7bf75817b4e267461a55b75d6345f00b740722f02f60c65bcab65ec4e237f1a8 Oct 09 15:58:46 crc kubenswrapper[4719]: I1009 15:58:46.277169 4719 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 15:58:46 crc kubenswrapper[4719]: I1009 15:58:46.364126 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vc5ss" event={"ID":"23a47423-b3ad-4ba3-b0ab-9a452d485f2b","Type":"ContainerStarted","Data":"7bf75817b4e267461a55b75d6345f00b740722f02f60c65bcab65ec4e237f1a8"} Oct 09 15:58:47 crc kubenswrapper[4719]: I1009 15:58:47.375637 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vc5ss" event={"ID":"23a47423-b3ad-4ba3-b0ab-9a452d485f2b","Type":"ContainerStarted","Data":"a4d6eb6aca40f500d37ae5d8e1dd8d97ba874076b31be8db97db74610deecdca"} Oct 09 15:58:47 crc kubenswrapper[4719]: I1009 15:58:47.395546 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vc5ss" podStartSLOduration=1.938498506 podStartE2EDuration="2.39552877s" podCreationTimestamp="2025-10-09 15:58:45 +0000 UTC" firstStartedPulling="2025-10-09 
15:58:46.276958305 +0000 UTC m=+2431.786669590" lastFinishedPulling="2025-10-09 15:58:46.733988579 +0000 UTC m=+2432.243699854" observedRunningTime="2025-10-09 15:58:47.39490271 +0000 UTC m=+2432.904613995" watchObservedRunningTime="2025-10-09 15:58:47.39552877 +0000 UTC m=+2432.905240055" Oct 09 15:58:49 crc kubenswrapper[4719]: I1009 15:58:49.162175 4719 scope.go:117] "RemoveContainer" containerID="ea4fd9c18f02a0999586973814878f55184063f0958e1fb25fd19302f9bb81f9" Oct 09 15:58:49 crc kubenswrapper[4719]: E1009 15:58:49.164163 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 15:59:00 crc kubenswrapper[4719]: I1009 15:59:00.162231 4719 scope.go:117] "RemoveContainer" containerID="ea4fd9c18f02a0999586973814878f55184063f0958e1fb25fd19302f9bb81f9" Oct 09 15:59:00 crc kubenswrapper[4719]: E1009 15:59:00.162958 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 15:59:12 crc kubenswrapper[4719]: I1009 15:59:12.161505 4719 scope.go:117] "RemoveContainer" containerID="ea4fd9c18f02a0999586973814878f55184063f0958e1fb25fd19302f9bb81f9" Oct 09 15:59:12 crc kubenswrapper[4719]: E1009 15:59:12.164687 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 15:59:24 crc kubenswrapper[4719]: I1009 15:59:24.161629 4719 scope.go:117] "RemoveContainer" containerID="ea4fd9c18f02a0999586973814878f55184063f0958e1fb25fd19302f9bb81f9" Oct 09 15:59:24 crc kubenswrapper[4719]: E1009 15:59:24.162588 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 15:59:38 crc kubenswrapper[4719]: I1009 15:59:38.162317 4719 scope.go:117] "RemoveContainer" containerID="ea4fd9c18f02a0999586973814878f55184063f0958e1fb25fd19302f9bb81f9" Oct 09 15:59:38 crc kubenswrapper[4719]: E1009 15:59:38.163196 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 15:59:51 crc kubenswrapper[4719]: I1009 15:59:51.162110 4719 scope.go:117] "RemoveContainer" containerID="ea4fd9c18f02a0999586973814878f55184063f0958e1fb25fd19302f9bb81f9" Oct 09 15:59:51 crc kubenswrapper[4719]: E1009 15:59:51.163045 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:00:00 crc kubenswrapper[4719]: I1009 16:00:00.173130 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333760-zgd5q"] Oct 09 16:00:00 crc kubenswrapper[4719]: I1009 16:00:00.175158 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333760-zgd5q" Oct 09 16:00:00 crc kubenswrapper[4719]: I1009 16:00:00.179279 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 09 16:00:00 crc kubenswrapper[4719]: I1009 16:00:00.179615 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 09 16:00:00 crc kubenswrapper[4719]: I1009 16:00:00.202500 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333760-zgd5q"] Oct 09 16:00:00 crc kubenswrapper[4719]: I1009 16:00:00.323011 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdtw4\" (UniqueName: \"kubernetes.io/projected/923d9a5f-f038-4d7f-877c-cf0f4c970a59-kube-api-access-kdtw4\") pod \"collect-profiles-29333760-zgd5q\" (UID: \"923d9a5f-f038-4d7f-877c-cf0f4c970a59\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333760-zgd5q" Oct 09 16:00:00 crc kubenswrapper[4719]: I1009 16:00:00.323123 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/923d9a5f-f038-4d7f-877c-cf0f4c970a59-config-volume\") pod \"collect-profiles-29333760-zgd5q\" (UID: \"923d9a5f-f038-4d7f-877c-cf0f4c970a59\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333760-zgd5q" Oct 09 16:00:00 crc kubenswrapper[4719]: I1009 16:00:00.323237 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/923d9a5f-f038-4d7f-877c-cf0f4c970a59-secret-volume\") pod \"collect-profiles-29333760-zgd5q\" (UID: \"923d9a5f-f038-4d7f-877c-cf0f4c970a59\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333760-zgd5q" Oct 09 16:00:00 crc kubenswrapper[4719]: I1009 16:00:00.426043 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/923d9a5f-f038-4d7f-877c-cf0f4c970a59-secret-volume\") pod \"collect-profiles-29333760-zgd5q\" (UID: \"923d9a5f-f038-4d7f-877c-cf0f4c970a59\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333760-zgd5q" Oct 09 16:00:00 crc kubenswrapper[4719]: I1009 16:00:00.426267 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdtw4\" (UniqueName: \"kubernetes.io/projected/923d9a5f-f038-4d7f-877c-cf0f4c970a59-kube-api-access-kdtw4\") pod \"collect-profiles-29333760-zgd5q\" (UID: \"923d9a5f-f038-4d7f-877c-cf0f4c970a59\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333760-zgd5q" Oct 09 16:00:00 crc kubenswrapper[4719]: I1009 16:00:00.426396 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/923d9a5f-f038-4d7f-877c-cf0f4c970a59-config-volume\") pod \"collect-profiles-29333760-zgd5q\" (UID: \"923d9a5f-f038-4d7f-877c-cf0f4c970a59\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333760-zgd5q" Oct 09 16:00:00 crc kubenswrapper[4719]: 
I1009 16:00:00.427753 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/923d9a5f-f038-4d7f-877c-cf0f4c970a59-config-volume\") pod \"collect-profiles-29333760-zgd5q\" (UID: \"923d9a5f-f038-4d7f-877c-cf0f4c970a59\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333760-zgd5q" Oct 09 16:00:00 crc kubenswrapper[4719]: I1009 16:00:00.437329 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/923d9a5f-f038-4d7f-877c-cf0f4c970a59-secret-volume\") pod \"collect-profiles-29333760-zgd5q\" (UID: \"923d9a5f-f038-4d7f-877c-cf0f4c970a59\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333760-zgd5q" Oct 09 16:00:00 crc kubenswrapper[4719]: I1009 16:00:00.447651 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdtw4\" (UniqueName: \"kubernetes.io/projected/923d9a5f-f038-4d7f-877c-cf0f4c970a59-kube-api-access-kdtw4\") pod \"collect-profiles-29333760-zgd5q\" (UID: \"923d9a5f-f038-4d7f-877c-cf0f4c970a59\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333760-zgd5q" Oct 09 16:00:00 crc kubenswrapper[4719]: I1009 16:00:00.524600 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333760-zgd5q" Oct 09 16:00:01 crc kubenswrapper[4719]: I1009 16:00:01.105329 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333760-zgd5q"] Oct 09 16:00:02 crc kubenswrapper[4719]: I1009 16:00:02.063721 4719 generic.go:334] "Generic (PLEG): container finished" podID="923d9a5f-f038-4d7f-877c-cf0f4c970a59" containerID="6b80fab1e5a80d5afb83105a8c55745a7d0fdd547f3964c823b163620b3c5c18" exitCode=0 Oct 09 16:00:02 crc kubenswrapper[4719]: I1009 16:00:02.064034 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333760-zgd5q" event={"ID":"923d9a5f-f038-4d7f-877c-cf0f4c970a59","Type":"ContainerDied","Data":"6b80fab1e5a80d5afb83105a8c55745a7d0fdd547f3964c823b163620b3c5c18"} Oct 09 16:00:02 crc kubenswrapper[4719]: I1009 16:00:02.064062 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333760-zgd5q" event={"ID":"923d9a5f-f038-4d7f-877c-cf0f4c970a59","Type":"ContainerStarted","Data":"992c58ea5dbae5369ec869431d616e2b5a4b372286978e83a08997eafaa3450e"} Oct 09 16:00:03 crc kubenswrapper[4719]: I1009 16:00:03.448835 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333760-zgd5q" Oct 09 16:00:03 crc kubenswrapper[4719]: I1009 16:00:03.608509 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdtw4\" (UniqueName: \"kubernetes.io/projected/923d9a5f-f038-4d7f-877c-cf0f4c970a59-kube-api-access-kdtw4\") pod \"923d9a5f-f038-4d7f-877c-cf0f4c970a59\" (UID: \"923d9a5f-f038-4d7f-877c-cf0f4c970a59\") " Oct 09 16:00:03 crc kubenswrapper[4719]: I1009 16:00:03.608615 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/923d9a5f-f038-4d7f-877c-cf0f4c970a59-secret-volume\") pod \"923d9a5f-f038-4d7f-877c-cf0f4c970a59\" (UID: \"923d9a5f-f038-4d7f-877c-cf0f4c970a59\") " Oct 09 16:00:03 crc kubenswrapper[4719]: I1009 16:00:03.608651 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/923d9a5f-f038-4d7f-877c-cf0f4c970a59-config-volume\") pod \"923d9a5f-f038-4d7f-877c-cf0f4c970a59\" (UID: \"923d9a5f-f038-4d7f-877c-cf0f4c970a59\") " Oct 09 16:00:03 crc kubenswrapper[4719]: I1009 16:00:03.609331 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/923d9a5f-f038-4d7f-877c-cf0f4c970a59-config-volume" (OuterVolumeSpecName: "config-volume") pod "923d9a5f-f038-4d7f-877c-cf0f4c970a59" (UID: "923d9a5f-f038-4d7f-877c-cf0f4c970a59"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 16:00:03 crc kubenswrapper[4719]: I1009 16:00:03.613556 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/923d9a5f-f038-4d7f-877c-cf0f4c970a59-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "923d9a5f-f038-4d7f-877c-cf0f4c970a59" (UID: "923d9a5f-f038-4d7f-877c-cf0f4c970a59"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 16:00:03 crc kubenswrapper[4719]: I1009 16:00:03.614210 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/923d9a5f-f038-4d7f-877c-cf0f4c970a59-kube-api-access-kdtw4" (OuterVolumeSpecName: "kube-api-access-kdtw4") pod "923d9a5f-f038-4d7f-877c-cf0f4c970a59" (UID: "923d9a5f-f038-4d7f-877c-cf0f4c970a59"). InnerVolumeSpecName "kube-api-access-kdtw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 16:00:03 crc kubenswrapper[4719]: I1009 16:00:03.712190 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdtw4\" (UniqueName: \"kubernetes.io/projected/923d9a5f-f038-4d7f-877c-cf0f4c970a59-kube-api-access-kdtw4\") on node \"crc\" DevicePath \"\"" Oct 09 16:00:03 crc kubenswrapper[4719]: I1009 16:00:03.712264 4719 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/923d9a5f-f038-4d7f-877c-cf0f4c970a59-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 09 16:00:03 crc kubenswrapper[4719]: I1009 16:00:03.712279 4719 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/923d9a5f-f038-4d7f-877c-cf0f4c970a59-config-volume\") on node \"crc\" DevicePath \"\"" Oct 09 16:00:04 crc kubenswrapper[4719]: I1009 16:00:04.084380 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333760-zgd5q" event={"ID":"923d9a5f-f038-4d7f-877c-cf0f4c970a59","Type":"ContainerDied","Data":"992c58ea5dbae5369ec869431d616e2b5a4b372286978e83a08997eafaa3450e"} Oct 09 16:00:04 crc kubenswrapper[4719]: I1009 16:00:04.084451 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="992c58ea5dbae5369ec869431d616e2b5a4b372286978e83a08997eafaa3450e" Oct 09 16:00:04 crc kubenswrapper[4719]: I1009 16:00:04.084510 4719 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333760-zgd5q" Oct 09 16:00:04 crc kubenswrapper[4719]: I1009 16:00:04.524360 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333715-pgfd2"] Oct 09 16:00:04 crc kubenswrapper[4719]: I1009 16:00:04.533818 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333715-pgfd2"] Oct 09 16:00:05 crc kubenswrapper[4719]: I1009 16:00:05.261919 4719 scope.go:117] "RemoveContainer" containerID="ea4fd9c18f02a0999586973814878f55184063f0958e1fb25fd19302f9bb81f9" Oct 09 16:00:05 crc kubenswrapper[4719]: E1009 16:00:05.262638 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:00:05 crc kubenswrapper[4719]: I1009 16:00:05.306531 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24507f61-1a02-438c-b1ca-82515867e605" path="/var/lib/kubelet/pods/24507f61-1a02-438c-b1ca-82515867e605/volumes" Oct 09 16:00:19 crc kubenswrapper[4719]: I1009 16:00:19.161935 4719 scope.go:117] "RemoveContainer" containerID="ea4fd9c18f02a0999586973814878f55184063f0958e1fb25fd19302f9bb81f9" Oct 09 16:00:19 crc kubenswrapper[4719]: E1009 16:00:19.162789 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:00:26 crc kubenswrapper[4719]: I1009 16:00:26.866331 4719 scope.go:117] "RemoveContainer" containerID="8de3d9cf280680a9dc7deeace9f3fd771589dbc230018023ab1c30b8f7b9f35f" Oct 09 16:00:33 crc kubenswrapper[4719]: I1009 16:00:33.161752 4719 scope.go:117] "RemoveContainer" containerID="ea4fd9c18f02a0999586973814878f55184063f0958e1fb25fd19302f9bb81f9" Oct 09 16:00:33 crc kubenswrapper[4719]: E1009 16:00:33.162226 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:00:47 crc kubenswrapper[4719]: I1009 16:00:47.161320 4719 scope.go:117] "RemoveContainer" containerID="ea4fd9c18f02a0999586973814878f55184063f0958e1fb25fd19302f9bb81f9" Oct 09 16:00:47 crc kubenswrapper[4719]: E1009 16:00:47.162108 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:01:00 crc kubenswrapper[4719]: I1009 16:01:00.164713 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29333761-x56tx"] Oct 09 16:01:00 crc kubenswrapper[4719]: E1009 16:01:00.166046 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="923d9a5f-f038-4d7f-877c-cf0f4c970a59" 
containerName="collect-profiles" Oct 09 16:01:00 crc kubenswrapper[4719]: I1009 16:01:00.166067 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="923d9a5f-f038-4d7f-877c-cf0f4c970a59" containerName="collect-profiles" Oct 09 16:01:00 crc kubenswrapper[4719]: I1009 16:01:00.166534 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="923d9a5f-f038-4d7f-877c-cf0f4c970a59" containerName="collect-profiles" Oct 09 16:01:00 crc kubenswrapper[4719]: I1009 16:01:00.168994 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29333761-x56tx" Oct 09 16:01:00 crc kubenswrapper[4719]: I1009 16:01:00.178505 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29333761-x56tx"] Oct 09 16:01:00 crc kubenswrapper[4719]: I1009 16:01:00.255827 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhwrj\" (UniqueName: \"kubernetes.io/projected/88187911-7d06-4147-97ad-9279f3e101e0-kube-api-access-hhwrj\") pod \"keystone-cron-29333761-x56tx\" (UID: \"88187911-7d06-4147-97ad-9279f3e101e0\") " pod="openstack/keystone-cron-29333761-x56tx" Oct 09 16:01:00 crc kubenswrapper[4719]: I1009 16:01:00.255958 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/88187911-7d06-4147-97ad-9279f3e101e0-fernet-keys\") pod \"keystone-cron-29333761-x56tx\" (UID: \"88187911-7d06-4147-97ad-9279f3e101e0\") " pod="openstack/keystone-cron-29333761-x56tx" Oct 09 16:01:00 crc kubenswrapper[4719]: I1009 16:01:00.256505 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88187911-7d06-4147-97ad-9279f3e101e0-combined-ca-bundle\") pod \"keystone-cron-29333761-x56tx\" (UID: \"88187911-7d06-4147-97ad-9279f3e101e0\") " 
pod="openstack/keystone-cron-29333761-x56tx" Oct 09 16:01:00 crc kubenswrapper[4719]: I1009 16:01:00.256901 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88187911-7d06-4147-97ad-9279f3e101e0-config-data\") pod \"keystone-cron-29333761-x56tx\" (UID: \"88187911-7d06-4147-97ad-9279f3e101e0\") " pod="openstack/keystone-cron-29333761-x56tx" Oct 09 16:01:00 crc kubenswrapper[4719]: I1009 16:01:00.358645 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhwrj\" (UniqueName: \"kubernetes.io/projected/88187911-7d06-4147-97ad-9279f3e101e0-kube-api-access-hhwrj\") pod \"keystone-cron-29333761-x56tx\" (UID: \"88187911-7d06-4147-97ad-9279f3e101e0\") " pod="openstack/keystone-cron-29333761-x56tx" Oct 09 16:01:00 crc kubenswrapper[4719]: I1009 16:01:00.358740 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/88187911-7d06-4147-97ad-9279f3e101e0-fernet-keys\") pod \"keystone-cron-29333761-x56tx\" (UID: \"88187911-7d06-4147-97ad-9279f3e101e0\") " pod="openstack/keystone-cron-29333761-x56tx" Oct 09 16:01:00 crc kubenswrapper[4719]: I1009 16:01:00.358860 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88187911-7d06-4147-97ad-9279f3e101e0-combined-ca-bundle\") pod \"keystone-cron-29333761-x56tx\" (UID: \"88187911-7d06-4147-97ad-9279f3e101e0\") " pod="openstack/keystone-cron-29333761-x56tx" Oct 09 16:01:00 crc kubenswrapper[4719]: I1009 16:01:00.358905 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88187911-7d06-4147-97ad-9279f3e101e0-config-data\") pod \"keystone-cron-29333761-x56tx\" (UID: \"88187911-7d06-4147-97ad-9279f3e101e0\") " 
pod="openstack/keystone-cron-29333761-x56tx" Oct 09 16:01:00 crc kubenswrapper[4719]: I1009 16:01:00.365622 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88187911-7d06-4147-97ad-9279f3e101e0-combined-ca-bundle\") pod \"keystone-cron-29333761-x56tx\" (UID: \"88187911-7d06-4147-97ad-9279f3e101e0\") " pod="openstack/keystone-cron-29333761-x56tx" Oct 09 16:01:00 crc kubenswrapper[4719]: I1009 16:01:00.366035 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88187911-7d06-4147-97ad-9279f3e101e0-config-data\") pod \"keystone-cron-29333761-x56tx\" (UID: \"88187911-7d06-4147-97ad-9279f3e101e0\") " pod="openstack/keystone-cron-29333761-x56tx" Oct 09 16:01:00 crc kubenswrapper[4719]: I1009 16:01:00.366892 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/88187911-7d06-4147-97ad-9279f3e101e0-fernet-keys\") pod \"keystone-cron-29333761-x56tx\" (UID: \"88187911-7d06-4147-97ad-9279f3e101e0\") " pod="openstack/keystone-cron-29333761-x56tx" Oct 09 16:01:00 crc kubenswrapper[4719]: I1009 16:01:00.377996 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhwrj\" (UniqueName: \"kubernetes.io/projected/88187911-7d06-4147-97ad-9279f3e101e0-kube-api-access-hhwrj\") pod \"keystone-cron-29333761-x56tx\" (UID: \"88187911-7d06-4147-97ad-9279f3e101e0\") " pod="openstack/keystone-cron-29333761-x56tx" Oct 09 16:01:00 crc kubenswrapper[4719]: I1009 16:01:00.506189 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29333761-x56tx" Oct 09 16:01:00 crc kubenswrapper[4719]: I1009 16:01:00.926267 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29333761-x56tx"] Oct 09 16:01:01 crc kubenswrapper[4719]: I1009 16:01:01.161686 4719 scope.go:117] "RemoveContainer" containerID="ea4fd9c18f02a0999586973814878f55184063f0958e1fb25fd19302f9bb81f9" Oct 09 16:01:01 crc kubenswrapper[4719]: E1009 16:01:01.162015 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:01:01 crc kubenswrapper[4719]: I1009 16:01:01.607970 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29333761-x56tx" event={"ID":"88187911-7d06-4147-97ad-9279f3e101e0","Type":"ContainerStarted","Data":"a62bfa5b751672ff447aeae1d7b17c5f7b1cd3b78a6659296d0e68cbf494251d"} Oct 09 16:01:01 crc kubenswrapper[4719]: I1009 16:01:01.608373 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29333761-x56tx" event={"ID":"88187911-7d06-4147-97ad-9279f3e101e0","Type":"ContainerStarted","Data":"e56f6bf75da60b2eccaaaaea31d83c36fe5097e841733902f7616af51e4c6c42"} Oct 09 16:01:01 crc kubenswrapper[4719]: I1009 16:01:01.630404 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29333761-x56tx" podStartSLOduration=1.630386567 podStartE2EDuration="1.630386567s" podCreationTimestamp="2025-10-09 16:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 16:01:01.620996597 +0000 
UTC m=+2567.130707882" watchObservedRunningTime="2025-10-09 16:01:01.630386567 +0000 UTC m=+2567.140097852" Oct 09 16:01:04 crc kubenswrapper[4719]: I1009 16:01:04.634642 4719 generic.go:334] "Generic (PLEG): container finished" podID="88187911-7d06-4147-97ad-9279f3e101e0" containerID="a62bfa5b751672ff447aeae1d7b17c5f7b1cd3b78a6659296d0e68cbf494251d" exitCode=0 Oct 09 16:01:04 crc kubenswrapper[4719]: I1009 16:01:04.634795 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29333761-x56tx" event={"ID":"88187911-7d06-4147-97ad-9279f3e101e0","Type":"ContainerDied","Data":"a62bfa5b751672ff447aeae1d7b17c5f7b1cd3b78a6659296d0e68cbf494251d"} Oct 09 16:01:05 crc kubenswrapper[4719]: I1009 16:01:05.955085 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29333761-x56tx" Oct 09 16:01:06 crc kubenswrapper[4719]: I1009 16:01:06.068200 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhwrj\" (UniqueName: \"kubernetes.io/projected/88187911-7d06-4147-97ad-9279f3e101e0-kube-api-access-hhwrj\") pod \"88187911-7d06-4147-97ad-9279f3e101e0\" (UID: \"88187911-7d06-4147-97ad-9279f3e101e0\") " Oct 09 16:01:06 crc kubenswrapper[4719]: I1009 16:01:06.068269 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88187911-7d06-4147-97ad-9279f3e101e0-combined-ca-bundle\") pod \"88187911-7d06-4147-97ad-9279f3e101e0\" (UID: \"88187911-7d06-4147-97ad-9279f3e101e0\") " Oct 09 16:01:06 crc kubenswrapper[4719]: I1009 16:01:06.068331 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/88187911-7d06-4147-97ad-9279f3e101e0-fernet-keys\") pod \"88187911-7d06-4147-97ad-9279f3e101e0\" (UID: \"88187911-7d06-4147-97ad-9279f3e101e0\") " Oct 09 16:01:06 crc kubenswrapper[4719]: I1009 
16:01:06.068422 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88187911-7d06-4147-97ad-9279f3e101e0-config-data\") pod \"88187911-7d06-4147-97ad-9279f3e101e0\" (UID: \"88187911-7d06-4147-97ad-9279f3e101e0\") " Oct 09 16:01:06 crc kubenswrapper[4719]: I1009 16:01:06.074432 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88187911-7d06-4147-97ad-9279f3e101e0-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "88187911-7d06-4147-97ad-9279f3e101e0" (UID: "88187911-7d06-4147-97ad-9279f3e101e0"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 16:01:06 crc kubenswrapper[4719]: I1009 16:01:06.074607 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88187911-7d06-4147-97ad-9279f3e101e0-kube-api-access-hhwrj" (OuterVolumeSpecName: "kube-api-access-hhwrj") pod "88187911-7d06-4147-97ad-9279f3e101e0" (UID: "88187911-7d06-4147-97ad-9279f3e101e0"). InnerVolumeSpecName "kube-api-access-hhwrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 16:01:06 crc kubenswrapper[4719]: I1009 16:01:06.119872 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88187911-7d06-4147-97ad-9279f3e101e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88187911-7d06-4147-97ad-9279f3e101e0" (UID: "88187911-7d06-4147-97ad-9279f3e101e0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 16:01:06 crc kubenswrapper[4719]: I1009 16:01:06.130178 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88187911-7d06-4147-97ad-9279f3e101e0-config-data" (OuterVolumeSpecName: "config-data") pod "88187911-7d06-4147-97ad-9279f3e101e0" (UID: "88187911-7d06-4147-97ad-9279f3e101e0"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 16:01:06 crc kubenswrapper[4719]: I1009 16:01:06.170611 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhwrj\" (UniqueName: \"kubernetes.io/projected/88187911-7d06-4147-97ad-9279f3e101e0-kube-api-access-hhwrj\") on node \"crc\" DevicePath \"\"" Oct 09 16:01:06 crc kubenswrapper[4719]: I1009 16:01:06.170651 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88187911-7d06-4147-97ad-9279f3e101e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 16:01:06 crc kubenswrapper[4719]: I1009 16:01:06.170664 4719 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/88187911-7d06-4147-97ad-9279f3e101e0-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 09 16:01:06 crc kubenswrapper[4719]: I1009 16:01:06.170675 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88187911-7d06-4147-97ad-9279f3e101e0-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 16:01:06 crc kubenswrapper[4719]: I1009 16:01:06.659657 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29333761-x56tx" event={"ID":"88187911-7d06-4147-97ad-9279f3e101e0","Type":"ContainerDied","Data":"e56f6bf75da60b2eccaaaaea31d83c36fe5097e841733902f7616af51e4c6c42"} Oct 09 16:01:06 crc kubenswrapper[4719]: I1009 16:01:06.659709 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e56f6bf75da60b2eccaaaaea31d83c36fe5097e841733902f7616af51e4c6c42" Oct 09 16:01:06 crc kubenswrapper[4719]: I1009 16:01:06.659707 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29333761-x56tx" Oct 09 16:01:14 crc kubenswrapper[4719]: I1009 16:01:14.160956 4719 scope.go:117] "RemoveContainer" containerID="ea4fd9c18f02a0999586973814878f55184063f0958e1fb25fd19302f9bb81f9" Oct 09 16:01:14 crc kubenswrapper[4719]: E1009 16:01:14.161769 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:01:25 crc kubenswrapper[4719]: I1009 16:01:25.168097 4719 scope.go:117] "RemoveContainer" containerID="ea4fd9c18f02a0999586973814878f55184063f0958e1fb25fd19302f9bb81f9" Oct 09 16:01:25 crc kubenswrapper[4719]: E1009 16:01:25.168948 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:01:37 crc kubenswrapper[4719]: I1009 16:01:37.161822 4719 scope.go:117] "RemoveContainer" containerID="ea4fd9c18f02a0999586973814878f55184063f0958e1fb25fd19302f9bb81f9" Oct 09 16:01:37 crc kubenswrapper[4719]: E1009 16:01:37.162626 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:01:52 crc kubenswrapper[4719]: I1009 16:01:52.162935 4719 scope.go:117] "RemoveContainer" containerID="ea4fd9c18f02a0999586973814878f55184063f0958e1fb25fd19302f9bb81f9" Oct 09 16:01:52 crc kubenswrapper[4719]: E1009 16:01:52.163663 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:02:00 crc kubenswrapper[4719]: I1009 16:02:00.144812 4719 generic.go:334] "Generic (PLEG): container finished" podID="23a47423-b3ad-4ba3-b0ab-9a452d485f2b" containerID="a4d6eb6aca40f500d37ae5d8e1dd8d97ba874076b31be8db97db74610deecdca" exitCode=0 Oct 09 16:02:00 crc kubenswrapper[4719]: I1009 16:02:00.144923 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vc5ss" event={"ID":"23a47423-b3ad-4ba3-b0ab-9a452d485f2b","Type":"ContainerDied","Data":"a4d6eb6aca40f500d37ae5d8e1dd8d97ba874076b31be8db97db74610deecdca"} Oct 09 16:02:01 crc kubenswrapper[4719]: I1009 16:02:01.548656 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vc5ss" Oct 09 16:02:01 crc kubenswrapper[4719]: I1009 16:02:01.669703 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-nova-cell1-compute-config-1\") pod \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\" (UID: \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\") " Oct 09 16:02:01 crc kubenswrapper[4719]: I1009 16:02:01.669757 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-nova-extra-config-0\") pod \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\" (UID: \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\") " Oct 09 16:02:01 crc kubenswrapper[4719]: I1009 16:02:01.669803 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-nova-combined-ca-bundle\") pod \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\" (UID: \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\") " Oct 09 16:02:01 crc kubenswrapper[4719]: I1009 16:02:01.669846 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-nova-cell1-compute-config-0\") pod \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\" (UID: \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\") " Oct 09 16:02:01 crc kubenswrapper[4719]: I1009 16:02:01.669903 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-nova-migration-ssh-key-1\") pod \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\" (UID: \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\") " Oct 09 16:02:01 crc kubenswrapper[4719]: I1009 
16:02:01.669954 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-nova-migration-ssh-key-0\") pod \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\" (UID: \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\") " Oct 09 16:02:01 crc kubenswrapper[4719]: I1009 16:02:01.669988 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfx8s\" (UniqueName: \"kubernetes.io/projected/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-kube-api-access-wfx8s\") pod \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\" (UID: \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\") " Oct 09 16:02:01 crc kubenswrapper[4719]: I1009 16:02:01.670028 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-inventory\") pod \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\" (UID: \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\") " Oct 09 16:02:01 crc kubenswrapper[4719]: I1009 16:02:01.670071 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-ssh-key\") pod \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\" (UID: \"23a47423-b3ad-4ba3-b0ab-9a452d485f2b\") " Oct 09 16:02:01 crc kubenswrapper[4719]: I1009 16:02:01.675513 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "23a47423-b3ad-4ba3-b0ab-9a452d485f2b" (UID: "23a47423-b3ad-4ba3-b0ab-9a452d485f2b"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 16:02:01 crc kubenswrapper[4719]: I1009 16:02:01.690992 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-kube-api-access-wfx8s" (OuterVolumeSpecName: "kube-api-access-wfx8s") pod "23a47423-b3ad-4ba3-b0ab-9a452d485f2b" (UID: "23a47423-b3ad-4ba3-b0ab-9a452d485f2b"). InnerVolumeSpecName "kube-api-access-wfx8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 16:02:01 crc kubenswrapper[4719]: I1009 16:02:01.697061 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "23a47423-b3ad-4ba3-b0ab-9a452d485f2b" (UID: "23a47423-b3ad-4ba3-b0ab-9a452d485f2b"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 16:02:01 crc kubenswrapper[4719]: I1009 16:02:01.697776 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "23a47423-b3ad-4ba3-b0ab-9a452d485f2b" (UID: "23a47423-b3ad-4ba3-b0ab-9a452d485f2b"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 16:02:01 crc kubenswrapper[4719]: I1009 16:02:01.700232 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "23a47423-b3ad-4ba3-b0ab-9a452d485f2b" (UID: "23a47423-b3ad-4ba3-b0ab-9a452d485f2b"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 16:02:01 crc kubenswrapper[4719]: I1009 16:02:01.701988 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "23a47423-b3ad-4ba3-b0ab-9a452d485f2b" (UID: "23a47423-b3ad-4ba3-b0ab-9a452d485f2b"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 16:02:01 crc kubenswrapper[4719]: I1009 16:02:01.704156 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "23a47423-b3ad-4ba3-b0ab-9a452d485f2b" (UID: "23a47423-b3ad-4ba3-b0ab-9a452d485f2b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 16:02:01 crc kubenswrapper[4719]: I1009 16:02:01.704509 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "23a47423-b3ad-4ba3-b0ab-9a452d485f2b" (UID: "23a47423-b3ad-4ba3-b0ab-9a452d485f2b"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 16:02:01 crc kubenswrapper[4719]: I1009 16:02:01.706869 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-inventory" (OuterVolumeSpecName: "inventory") pod "23a47423-b3ad-4ba3-b0ab-9a452d485f2b" (UID: "23a47423-b3ad-4ba3-b0ab-9a452d485f2b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 16:02:01 crc kubenswrapper[4719]: I1009 16:02:01.772579 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfx8s\" (UniqueName: \"kubernetes.io/projected/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-kube-api-access-wfx8s\") on node \"crc\" DevicePath \"\"" Oct 09 16:02:01 crc kubenswrapper[4719]: I1009 16:02:01.772615 4719 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 16:02:01 crc kubenswrapper[4719]: I1009 16:02:01.772625 4719 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 16:02:01 crc kubenswrapper[4719]: I1009 16:02:01.772635 4719 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 09 16:02:01 crc kubenswrapper[4719]: I1009 16:02:01.772644 4719 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 09 16:02:01 crc kubenswrapper[4719]: I1009 16:02:01.772652 4719 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 16:02:01 crc kubenswrapper[4719]: I1009 16:02:01.772659 4719 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 09 16:02:01 
crc kubenswrapper[4719]: I1009 16:02:01.772667 4719 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 09 16:02:01 crc kubenswrapper[4719]: I1009 16:02:01.772677 4719 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/23a47423-b3ad-4ba3-b0ab-9a452d485f2b-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 09 16:02:02 crc kubenswrapper[4719]: I1009 16:02:02.169516 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vc5ss" event={"ID":"23a47423-b3ad-4ba3-b0ab-9a452d485f2b","Type":"ContainerDied","Data":"7bf75817b4e267461a55b75d6345f00b740722f02f60c65bcab65ec4e237f1a8"} Oct 09 16:02:02 crc kubenswrapper[4719]: I1009 16:02:02.169555 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bf75817b4e267461a55b75d6345f00b740722f02f60c65bcab65ec4e237f1a8" Oct 09 16:02:02 crc kubenswrapper[4719]: I1009 16:02:02.169579 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vc5ss" Oct 09 16:02:02 crc kubenswrapper[4719]: I1009 16:02:02.257111 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt"] Oct 09 16:02:02 crc kubenswrapper[4719]: E1009 16:02:02.257521 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23a47423-b3ad-4ba3-b0ab-9a452d485f2b" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 09 16:02:02 crc kubenswrapper[4719]: I1009 16:02:02.257538 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="23a47423-b3ad-4ba3-b0ab-9a452d485f2b" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 09 16:02:02 crc kubenswrapper[4719]: E1009 16:02:02.257574 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88187911-7d06-4147-97ad-9279f3e101e0" containerName="keystone-cron" Oct 09 16:02:02 crc kubenswrapper[4719]: I1009 16:02:02.257581 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="88187911-7d06-4147-97ad-9279f3e101e0" containerName="keystone-cron" Oct 09 16:02:02 crc kubenswrapper[4719]: I1009 16:02:02.257761 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="23a47423-b3ad-4ba3-b0ab-9a452d485f2b" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 09 16:02:02 crc kubenswrapper[4719]: I1009 16:02:02.257786 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="88187911-7d06-4147-97ad-9279f3e101e0" containerName="keystone-cron" Oct 09 16:02:02 crc kubenswrapper[4719]: I1009 16:02:02.258468 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt" Oct 09 16:02:02 crc kubenswrapper[4719]: I1009 16:02:02.260010 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Oct 09 16:02:02 crc kubenswrapper[4719]: I1009 16:02:02.262190 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ssvsw" Oct 09 16:02:02 crc kubenswrapper[4719]: I1009 16:02:02.262254 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 16:02:02 crc kubenswrapper[4719]: I1009 16:02:02.262604 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 16:02:02 crc kubenswrapper[4719]: I1009 16:02:02.267047 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 16:02:02 crc kubenswrapper[4719]: I1009 16:02:02.276171 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt"] Oct 09 16:02:02 crc kubenswrapper[4719]: I1009 16:02:02.387286 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt\" (UID: \"0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt" Oct 09 16:02:02 crc kubenswrapper[4719]: I1009 16:02:02.387551 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-ceilometer-compute-config-data-0\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt\" (UID: \"0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt" Oct 09 16:02:02 crc kubenswrapper[4719]: I1009 16:02:02.387667 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt\" (UID: \"0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt" Oct 09 16:02:02 crc kubenswrapper[4719]: I1009 16:02:02.387776 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt\" (UID: \"0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt" Oct 09 16:02:02 crc kubenswrapper[4719]: I1009 16:02:02.387851 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt\" (UID: \"0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt" Oct 09 16:02:02 crc kubenswrapper[4719]: I1009 16:02:02.387910 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5hm8\" (UniqueName: \"kubernetes.io/projected/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-kube-api-access-z5hm8\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt\" (UID: \"0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt" Oct 09 16:02:02 crc kubenswrapper[4719]: I1009 16:02:02.387974 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt\" (UID: \"0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt" Oct 09 16:02:02 crc kubenswrapper[4719]: I1009 16:02:02.490039 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt\" (UID: \"0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt" Oct 09 16:02:02 crc kubenswrapper[4719]: I1009 16:02:02.490840 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt\" (UID: \"0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt" Oct 09 16:02:02 crc kubenswrapper[4719]: I1009 16:02:02.490887 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt\" (UID: \"0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt" Oct 09 16:02:02 crc kubenswrapper[4719]: I1009 16:02:02.490916 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt\" (UID: \"0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt" Oct 09 16:02:02 crc kubenswrapper[4719]: I1009 16:02:02.490944 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5hm8\" (UniqueName: \"kubernetes.io/projected/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-kube-api-access-z5hm8\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt\" (UID: \"0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt" Oct 09 16:02:02 crc kubenswrapper[4719]: I1009 16:02:02.490971 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt\" (UID: \"0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt" Oct 09 16:02:02 crc kubenswrapper[4719]: I1009 16:02:02.491020 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt\" (UID: \"0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt" Oct 09 16:02:02 crc kubenswrapper[4719]: I1009 16:02:02.494236 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-ceilometer-compute-config-data-2\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt\" (UID: \"0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt" Oct 09 16:02:02 crc kubenswrapper[4719]: I1009 16:02:02.494476 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt\" (UID: \"0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt" Oct 09 16:02:02 crc kubenswrapper[4719]: I1009 16:02:02.495280 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt\" (UID: \"0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt" Oct 09 16:02:02 crc kubenswrapper[4719]: I1009 16:02:02.496967 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt\" (UID: \"0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt" Oct 09 16:02:02 crc kubenswrapper[4719]: I1009 16:02:02.501497 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt\" (UID: \"0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt" Oct 09 16:02:02 crc kubenswrapper[4719]: 
I1009 16:02:02.505976 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt\" (UID: \"0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt" Oct 09 16:02:02 crc kubenswrapper[4719]: I1009 16:02:02.509395 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5hm8\" (UniqueName: \"kubernetes.io/projected/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-kube-api-access-z5hm8\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt\" (UID: \"0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt" Oct 09 16:02:02 crc kubenswrapper[4719]: I1009 16:02:02.582420 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt" Oct 09 16:02:03 crc kubenswrapper[4719]: I1009 16:02:03.105016 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt"] Oct 09 16:02:03 crc kubenswrapper[4719]: I1009 16:02:03.180475 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt" event={"ID":"0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a","Type":"ContainerStarted","Data":"ba351ed4ff6b20f3ba67d0629c7cc61def443221402303b15ccda9dbfe71a67c"} Oct 09 16:02:04 crc kubenswrapper[4719]: I1009 16:02:04.191944 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt" event={"ID":"0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a","Type":"ContainerStarted","Data":"96579fad0d128493a63b17acfd367bcee0245bedae3079ee812a635e41dcd6c9"} Oct 09 16:02:04 crc kubenswrapper[4719]: I1009 16:02:04.215867 4719 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt" podStartSLOduration=1.6358975569999998 podStartE2EDuration="2.215849882s" podCreationTimestamp="2025-10-09 16:02:02 +0000 UTC" firstStartedPulling="2025-10-09 16:02:03.101069581 +0000 UTC m=+2628.610780866" lastFinishedPulling="2025-10-09 16:02:03.681021906 +0000 UTC m=+2629.190733191" observedRunningTime="2025-10-09 16:02:04.211810694 +0000 UTC m=+2629.721521989" watchObservedRunningTime="2025-10-09 16:02:04.215849882 +0000 UTC m=+2629.725561167" Oct 09 16:02:07 crc kubenswrapper[4719]: I1009 16:02:07.162277 4719 scope.go:117] "RemoveContainer" containerID="ea4fd9c18f02a0999586973814878f55184063f0958e1fb25fd19302f9bb81f9" Oct 09 16:02:07 crc kubenswrapper[4719]: E1009 16:02:07.163164 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:02:18 crc kubenswrapper[4719]: I1009 16:02:18.161442 4719 scope.go:117] "RemoveContainer" containerID="ea4fd9c18f02a0999586973814878f55184063f0958e1fb25fd19302f9bb81f9" Oct 09 16:02:18 crc kubenswrapper[4719]: E1009 16:02:18.162270 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:02:30 crc kubenswrapper[4719]: I1009 16:02:30.160724 4719 scope.go:117] "RemoveContainer" 
containerID="ea4fd9c18f02a0999586973814878f55184063f0958e1fb25fd19302f9bb81f9" Oct 09 16:02:30 crc kubenswrapper[4719]: E1009 16:02:30.162628 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:02:42 crc kubenswrapper[4719]: I1009 16:02:42.161311 4719 scope.go:117] "RemoveContainer" containerID="ea4fd9c18f02a0999586973814878f55184063f0958e1fb25fd19302f9bb81f9" Oct 09 16:02:42 crc kubenswrapper[4719]: E1009 16:02:42.162783 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:02:57 crc kubenswrapper[4719]: I1009 16:02:57.162415 4719 scope.go:117] "RemoveContainer" containerID="ea4fd9c18f02a0999586973814878f55184063f0958e1fb25fd19302f9bb81f9" Oct 09 16:02:57 crc kubenswrapper[4719]: E1009 16:02:57.163445 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:03:09 crc kubenswrapper[4719]: I1009 16:03:09.162798 4719 scope.go:117] 
"RemoveContainer" containerID="ea4fd9c18f02a0999586973814878f55184063f0958e1fb25fd19302f9bb81f9" Oct 09 16:03:09 crc kubenswrapper[4719]: I1009 16:03:09.764095 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerStarted","Data":"920ea73af6d2fdd926cecc482b1dc2188636a4f6e8a6fe0fff4b95bcf354a955"} Oct 09 16:03:19 crc kubenswrapper[4719]: I1009 16:03:19.269806 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pvqjp"] Oct 09 16:03:19 crc kubenswrapper[4719]: I1009 16:03:19.274891 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pvqjp" Oct 09 16:03:19 crc kubenswrapper[4719]: I1009 16:03:19.298475 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pvqjp"] Oct 09 16:03:19 crc kubenswrapper[4719]: I1009 16:03:19.390656 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7tz9\" (UniqueName: \"kubernetes.io/projected/05769cbb-e225-49c8-96ed-9e2da9764e33-kube-api-access-v7tz9\") pod \"redhat-operators-pvqjp\" (UID: \"05769cbb-e225-49c8-96ed-9e2da9764e33\") " pod="openshift-marketplace/redhat-operators-pvqjp" Oct 09 16:03:19 crc kubenswrapper[4719]: I1009 16:03:19.390803 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05769cbb-e225-49c8-96ed-9e2da9764e33-catalog-content\") pod \"redhat-operators-pvqjp\" (UID: \"05769cbb-e225-49c8-96ed-9e2da9764e33\") " pod="openshift-marketplace/redhat-operators-pvqjp" Oct 09 16:03:19 crc kubenswrapper[4719]: I1009 16:03:19.390915 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/05769cbb-e225-49c8-96ed-9e2da9764e33-utilities\") pod \"redhat-operators-pvqjp\" (UID: \"05769cbb-e225-49c8-96ed-9e2da9764e33\") " pod="openshift-marketplace/redhat-operators-pvqjp" Oct 09 16:03:19 crc kubenswrapper[4719]: I1009 16:03:19.493492 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7tz9\" (UniqueName: \"kubernetes.io/projected/05769cbb-e225-49c8-96ed-9e2da9764e33-kube-api-access-v7tz9\") pod \"redhat-operators-pvqjp\" (UID: \"05769cbb-e225-49c8-96ed-9e2da9764e33\") " pod="openshift-marketplace/redhat-operators-pvqjp" Oct 09 16:03:19 crc kubenswrapper[4719]: I1009 16:03:19.493541 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05769cbb-e225-49c8-96ed-9e2da9764e33-catalog-content\") pod \"redhat-operators-pvqjp\" (UID: \"05769cbb-e225-49c8-96ed-9e2da9764e33\") " pod="openshift-marketplace/redhat-operators-pvqjp" Oct 09 16:03:19 crc kubenswrapper[4719]: I1009 16:03:19.493568 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05769cbb-e225-49c8-96ed-9e2da9764e33-utilities\") pod \"redhat-operators-pvqjp\" (UID: \"05769cbb-e225-49c8-96ed-9e2da9764e33\") " pod="openshift-marketplace/redhat-operators-pvqjp" Oct 09 16:03:19 crc kubenswrapper[4719]: I1009 16:03:19.494173 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05769cbb-e225-49c8-96ed-9e2da9764e33-utilities\") pod \"redhat-operators-pvqjp\" (UID: \"05769cbb-e225-49c8-96ed-9e2da9764e33\") " pod="openshift-marketplace/redhat-operators-pvqjp" Oct 09 16:03:19 crc kubenswrapper[4719]: I1009 16:03:19.494698 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/05769cbb-e225-49c8-96ed-9e2da9764e33-catalog-content\") pod \"redhat-operators-pvqjp\" (UID: \"05769cbb-e225-49c8-96ed-9e2da9764e33\") " pod="openshift-marketplace/redhat-operators-pvqjp" Oct 09 16:03:19 crc kubenswrapper[4719]: I1009 16:03:19.524175 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7tz9\" (UniqueName: \"kubernetes.io/projected/05769cbb-e225-49c8-96ed-9e2da9764e33-kube-api-access-v7tz9\") pod \"redhat-operators-pvqjp\" (UID: \"05769cbb-e225-49c8-96ed-9e2da9764e33\") " pod="openshift-marketplace/redhat-operators-pvqjp" Oct 09 16:03:19 crc kubenswrapper[4719]: I1009 16:03:19.607189 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pvqjp" Oct 09 16:03:20 crc kubenswrapper[4719]: I1009 16:03:20.077753 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pvqjp"] Oct 09 16:03:20 crc kubenswrapper[4719]: I1009 16:03:20.914892 4719 generic.go:334] "Generic (PLEG): container finished" podID="05769cbb-e225-49c8-96ed-9e2da9764e33" containerID="fb1f349768785efd68aaa5d34d6292621f23ff4a3b6ea47056dd06f991165924" exitCode=0 Oct 09 16:03:20 crc kubenswrapper[4719]: I1009 16:03:20.914955 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pvqjp" event={"ID":"05769cbb-e225-49c8-96ed-9e2da9764e33","Type":"ContainerDied","Data":"fb1f349768785efd68aaa5d34d6292621f23ff4a3b6ea47056dd06f991165924"} Oct 09 16:03:20 crc kubenswrapper[4719]: I1009 16:03:20.915281 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pvqjp" event={"ID":"05769cbb-e225-49c8-96ed-9e2da9764e33","Type":"ContainerStarted","Data":"e3d4df4048337c274613ced2595f87968f1c38f1c5454c8d6db2b4e46e0eeec1"} Oct 09 16:03:22 crc kubenswrapper[4719]: I1009 16:03:22.940128 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-pvqjp" event={"ID":"05769cbb-e225-49c8-96ed-9e2da9764e33","Type":"ContainerStarted","Data":"a1c37e93dad50f2c5c8b34dc6a2cf3be1131c1d2126c732113438ad67d7a11b2"} Oct 09 16:03:23 crc kubenswrapper[4719]: I1009 16:03:23.950946 4719 generic.go:334] "Generic (PLEG): container finished" podID="05769cbb-e225-49c8-96ed-9e2da9764e33" containerID="a1c37e93dad50f2c5c8b34dc6a2cf3be1131c1d2126c732113438ad67d7a11b2" exitCode=0 Oct 09 16:03:23 crc kubenswrapper[4719]: I1009 16:03:23.951034 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pvqjp" event={"ID":"05769cbb-e225-49c8-96ed-9e2da9764e33","Type":"ContainerDied","Data":"a1c37e93dad50f2c5c8b34dc6a2cf3be1131c1d2126c732113438ad67d7a11b2"} Oct 09 16:03:24 crc kubenswrapper[4719]: I1009 16:03:24.964384 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pvqjp" event={"ID":"05769cbb-e225-49c8-96ed-9e2da9764e33","Type":"ContainerStarted","Data":"5fcef27ff4ce1f0dc97d9539d53a83dd9d31831b0479f6093b45afce12b3b87f"} Oct 09 16:03:24 crc kubenswrapper[4719]: I1009 16:03:24.987109 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pvqjp" podStartSLOduration=2.151276577 podStartE2EDuration="5.987090023s" podCreationTimestamp="2025-10-09 16:03:19 +0000 UTC" firstStartedPulling="2025-10-09 16:03:20.916736632 +0000 UTC m=+2706.426447907" lastFinishedPulling="2025-10-09 16:03:24.752550048 +0000 UTC m=+2710.262261353" observedRunningTime="2025-10-09 16:03:24.981225846 +0000 UTC m=+2710.490937131" watchObservedRunningTime="2025-10-09 16:03:24.987090023 +0000 UTC m=+2710.496801308" Oct 09 16:03:29 crc kubenswrapper[4719]: I1009 16:03:29.607504 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pvqjp" Oct 09 16:03:29 crc kubenswrapper[4719]: I1009 16:03:29.608451 4719 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pvqjp" Oct 09 16:03:29 crc kubenswrapper[4719]: I1009 16:03:29.676081 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pvqjp" Oct 09 16:03:30 crc kubenswrapper[4719]: I1009 16:03:30.053261 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pvqjp" Oct 09 16:03:30 crc kubenswrapper[4719]: I1009 16:03:30.109707 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pvqjp"] Oct 09 16:03:32 crc kubenswrapper[4719]: I1009 16:03:32.025890 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pvqjp" podUID="05769cbb-e225-49c8-96ed-9e2da9764e33" containerName="registry-server" containerID="cri-o://5fcef27ff4ce1f0dc97d9539d53a83dd9d31831b0479f6093b45afce12b3b87f" gracePeriod=2 Oct 09 16:03:32 crc kubenswrapper[4719]: I1009 16:03:32.457339 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pvqjp" Oct 09 16:03:32 crc kubenswrapper[4719]: I1009 16:03:32.567419 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05769cbb-e225-49c8-96ed-9e2da9764e33-utilities\") pod \"05769cbb-e225-49c8-96ed-9e2da9764e33\" (UID: \"05769cbb-e225-49c8-96ed-9e2da9764e33\") " Oct 09 16:03:32 crc kubenswrapper[4719]: I1009 16:03:32.567790 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7tz9\" (UniqueName: \"kubernetes.io/projected/05769cbb-e225-49c8-96ed-9e2da9764e33-kube-api-access-v7tz9\") pod \"05769cbb-e225-49c8-96ed-9e2da9764e33\" (UID: \"05769cbb-e225-49c8-96ed-9e2da9764e33\") " Oct 09 16:03:32 crc kubenswrapper[4719]: I1009 16:03:32.567920 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05769cbb-e225-49c8-96ed-9e2da9764e33-catalog-content\") pod \"05769cbb-e225-49c8-96ed-9e2da9764e33\" (UID: \"05769cbb-e225-49c8-96ed-9e2da9764e33\") " Oct 09 16:03:32 crc kubenswrapper[4719]: I1009 16:03:32.568235 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05769cbb-e225-49c8-96ed-9e2da9764e33-utilities" (OuterVolumeSpecName: "utilities") pod "05769cbb-e225-49c8-96ed-9e2da9764e33" (UID: "05769cbb-e225-49c8-96ed-9e2da9764e33"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:03:32 crc kubenswrapper[4719]: I1009 16:03:32.568597 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05769cbb-e225-49c8-96ed-9e2da9764e33-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 16:03:32 crc kubenswrapper[4719]: I1009 16:03:32.573895 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05769cbb-e225-49c8-96ed-9e2da9764e33-kube-api-access-v7tz9" (OuterVolumeSpecName: "kube-api-access-v7tz9") pod "05769cbb-e225-49c8-96ed-9e2da9764e33" (UID: "05769cbb-e225-49c8-96ed-9e2da9764e33"). InnerVolumeSpecName "kube-api-access-v7tz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 16:03:32 crc kubenswrapper[4719]: I1009 16:03:32.653100 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05769cbb-e225-49c8-96ed-9e2da9764e33-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "05769cbb-e225-49c8-96ed-9e2da9764e33" (UID: "05769cbb-e225-49c8-96ed-9e2da9764e33"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:03:32 crc kubenswrapper[4719]: I1009 16:03:32.670085 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7tz9\" (UniqueName: \"kubernetes.io/projected/05769cbb-e225-49c8-96ed-9e2da9764e33-kube-api-access-v7tz9\") on node \"crc\" DevicePath \"\"" Oct 09 16:03:32 crc kubenswrapper[4719]: I1009 16:03:32.670125 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05769cbb-e225-49c8-96ed-9e2da9764e33-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 16:03:33 crc kubenswrapper[4719]: I1009 16:03:33.036399 4719 generic.go:334] "Generic (PLEG): container finished" podID="05769cbb-e225-49c8-96ed-9e2da9764e33" containerID="5fcef27ff4ce1f0dc97d9539d53a83dd9d31831b0479f6093b45afce12b3b87f" exitCode=0 Oct 09 16:03:33 crc kubenswrapper[4719]: I1009 16:03:33.036451 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pvqjp" Oct 09 16:03:33 crc kubenswrapper[4719]: I1009 16:03:33.036453 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pvqjp" event={"ID":"05769cbb-e225-49c8-96ed-9e2da9764e33","Type":"ContainerDied","Data":"5fcef27ff4ce1f0dc97d9539d53a83dd9d31831b0479f6093b45afce12b3b87f"} Oct 09 16:03:33 crc kubenswrapper[4719]: I1009 16:03:33.036589 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pvqjp" event={"ID":"05769cbb-e225-49c8-96ed-9e2da9764e33","Type":"ContainerDied","Data":"e3d4df4048337c274613ced2595f87968f1c38f1c5454c8d6db2b4e46e0eeec1"} Oct 09 16:03:33 crc kubenswrapper[4719]: I1009 16:03:33.036615 4719 scope.go:117] "RemoveContainer" containerID="5fcef27ff4ce1f0dc97d9539d53a83dd9d31831b0479f6093b45afce12b3b87f" Oct 09 16:03:33 crc kubenswrapper[4719]: I1009 16:03:33.081595 4719 scope.go:117] "RemoveContainer" 
containerID="a1c37e93dad50f2c5c8b34dc6a2cf3be1131c1d2126c732113438ad67d7a11b2" Oct 09 16:03:33 crc kubenswrapper[4719]: I1009 16:03:33.087429 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pvqjp"] Oct 09 16:03:33 crc kubenswrapper[4719]: I1009 16:03:33.098418 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pvqjp"] Oct 09 16:03:33 crc kubenswrapper[4719]: I1009 16:03:33.103280 4719 scope.go:117] "RemoveContainer" containerID="fb1f349768785efd68aaa5d34d6292621f23ff4a3b6ea47056dd06f991165924" Oct 09 16:03:33 crc kubenswrapper[4719]: I1009 16:03:33.153720 4719 scope.go:117] "RemoveContainer" containerID="5fcef27ff4ce1f0dc97d9539d53a83dd9d31831b0479f6093b45afce12b3b87f" Oct 09 16:03:33 crc kubenswrapper[4719]: E1009 16:03:33.154915 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fcef27ff4ce1f0dc97d9539d53a83dd9d31831b0479f6093b45afce12b3b87f\": container with ID starting with 5fcef27ff4ce1f0dc97d9539d53a83dd9d31831b0479f6093b45afce12b3b87f not found: ID does not exist" containerID="5fcef27ff4ce1f0dc97d9539d53a83dd9d31831b0479f6093b45afce12b3b87f" Oct 09 16:03:33 crc kubenswrapper[4719]: I1009 16:03:33.155001 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fcef27ff4ce1f0dc97d9539d53a83dd9d31831b0479f6093b45afce12b3b87f"} err="failed to get container status \"5fcef27ff4ce1f0dc97d9539d53a83dd9d31831b0479f6093b45afce12b3b87f\": rpc error: code = NotFound desc = could not find container \"5fcef27ff4ce1f0dc97d9539d53a83dd9d31831b0479f6093b45afce12b3b87f\": container with ID starting with 5fcef27ff4ce1f0dc97d9539d53a83dd9d31831b0479f6093b45afce12b3b87f not found: ID does not exist" Oct 09 16:03:33 crc kubenswrapper[4719]: I1009 16:03:33.155035 4719 scope.go:117] "RemoveContainer" 
containerID="a1c37e93dad50f2c5c8b34dc6a2cf3be1131c1d2126c732113438ad67d7a11b2" Oct 09 16:03:33 crc kubenswrapper[4719]: E1009 16:03:33.155482 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1c37e93dad50f2c5c8b34dc6a2cf3be1131c1d2126c732113438ad67d7a11b2\": container with ID starting with a1c37e93dad50f2c5c8b34dc6a2cf3be1131c1d2126c732113438ad67d7a11b2 not found: ID does not exist" containerID="a1c37e93dad50f2c5c8b34dc6a2cf3be1131c1d2126c732113438ad67d7a11b2" Oct 09 16:03:33 crc kubenswrapper[4719]: I1009 16:03:33.155510 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1c37e93dad50f2c5c8b34dc6a2cf3be1131c1d2126c732113438ad67d7a11b2"} err="failed to get container status \"a1c37e93dad50f2c5c8b34dc6a2cf3be1131c1d2126c732113438ad67d7a11b2\": rpc error: code = NotFound desc = could not find container \"a1c37e93dad50f2c5c8b34dc6a2cf3be1131c1d2126c732113438ad67d7a11b2\": container with ID starting with a1c37e93dad50f2c5c8b34dc6a2cf3be1131c1d2126c732113438ad67d7a11b2 not found: ID does not exist" Oct 09 16:03:33 crc kubenswrapper[4719]: I1009 16:03:33.155524 4719 scope.go:117] "RemoveContainer" containerID="fb1f349768785efd68aaa5d34d6292621f23ff4a3b6ea47056dd06f991165924" Oct 09 16:03:33 crc kubenswrapper[4719]: E1009 16:03:33.155828 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb1f349768785efd68aaa5d34d6292621f23ff4a3b6ea47056dd06f991165924\": container with ID starting with fb1f349768785efd68aaa5d34d6292621f23ff4a3b6ea47056dd06f991165924 not found: ID does not exist" containerID="fb1f349768785efd68aaa5d34d6292621f23ff4a3b6ea47056dd06f991165924" Oct 09 16:03:33 crc kubenswrapper[4719]: I1009 16:03:33.155858 4719 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fb1f349768785efd68aaa5d34d6292621f23ff4a3b6ea47056dd06f991165924"} err="failed to get container status \"fb1f349768785efd68aaa5d34d6292621f23ff4a3b6ea47056dd06f991165924\": rpc error: code = NotFound desc = could not find container \"fb1f349768785efd68aaa5d34d6292621f23ff4a3b6ea47056dd06f991165924\": container with ID starting with fb1f349768785efd68aaa5d34d6292621f23ff4a3b6ea47056dd06f991165924 not found: ID does not exist" Oct 09 16:03:33 crc kubenswrapper[4719]: I1009 16:03:33.174632 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05769cbb-e225-49c8-96ed-9e2da9764e33" path="/var/lib/kubelet/pods/05769cbb-e225-49c8-96ed-9e2da9764e33/volumes" Oct 09 16:03:35 crc kubenswrapper[4719]: I1009 16:03:35.386056 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kg57n"] Oct 09 16:03:35 crc kubenswrapper[4719]: E1009 16:03:35.386919 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05769cbb-e225-49c8-96ed-9e2da9764e33" containerName="registry-server" Oct 09 16:03:35 crc kubenswrapper[4719]: I1009 16:03:35.386935 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="05769cbb-e225-49c8-96ed-9e2da9764e33" containerName="registry-server" Oct 09 16:03:35 crc kubenswrapper[4719]: E1009 16:03:35.386950 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05769cbb-e225-49c8-96ed-9e2da9764e33" containerName="extract-content" Oct 09 16:03:35 crc kubenswrapper[4719]: I1009 16:03:35.386957 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="05769cbb-e225-49c8-96ed-9e2da9764e33" containerName="extract-content" Oct 09 16:03:35 crc kubenswrapper[4719]: E1009 16:03:35.386998 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05769cbb-e225-49c8-96ed-9e2da9764e33" containerName="extract-utilities" Oct 09 16:03:35 crc kubenswrapper[4719]: I1009 16:03:35.387007 4719 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="05769cbb-e225-49c8-96ed-9e2da9764e33" containerName="extract-utilities" Oct 09 16:03:35 crc kubenswrapper[4719]: I1009 16:03:35.387244 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="05769cbb-e225-49c8-96ed-9e2da9764e33" containerName="registry-server" Oct 09 16:03:35 crc kubenswrapper[4719]: I1009 16:03:35.392406 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kg57n" Oct 09 16:03:35 crc kubenswrapper[4719]: I1009 16:03:35.398435 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kg57n"] Oct 09 16:03:35 crc kubenswrapper[4719]: I1009 16:03:35.449597 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flq4b\" (UniqueName: \"kubernetes.io/projected/6e069a05-1883-4a86-aba9-87d7ac06e66b-kube-api-access-flq4b\") pod \"community-operators-kg57n\" (UID: \"6e069a05-1883-4a86-aba9-87d7ac06e66b\") " pod="openshift-marketplace/community-operators-kg57n" Oct 09 16:03:35 crc kubenswrapper[4719]: I1009 16:03:35.450010 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e069a05-1883-4a86-aba9-87d7ac06e66b-catalog-content\") pod \"community-operators-kg57n\" (UID: \"6e069a05-1883-4a86-aba9-87d7ac06e66b\") " pod="openshift-marketplace/community-operators-kg57n" Oct 09 16:03:35 crc kubenswrapper[4719]: I1009 16:03:35.450199 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e069a05-1883-4a86-aba9-87d7ac06e66b-utilities\") pod \"community-operators-kg57n\" (UID: \"6e069a05-1883-4a86-aba9-87d7ac06e66b\") " pod="openshift-marketplace/community-operators-kg57n" Oct 09 16:03:35 crc kubenswrapper[4719]: I1009 16:03:35.553507 4719 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e069a05-1883-4a86-aba9-87d7ac06e66b-utilities\") pod \"community-operators-kg57n\" (UID: \"6e069a05-1883-4a86-aba9-87d7ac06e66b\") " pod="openshift-marketplace/community-operators-kg57n" Oct 09 16:03:35 crc kubenswrapper[4719]: I1009 16:03:35.553638 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flq4b\" (UniqueName: \"kubernetes.io/projected/6e069a05-1883-4a86-aba9-87d7ac06e66b-kube-api-access-flq4b\") pod \"community-operators-kg57n\" (UID: \"6e069a05-1883-4a86-aba9-87d7ac06e66b\") " pod="openshift-marketplace/community-operators-kg57n" Oct 09 16:03:35 crc kubenswrapper[4719]: I1009 16:03:35.553926 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e069a05-1883-4a86-aba9-87d7ac06e66b-catalog-content\") pod \"community-operators-kg57n\" (UID: \"6e069a05-1883-4a86-aba9-87d7ac06e66b\") " pod="openshift-marketplace/community-operators-kg57n" Oct 09 16:03:35 crc kubenswrapper[4719]: I1009 16:03:35.554103 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e069a05-1883-4a86-aba9-87d7ac06e66b-utilities\") pod \"community-operators-kg57n\" (UID: \"6e069a05-1883-4a86-aba9-87d7ac06e66b\") " pod="openshift-marketplace/community-operators-kg57n" Oct 09 16:03:35 crc kubenswrapper[4719]: I1009 16:03:35.554382 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e069a05-1883-4a86-aba9-87d7ac06e66b-catalog-content\") pod \"community-operators-kg57n\" (UID: \"6e069a05-1883-4a86-aba9-87d7ac06e66b\") " pod="openshift-marketplace/community-operators-kg57n" Oct 09 16:03:35 crc kubenswrapper[4719]: I1009 16:03:35.578012 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-flq4b\" (UniqueName: \"kubernetes.io/projected/6e069a05-1883-4a86-aba9-87d7ac06e66b-kube-api-access-flq4b\") pod \"community-operators-kg57n\" (UID: \"6e069a05-1883-4a86-aba9-87d7ac06e66b\") " pod="openshift-marketplace/community-operators-kg57n" Oct 09 16:03:35 crc kubenswrapper[4719]: I1009 16:03:35.722691 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kg57n" Oct 09 16:03:36 crc kubenswrapper[4719]: I1009 16:03:36.175062 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kg57n"] Oct 09 16:03:37 crc kubenswrapper[4719]: I1009 16:03:37.075171 4719 generic.go:334] "Generic (PLEG): container finished" podID="6e069a05-1883-4a86-aba9-87d7ac06e66b" containerID="4be6b9932d86c33cd621cbe803f59fdd0402f1d3396cffc2a3e46383c2705cb9" exitCode=0 Oct 09 16:03:37 crc kubenswrapper[4719]: I1009 16:03:37.075282 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kg57n" event={"ID":"6e069a05-1883-4a86-aba9-87d7ac06e66b","Type":"ContainerDied","Data":"4be6b9932d86c33cd621cbe803f59fdd0402f1d3396cffc2a3e46383c2705cb9"} Oct 09 16:03:37 crc kubenswrapper[4719]: I1009 16:03:37.075927 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kg57n" event={"ID":"6e069a05-1883-4a86-aba9-87d7ac06e66b","Type":"ContainerStarted","Data":"4141a03347c33b3ce929c55270a615c587d1e37a93a4e60599ea06c408b39bb3"} Oct 09 16:03:39 crc kubenswrapper[4719]: I1009 16:03:39.102921 4719 generic.go:334] "Generic (PLEG): container finished" podID="6e069a05-1883-4a86-aba9-87d7ac06e66b" containerID="2cb4658c577acfc526f53402d690f3fbfe5d7df88be13fdb8235943c3481bf93" exitCode=0 Oct 09 16:03:39 crc kubenswrapper[4719]: I1009 16:03:39.103007 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kg57n" 
event={"ID":"6e069a05-1883-4a86-aba9-87d7ac06e66b","Type":"ContainerDied","Data":"2cb4658c577acfc526f53402d690f3fbfe5d7df88be13fdb8235943c3481bf93"} Oct 09 16:03:40 crc kubenswrapper[4719]: I1009 16:03:40.114293 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kg57n" event={"ID":"6e069a05-1883-4a86-aba9-87d7ac06e66b","Type":"ContainerStarted","Data":"d76397c1555cb364aa306848e06fca8c416e1d4239071bae8f1e48118b0cd25b"} Oct 09 16:03:40 crc kubenswrapper[4719]: I1009 16:03:40.133952 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kg57n" podStartSLOduration=2.443293223 podStartE2EDuration="5.133929152s" podCreationTimestamp="2025-10-09 16:03:35 +0000 UTC" firstStartedPulling="2025-10-09 16:03:37.076863019 +0000 UTC m=+2722.586574304" lastFinishedPulling="2025-10-09 16:03:39.767498948 +0000 UTC m=+2725.277210233" observedRunningTime="2025-10-09 16:03:40.129623505 +0000 UTC m=+2725.639334800" watchObservedRunningTime="2025-10-09 16:03:40.133929152 +0000 UTC m=+2725.643640447" Oct 09 16:03:45 crc kubenswrapper[4719]: I1009 16:03:45.722752 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kg57n" Oct 09 16:03:45 crc kubenswrapper[4719]: I1009 16:03:45.723553 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kg57n" Oct 09 16:03:45 crc kubenswrapper[4719]: I1009 16:03:45.773937 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kg57n" Oct 09 16:03:46 crc kubenswrapper[4719]: I1009 16:03:46.210970 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kg57n" Oct 09 16:03:46 crc kubenswrapper[4719]: I1009 16:03:46.260576 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-kg57n"] Oct 09 16:03:48 crc kubenswrapper[4719]: I1009 16:03:48.191982 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kg57n" podUID="6e069a05-1883-4a86-aba9-87d7ac06e66b" containerName="registry-server" containerID="cri-o://d76397c1555cb364aa306848e06fca8c416e1d4239071bae8f1e48118b0cd25b" gracePeriod=2 Oct 09 16:03:48 crc kubenswrapper[4719]: I1009 16:03:48.698713 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kg57n" Oct 09 16:03:48 crc kubenswrapper[4719]: I1009 16:03:48.846479 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e069a05-1883-4a86-aba9-87d7ac06e66b-utilities\") pod \"6e069a05-1883-4a86-aba9-87d7ac06e66b\" (UID: \"6e069a05-1883-4a86-aba9-87d7ac06e66b\") " Oct 09 16:03:48 crc kubenswrapper[4719]: I1009 16:03:48.846532 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flq4b\" (UniqueName: \"kubernetes.io/projected/6e069a05-1883-4a86-aba9-87d7ac06e66b-kube-api-access-flq4b\") pod \"6e069a05-1883-4a86-aba9-87d7ac06e66b\" (UID: \"6e069a05-1883-4a86-aba9-87d7ac06e66b\") " Oct 09 16:03:48 crc kubenswrapper[4719]: I1009 16:03:48.846624 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e069a05-1883-4a86-aba9-87d7ac06e66b-catalog-content\") pod \"6e069a05-1883-4a86-aba9-87d7ac06e66b\" (UID: \"6e069a05-1883-4a86-aba9-87d7ac06e66b\") " Oct 09 16:03:48 crc kubenswrapper[4719]: I1009 16:03:48.847275 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e069a05-1883-4a86-aba9-87d7ac06e66b-utilities" (OuterVolumeSpecName: "utilities") pod "6e069a05-1883-4a86-aba9-87d7ac06e66b" (UID: 
"6e069a05-1883-4a86-aba9-87d7ac06e66b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:03:48 crc kubenswrapper[4719]: I1009 16:03:48.854509 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e069a05-1883-4a86-aba9-87d7ac06e66b-kube-api-access-flq4b" (OuterVolumeSpecName: "kube-api-access-flq4b") pod "6e069a05-1883-4a86-aba9-87d7ac06e66b" (UID: "6e069a05-1883-4a86-aba9-87d7ac06e66b"). InnerVolumeSpecName "kube-api-access-flq4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 16:03:48 crc kubenswrapper[4719]: I1009 16:03:48.949667 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e069a05-1883-4a86-aba9-87d7ac06e66b-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 16:03:48 crc kubenswrapper[4719]: I1009 16:03:48.949705 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flq4b\" (UniqueName: \"kubernetes.io/projected/6e069a05-1883-4a86-aba9-87d7ac06e66b-kube-api-access-flq4b\") on node \"crc\" DevicePath \"\"" Oct 09 16:03:48 crc kubenswrapper[4719]: I1009 16:03:48.999594 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e069a05-1883-4a86-aba9-87d7ac06e66b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6e069a05-1883-4a86-aba9-87d7ac06e66b" (UID: "6e069a05-1883-4a86-aba9-87d7ac06e66b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:03:49 crc kubenswrapper[4719]: I1009 16:03:49.052517 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e069a05-1883-4a86-aba9-87d7ac06e66b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 16:03:49 crc kubenswrapper[4719]: I1009 16:03:49.207959 4719 generic.go:334] "Generic (PLEG): container finished" podID="6e069a05-1883-4a86-aba9-87d7ac06e66b" containerID="d76397c1555cb364aa306848e06fca8c416e1d4239071bae8f1e48118b0cd25b" exitCode=0 Oct 09 16:03:49 crc kubenswrapper[4719]: I1009 16:03:49.208027 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kg57n" event={"ID":"6e069a05-1883-4a86-aba9-87d7ac06e66b","Type":"ContainerDied","Data":"d76397c1555cb364aa306848e06fca8c416e1d4239071bae8f1e48118b0cd25b"} Oct 09 16:03:49 crc kubenswrapper[4719]: I1009 16:03:49.208072 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kg57n" event={"ID":"6e069a05-1883-4a86-aba9-87d7ac06e66b","Type":"ContainerDied","Data":"4141a03347c33b3ce929c55270a615c587d1e37a93a4e60599ea06c408b39bb3"} Oct 09 16:03:49 crc kubenswrapper[4719]: I1009 16:03:49.208099 4719 scope.go:117] "RemoveContainer" containerID="d76397c1555cb364aa306848e06fca8c416e1d4239071bae8f1e48118b0cd25b" Oct 09 16:03:49 crc kubenswrapper[4719]: I1009 16:03:49.208321 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kg57n" Oct 09 16:03:49 crc kubenswrapper[4719]: I1009 16:03:49.247673 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kg57n"] Oct 09 16:03:49 crc kubenswrapper[4719]: I1009 16:03:49.256801 4719 scope.go:117] "RemoveContainer" containerID="2cb4658c577acfc526f53402d690f3fbfe5d7df88be13fdb8235943c3481bf93" Oct 09 16:03:49 crc kubenswrapper[4719]: I1009 16:03:49.272292 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kg57n"] Oct 09 16:03:49 crc kubenswrapper[4719]: I1009 16:03:49.285086 4719 scope.go:117] "RemoveContainer" containerID="4be6b9932d86c33cd621cbe803f59fdd0402f1d3396cffc2a3e46383c2705cb9" Oct 09 16:03:49 crc kubenswrapper[4719]: I1009 16:03:49.327039 4719 scope.go:117] "RemoveContainer" containerID="d76397c1555cb364aa306848e06fca8c416e1d4239071bae8f1e48118b0cd25b" Oct 09 16:03:49 crc kubenswrapper[4719]: E1009 16:03:49.327474 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d76397c1555cb364aa306848e06fca8c416e1d4239071bae8f1e48118b0cd25b\": container with ID starting with d76397c1555cb364aa306848e06fca8c416e1d4239071bae8f1e48118b0cd25b not found: ID does not exist" containerID="d76397c1555cb364aa306848e06fca8c416e1d4239071bae8f1e48118b0cd25b" Oct 09 16:03:49 crc kubenswrapper[4719]: I1009 16:03:49.327508 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d76397c1555cb364aa306848e06fca8c416e1d4239071bae8f1e48118b0cd25b"} err="failed to get container status \"d76397c1555cb364aa306848e06fca8c416e1d4239071bae8f1e48118b0cd25b\": rpc error: code = NotFound desc = could not find container \"d76397c1555cb364aa306848e06fca8c416e1d4239071bae8f1e48118b0cd25b\": container with ID starting with d76397c1555cb364aa306848e06fca8c416e1d4239071bae8f1e48118b0cd25b not 
found: ID does not exist" Oct 09 16:03:49 crc kubenswrapper[4719]: I1009 16:03:49.327528 4719 scope.go:117] "RemoveContainer" containerID="2cb4658c577acfc526f53402d690f3fbfe5d7df88be13fdb8235943c3481bf93" Oct 09 16:03:49 crc kubenswrapper[4719]: E1009 16:03:49.327825 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cb4658c577acfc526f53402d690f3fbfe5d7df88be13fdb8235943c3481bf93\": container with ID starting with 2cb4658c577acfc526f53402d690f3fbfe5d7df88be13fdb8235943c3481bf93 not found: ID does not exist" containerID="2cb4658c577acfc526f53402d690f3fbfe5d7df88be13fdb8235943c3481bf93" Oct 09 16:03:49 crc kubenswrapper[4719]: I1009 16:03:49.327872 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cb4658c577acfc526f53402d690f3fbfe5d7df88be13fdb8235943c3481bf93"} err="failed to get container status \"2cb4658c577acfc526f53402d690f3fbfe5d7df88be13fdb8235943c3481bf93\": rpc error: code = NotFound desc = could not find container \"2cb4658c577acfc526f53402d690f3fbfe5d7df88be13fdb8235943c3481bf93\": container with ID starting with 2cb4658c577acfc526f53402d690f3fbfe5d7df88be13fdb8235943c3481bf93 not found: ID does not exist" Oct 09 16:03:49 crc kubenswrapper[4719]: I1009 16:03:49.327906 4719 scope.go:117] "RemoveContainer" containerID="4be6b9932d86c33cd621cbe803f59fdd0402f1d3396cffc2a3e46383c2705cb9" Oct 09 16:03:49 crc kubenswrapper[4719]: E1009 16:03:49.328203 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4be6b9932d86c33cd621cbe803f59fdd0402f1d3396cffc2a3e46383c2705cb9\": container with ID starting with 4be6b9932d86c33cd621cbe803f59fdd0402f1d3396cffc2a3e46383c2705cb9 not found: ID does not exist" containerID="4be6b9932d86c33cd621cbe803f59fdd0402f1d3396cffc2a3e46383c2705cb9" Oct 09 16:03:49 crc kubenswrapper[4719]: I1009 16:03:49.328228 4719 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4be6b9932d86c33cd621cbe803f59fdd0402f1d3396cffc2a3e46383c2705cb9"} err="failed to get container status \"4be6b9932d86c33cd621cbe803f59fdd0402f1d3396cffc2a3e46383c2705cb9\": rpc error: code = NotFound desc = could not find container \"4be6b9932d86c33cd621cbe803f59fdd0402f1d3396cffc2a3e46383c2705cb9\": container with ID starting with 4be6b9932d86c33cd621cbe803f59fdd0402f1d3396cffc2a3e46383c2705cb9 not found: ID does not exist" Oct 09 16:03:51 crc kubenswrapper[4719]: I1009 16:03:51.179739 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e069a05-1883-4a86-aba9-87d7ac06e66b" path="/var/lib/kubelet/pods/6e069a05-1883-4a86-aba9-87d7ac06e66b/volumes" Oct 09 16:04:13 crc kubenswrapper[4719]: I1009 16:04:13.438968 4719 generic.go:334] "Generic (PLEG): container finished" podID="0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a" containerID="96579fad0d128493a63b17acfd367bcee0245bedae3079ee812a635e41dcd6c9" exitCode=0 Oct 09 16:04:13 crc kubenswrapper[4719]: I1009 16:04:13.439050 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt" event={"ID":"0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a","Type":"ContainerDied","Data":"96579fad0d128493a63b17acfd367bcee0245bedae3079ee812a635e41dcd6c9"} Oct 09 16:04:14 crc kubenswrapper[4719]: I1009 16:04:14.879666 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt" Oct 09 16:04:15 crc kubenswrapper[4719]: I1009 16:04:15.002821 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-ceilometer-compute-config-data-0\") pod \"0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a\" (UID: \"0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a\") " Oct 09 16:04:15 crc kubenswrapper[4719]: I1009 16:04:15.003214 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5hm8\" (UniqueName: \"kubernetes.io/projected/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-kube-api-access-z5hm8\") pod \"0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a\" (UID: \"0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a\") " Oct 09 16:04:15 crc kubenswrapper[4719]: I1009 16:04:15.003240 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-ceilometer-compute-config-data-2\") pod \"0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a\" (UID: \"0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a\") " Oct 09 16:04:15 crc kubenswrapper[4719]: I1009 16:04:15.003265 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-inventory\") pod \"0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a\" (UID: \"0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a\") " Oct 09 16:04:15 crc kubenswrapper[4719]: I1009 16:04:15.003392 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-ceilometer-compute-config-data-1\") pod \"0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a\" (UID: \"0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a\") " Oct 09 16:04:15 crc 
kubenswrapper[4719]: I1009 16:04:15.003447 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-telemetry-combined-ca-bundle\") pod \"0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a\" (UID: \"0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a\") " Oct 09 16:04:15 crc kubenswrapper[4719]: I1009 16:04:15.003479 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-ssh-key\") pod \"0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a\" (UID: \"0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a\") " Oct 09 16:04:15 crc kubenswrapper[4719]: I1009 16:04:15.009541 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a" (UID: "0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 16:04:15 crc kubenswrapper[4719]: I1009 16:04:15.010267 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-kube-api-access-z5hm8" (OuterVolumeSpecName: "kube-api-access-z5hm8") pod "0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a" (UID: "0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a"). InnerVolumeSpecName "kube-api-access-z5hm8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 16:04:15 crc kubenswrapper[4719]: I1009 16:04:15.032972 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a" (UID: "0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 16:04:15 crc kubenswrapper[4719]: I1009 16:04:15.035319 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a" (UID: "0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 16:04:15 crc kubenswrapper[4719]: I1009 16:04:15.036207 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a" (UID: "0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 16:04:15 crc kubenswrapper[4719]: I1009 16:04:15.040253 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a" (UID: "0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 16:04:15 crc kubenswrapper[4719]: I1009 16:04:15.050857 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-inventory" (OuterVolumeSpecName: "inventory") pod "0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a" (UID: "0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 16:04:15 crc kubenswrapper[4719]: I1009 16:04:15.106427 4719 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 09 16:04:15 crc kubenswrapper[4719]: I1009 16:04:15.106476 4719 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 16:04:15 crc kubenswrapper[4719]: I1009 16:04:15.106495 4719 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 16:04:15 crc kubenswrapper[4719]: I1009 16:04:15.106510 4719 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 09 16:04:15 crc kubenswrapper[4719]: I1009 16:04:15.106524 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5hm8\" (UniqueName: \"kubernetes.io/projected/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-kube-api-access-z5hm8\") on node \"crc\" DevicePath \"\"" Oct 09 16:04:15 crc kubenswrapper[4719]: I1009 16:04:15.106537 4719 reconciler_common.go:293] 
"Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Oct 09 16:04:15 crc kubenswrapper[4719]: I1009 16:04:15.106549 4719 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 16:04:15 crc kubenswrapper[4719]: I1009 16:04:15.459421 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt" event={"ID":"0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a","Type":"ContainerDied","Data":"ba351ed4ff6b20f3ba67d0629c7cc61def443221402303b15ccda9dbfe71a67c"} Oct 09 16:04:15 crc kubenswrapper[4719]: I1009 16:04:15.459463 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba351ed4ff6b20f3ba67d0629c7cc61def443221402303b15ccda9dbfe71a67c" Oct 09 16:04:15 crc kubenswrapper[4719]: I1009 16:04:15.459470 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt" Oct 09 16:04:23 crc kubenswrapper[4719]: E1009 16:04:23.312955 4719 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d3cdd16_36d0_40d7_8f12_62c79d0e0c9a.slice\": RecentStats: unable to find data in memory cache]" Oct 09 16:04:33 crc kubenswrapper[4719]: E1009 16:04:33.597935 4719 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d3cdd16_36d0_40d7_8f12_62c79d0e0c9a.slice\": RecentStats: unable to find data in memory cache]" Oct 09 16:04:43 crc kubenswrapper[4719]: E1009 16:04:43.871343 4719 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d3cdd16_36d0_40d7_8f12_62c79d0e0c9a.slice\": RecentStats: unable to find data in memory cache]" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.240609 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Oct 09 16:04:50 crc kubenswrapper[4719]: E1009 16:04:50.241791 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e069a05-1883-4a86-aba9-87d7ac06e66b" containerName="registry-server" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.241809 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e069a05-1883-4a86-aba9-87d7ac06e66b" containerName="registry-server" Oct 09 16:04:50 crc kubenswrapper[4719]: E1009 16:04:50.241833 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.241843 4719 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 09 16:04:50 crc kubenswrapper[4719]: E1009 16:04:50.241866 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e069a05-1883-4a86-aba9-87d7ac06e66b" containerName="extract-utilities" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.241873 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e069a05-1883-4a86-aba9-87d7ac06e66b" containerName="extract-utilities" Oct 09 16:04:50 crc kubenswrapper[4719]: E1009 16:04:50.241908 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e069a05-1883-4a86-aba9-87d7ac06e66b" containerName="extract-content" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.241916 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e069a05-1883-4a86-aba9-87d7ac06e66b" containerName="extract-content" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.242131 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e069a05-1883-4a86-aba9-87d7ac06e66b" containerName="registry-server" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.242168 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.247995 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.254103 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.262936 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.346608 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-nfs-0"] Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.348786 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.352778 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-nfs-config-data" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.369203 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-lib-modules\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.369250 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-dev\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.369307 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-config-data-custom\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " 
pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.369406 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-config-data\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.369431 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.369445 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.369473 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.369492 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 
16:04:50.369513 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-scripts\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.369533 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.369556 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-run\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.369575 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqqc8\" (UniqueName: \"kubernetes.io/projected/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-kube-api-access-lqqc8\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.369614 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.369634 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-etc-nvme\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.369653 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-sys\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.379280 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-0"] Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.399868 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.402499 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.405006 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-nfs-2-config-data" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.412707 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.471922 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-etc-nvme\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.471981 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/c3633b1f-2c6e-4483-8255-71551f8a25db-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.472053 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-sys\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.472078 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/b95abe4c-159b-460a-b238-3be4b341ccc2-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.472093 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b95abe4c-159b-460a-b238-3be4b341ccc2-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.472164 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3633b1f-2c6e-4483-8255-71551f8a25db-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.472189 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c3633b1f-2c6e-4483-8255-71551f8a25db-etc-nvme\") pod 
\"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.472280 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b95abe4c-159b-460a-b238-3be4b341ccc2-sys\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.472314 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c3633b1f-2c6e-4483-8255-71551f8a25db-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.472335 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b95abe4c-159b-460a-b238-3be4b341ccc2-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.472423 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jljvb\" (UniqueName: \"kubernetes.io/projected/c3633b1f-2c6e-4483-8255-71551f8a25db-kube-api-access-jljvb\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.472211 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-sys\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " 
pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.472510 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c3633b1f-2c6e-4483-8255-71551f8a25db-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.472567 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-lib-modules\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.472611 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b95abe4c-159b-460a-b238-3be4b341ccc2-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.472613 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-lib-modules\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.472692 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-dev\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.472707 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" 
(UniqueName: \"kubernetes.io/host-path/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-etc-nvme\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.472726 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-dev\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.472820 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b95abe4c-159b-460a-b238-3be4b341ccc2-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.472882 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3633b1f-2c6e-4483-8255-71551f8a25db-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.472932 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c3633b1f-2c6e-4483-8255-71551f8a25db-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.473036 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-config-data-custom\") pod \"cinder-backup-0\" (UID: 
\"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.473081 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c3633b1f-2c6e-4483-8255-71551f8a25db-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.473122 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b95abe4c-159b-460a-b238-3be4b341ccc2-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.473155 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b95abe4c-159b-460a-b238-3be4b341ccc2-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.473383 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-config-data\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.473686 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc 
kubenswrapper[4719]: I1009 16:04:50.473725 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.473753 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c3633b1f-2c6e-4483-8255-71551f8a25db-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.474044 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.474581 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.474632 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.474674 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.474691 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b95abe4c-159b-460a-b238-3be4b341ccc2-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.474773 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-scripts\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.474800 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjdnf\" (UniqueName: \"kubernetes.io/projected/b95abe4c-159b-460a-b238-3be4b341ccc2-kube-api-access-xjdnf\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.474815 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.474839 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " 
pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.474862 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.474909 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b95abe4c-159b-460a-b238-3be4b341ccc2-run\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.474950 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-run\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.474997 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqqc8\" (UniqueName: \"kubernetes.io/projected/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-kube-api-access-lqqc8\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.475015 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-run\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.475025 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/c3633b1f-2c6e-4483-8255-71551f8a25db-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.475054 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3633b1f-2c6e-4483-8255-71551f8a25db-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.475072 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b95abe4c-159b-460a-b238-3be4b341ccc2-dev\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.475206 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c3633b1f-2c6e-4483-8255-71551f8a25db-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.475278 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3633b1f-2c6e-4483-8255-71551f8a25db-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.475320 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: 
\"kubernetes.io/host-path/b95abe4c-159b-460a-b238-3be4b341ccc2-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.475343 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b95abe4c-159b-460a-b238-3be4b341ccc2-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.475396 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.475428 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c3633b1f-2c6e-4483-8255-71551f8a25db-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.475445 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b95abe4c-159b-460a-b238-3be4b341ccc2-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.475813 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-var-locks-cinder\") pod \"cinder-backup-0\" (UID: 
\"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.481067 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-scripts\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.481196 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.482013 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-config-data-custom\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.482904 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-config-data\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.492042 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqqc8\" (UniqueName: \"kubernetes.io/projected/7fd9ad9a-1651-46cc-9c22-adae6a548ef8-kube-api-access-lqqc8\") pod \"cinder-backup-0\" (UID: \"7fd9ad9a-1651-46cc-9c22-adae6a548ef8\") " pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.573760 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.577607 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c3633b1f-2c6e-4483-8255-71551f8a25db-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.577644 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3633b1f-2c6e-4483-8255-71551f8a25db-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.577667 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/b95abe4c-159b-460a-b238-3be4b341ccc2-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.577685 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b95abe4c-159b-460a-b238-3be4b341ccc2-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.577712 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c3633b1f-2c6e-4483-8255-71551f8a25db-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.577730 4719 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b95abe4c-159b-460a-b238-3be4b341ccc2-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.577734 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c3633b1f-2c6e-4483-8255-71551f8a25db-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.577745 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c3633b1f-2c6e-4483-8255-71551f8a25db-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.577767 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c3633b1f-2c6e-4483-8255-71551f8a25db-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.577789 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/b95abe4c-159b-460a-b238-3be4b341ccc2-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.577805 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/b95abe4c-159b-460a-b238-3be4b341ccc2-var-lib-cinder\") 
pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.577806 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b95abe4c-159b-460a-b238-3be4b341ccc2-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.577833 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3633b1f-2c6e-4483-8255-71551f8a25db-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.577846 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c3633b1f-2c6e-4483-8255-71551f8a25db-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.577871 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b95abe4c-159b-460a-b238-3be4b341ccc2-sys\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.577891 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c3633b1f-2c6e-4483-8255-71551f8a25db-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 
16:04:50.577906 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b95abe4c-159b-460a-b238-3be4b341ccc2-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.577932 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jljvb\" (UniqueName: \"kubernetes.io/projected/c3633b1f-2c6e-4483-8255-71551f8a25db-kube-api-access-jljvb\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.577949 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c3633b1f-2c6e-4483-8255-71551f8a25db-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.577966 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b95abe4c-159b-460a-b238-3be4b341ccc2-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.577986 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b95abe4c-159b-460a-b238-3be4b341ccc2-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.578004 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c3633b1f-2c6e-4483-8255-71551f8a25db-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.578021 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c3633b1f-2c6e-4483-8255-71551f8a25db-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.578048 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c3633b1f-2c6e-4483-8255-71551f8a25db-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.578064 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b95abe4c-159b-460a-b238-3be4b341ccc2-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.578080 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b95abe4c-159b-460a-b238-3be4b341ccc2-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.578120 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c3633b1f-2c6e-4483-8255-71551f8a25db-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " 
pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.578147 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b95abe4c-159b-460a-b238-3be4b341ccc2-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.578177 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjdnf\" (UniqueName: \"kubernetes.io/projected/b95abe4c-159b-460a-b238-3be4b341ccc2-kube-api-access-xjdnf\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.578202 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b95abe4c-159b-460a-b238-3be4b341ccc2-run\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.578260 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c3633b1f-2c6e-4483-8255-71551f8a25db-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.578276 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b95abe4c-159b-460a-b238-3be4b341ccc2-dev\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.578292 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/c3633b1f-2c6e-4483-8255-71551f8a25db-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.578290 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/b95abe4c-159b-460a-b238-3be4b341ccc2-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.578735 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b95abe4c-159b-460a-b238-3be4b341ccc2-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.578773 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c3633b1f-2c6e-4483-8255-71551f8a25db-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.578807 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c3633b1f-2c6e-4483-8255-71551f8a25db-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.578953 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b95abe4c-159b-460a-b238-3be4b341ccc2-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " 
pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.579031 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c3633b1f-2c6e-4483-8255-71551f8a25db-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.579078 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b95abe4c-159b-460a-b238-3be4b341ccc2-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.579610 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b95abe4c-159b-460a-b238-3be4b341ccc2-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.579673 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c3633b1f-2c6e-4483-8255-71551f8a25db-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.579702 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c3633b1f-2c6e-4483-8255-71551f8a25db-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.579757 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/c3633b1f-2c6e-4483-8255-71551f8a25db-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.579792 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b95abe4c-159b-460a-b238-3be4b341ccc2-sys\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.579831 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c3633b1f-2c6e-4483-8255-71551f8a25db-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.579865 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c3633b1f-2c6e-4483-8255-71551f8a25db-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.580025 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b95abe4c-159b-460a-b238-3be4b341ccc2-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.580063 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b95abe4c-159b-460a-b238-3be4b341ccc2-dev\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc 
kubenswrapper[4719]: I1009 16:04:50.580093 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b95abe4c-159b-460a-b238-3be4b341ccc2-run\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.581599 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b95abe4c-159b-460a-b238-3be4b341ccc2-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.582959 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b95abe4c-159b-460a-b238-3be4b341ccc2-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.583111 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b95abe4c-159b-460a-b238-3be4b341ccc2-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.583884 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3633b1f-2c6e-4483-8255-71551f8a25db-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.584162 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3633b1f-2c6e-4483-8255-71551f8a25db-config-data-custom\") pod 
\"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.584490 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3633b1f-2c6e-4483-8255-71551f8a25db-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.585793 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b95abe4c-159b-460a-b238-3be4b341ccc2-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.587069 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3633b1f-2c6e-4483-8255-71551f8a25db-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.598324 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jljvb\" (UniqueName: \"kubernetes.io/projected/c3633b1f-2c6e-4483-8255-71551f8a25db-kube-api-access-jljvb\") pod \"cinder-volume-nfs-2-0\" (UID: \"c3633b1f-2c6e-4483-8255-71551f8a25db\") " pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.604146 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjdnf\" (UniqueName: \"kubernetes.io/projected/b95abe4c-159b-460a-b238-3be4b341ccc2-kube-api-access-xjdnf\") pod \"cinder-volume-nfs-0\" (UID: \"b95abe4c-159b-460a-b238-3be4b341ccc2\") " pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc 
kubenswrapper[4719]: I1009 16:04:50.668540 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:50 crc kubenswrapper[4719]: I1009 16:04:50.766114 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:04:51 crc kubenswrapper[4719]: W1009 16:04:51.261803 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7fd9ad9a_1651_46cc_9c22_adae6a548ef8.slice/crio-faebcbc905537279fb3115d58b8dc6c56ebde12494a55c5e33d22fed92b35576 WatchSource:0}: Error finding container faebcbc905537279fb3115d58b8dc6c56ebde12494a55c5e33d22fed92b35576: Status 404 returned error can't find the container with id faebcbc905537279fb3115d58b8dc6c56ebde12494a55c5e33d22fed92b35576 Oct 09 16:04:51 crc kubenswrapper[4719]: I1009 16:04:51.262775 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 09 16:04:51 crc kubenswrapper[4719]: I1009 16:04:51.267932 4719 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 16:04:51 crc kubenswrapper[4719]: I1009 16:04:51.387804 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-0"] Oct 09 16:04:51 crc kubenswrapper[4719]: I1009 16:04:51.554605 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Oct 09 16:04:51 crc kubenswrapper[4719]: I1009 16:04:51.835534 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"b95abe4c-159b-460a-b238-3be4b341ccc2","Type":"ContainerStarted","Data":"e8fdb70743abbd1e53e13ab3fb8ffea9a37ef9d0ed40ac56c06d71c85e213d4c"} Oct 09 16:04:51 crc kubenswrapper[4719]: I1009 16:04:51.836879 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" 
event={"ID":"7fd9ad9a-1651-46cc-9c22-adae6a548ef8","Type":"ContainerStarted","Data":"faebcbc905537279fb3115d58b8dc6c56ebde12494a55c5e33d22fed92b35576"} Oct 09 16:04:51 crc kubenswrapper[4719]: I1009 16:04:51.840274 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"c3633b1f-2c6e-4483-8255-71551f8a25db","Type":"ContainerStarted","Data":"c91606205a121669fc2b0ee906953cab4c1067801cbd97f2497e343ab41e2a17"} Oct 09 16:04:52 crc kubenswrapper[4719]: I1009 16:04:52.852010 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"b95abe4c-159b-460a-b238-3be4b341ccc2","Type":"ContainerStarted","Data":"db61009b87e09c55a19130c9fec5ceb65886271f9cfaae4306b5e1ea6eed2880"} Oct 09 16:04:52 crc kubenswrapper[4719]: I1009 16:04:52.852701 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"b95abe4c-159b-460a-b238-3be4b341ccc2","Type":"ContainerStarted","Data":"ce353f07a9e814fe5755aa9fb9194776b93b1a448e14ebdb6edfaa6f5a846f3e"} Oct 09 16:04:52 crc kubenswrapper[4719]: I1009 16:04:52.855303 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"7fd9ad9a-1651-46cc-9c22-adae6a548ef8","Type":"ContainerStarted","Data":"66eb2c8e5b8b0d965c540b7f853883ba484fd541d611412af3acb4ae92b73257"} Oct 09 16:04:52 crc kubenswrapper[4719]: I1009 16:04:52.855336 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"7fd9ad9a-1651-46cc-9c22-adae6a548ef8","Type":"ContainerStarted","Data":"fa889fa133ebe43f6237682128d3d46e7eb356f4a353bc8385ae33d62660199b"} Oct 09 16:04:52 crc kubenswrapper[4719]: I1009 16:04:52.857218 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"c3633b1f-2c6e-4483-8255-71551f8a25db","Type":"ContainerStarted","Data":"f95a71c55fa9d57cc5f6065a1de04d5d234ab0bba7328defd02af789733bf953"} Oct 09 16:04:52 
crc kubenswrapper[4719]: I1009 16:04:52.857245 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"c3633b1f-2c6e-4483-8255-71551f8a25db","Type":"ContainerStarted","Data":"d42ec22ebfde228e57d9484a24880301b72f9c90ff7b83575ebe7a6cb3ec7407"} Oct 09 16:04:52 crc kubenswrapper[4719]: I1009 16:04:52.890682 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-nfs-0" podStartSLOduration=2.618313997 podStartE2EDuration="2.890667286s" podCreationTimestamp="2025-10-09 16:04:50 +0000 UTC" firstStartedPulling="2025-10-09 16:04:51.466051744 +0000 UTC m=+2796.975763029" lastFinishedPulling="2025-10-09 16:04:51.738405033 +0000 UTC m=+2797.248116318" observedRunningTime="2025-10-09 16:04:52.884804629 +0000 UTC m=+2798.394515944" watchObservedRunningTime="2025-10-09 16:04:52.890667286 +0000 UTC m=+2798.400378571" Oct 09 16:04:52 crc kubenswrapper[4719]: I1009 16:04:52.940792 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-nfs-2-0" podStartSLOduration=2.940777277 podStartE2EDuration="2.940777277s" podCreationTimestamp="2025-10-09 16:04:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 16:04:52.91677167 +0000 UTC m=+2798.426482955" watchObservedRunningTime="2025-10-09 16:04:52.940777277 +0000 UTC m=+2798.450488562" Oct 09 16:04:52 crc kubenswrapper[4719]: I1009 16:04:52.948282 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.70638215 podStartE2EDuration="2.948264345s" podCreationTimestamp="2025-10-09 16:04:50 +0000 UTC" firstStartedPulling="2025-10-09 16:04:51.267522603 +0000 UTC m=+2796.777233878" lastFinishedPulling="2025-10-09 16:04:51.509404788 +0000 UTC m=+2797.019116073" observedRunningTime="2025-10-09 16:04:52.938754502 +0000 UTC m=+2798.448465787" 
watchObservedRunningTime="2025-10-09 16:04:52.948264345 +0000 UTC m=+2798.457975630" Oct 09 16:04:54 crc kubenswrapper[4719]: E1009 16:04:54.158321 4719 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d3cdd16_36d0_40d7_8f12_62c79d0e0c9a.slice\": RecentStats: unable to find data in memory cache]" Oct 09 16:04:55 crc kubenswrapper[4719]: I1009 16:04:55.574184 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Oct 09 16:04:55 crc kubenswrapper[4719]: I1009 16:04:55.668934 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-nfs-0" Oct 09 16:04:55 crc kubenswrapper[4719]: I1009 16:04:55.767424 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:05:00 crc kubenswrapper[4719]: I1009 16:05:00.770965 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Oct 09 16:05:00 crc kubenswrapper[4719]: I1009 16:05:00.951615 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-nfs-0" Oct 09 16:05:01 crc kubenswrapper[4719]: I1009 16:05:01.044697 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-nfs-2-0" Oct 09 16:05:04 crc kubenswrapper[4719]: E1009 16:05:04.424086 4719 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d3cdd16_36d0_40d7_8f12_62c79d0e0c9a.slice\": RecentStats: unable to find data in memory cache]" Oct 09 16:05:14 crc kubenswrapper[4719]: E1009 16:05:14.688359 4719 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d3cdd16_36d0_40d7_8f12_62c79d0e0c9a.slice\": RecentStats: unable to find data in memory cache]" Oct 09 16:05:36 crc kubenswrapper[4719]: I1009 16:05:36.976863 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 16:05:36 crc kubenswrapper[4719]: I1009 16:05:36.977521 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 16:05:53 crc kubenswrapper[4719]: I1009 16:05:53.930456 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 09 16:05:53 crc kubenswrapper[4719]: I1009 16:05:53.931268 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="936ff8ba-7ad7-4796-af1a-4b1cbf75f560" containerName="prometheus" containerID="cri-o://a251b00040279e44f69da7e525e52b3c57cc34b753966fadd377ad44ac45289d" gracePeriod=600 Oct 09 16:05:53 crc kubenswrapper[4719]: I1009 16:05:53.931366 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="936ff8ba-7ad7-4796-af1a-4b1cbf75f560" containerName="thanos-sidecar" containerID="cri-o://a010c0eed107a2550b83adc1060a991b99bc6bfc506ebd0f7dbc9e1b3b961505" gracePeriod=600 Oct 09 16:05:53 crc kubenswrapper[4719]: I1009 16:05:53.931448 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" 
podUID="936ff8ba-7ad7-4796-af1a-4b1cbf75f560" containerName="config-reloader" containerID="cri-o://68ee72b70100c95f49d5c64a588f6a9104cf4ddfe72e3dd2e724807481956bd4" gracePeriod=600 Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.458471 4719 generic.go:334] "Generic (PLEG): container finished" podID="936ff8ba-7ad7-4796-af1a-4b1cbf75f560" containerID="a010c0eed107a2550b83adc1060a991b99bc6bfc506ebd0f7dbc9e1b3b961505" exitCode=0 Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.458964 4719 generic.go:334] "Generic (PLEG): container finished" podID="936ff8ba-7ad7-4796-af1a-4b1cbf75f560" containerID="68ee72b70100c95f49d5c64a588f6a9104cf4ddfe72e3dd2e724807481956bd4" exitCode=0 Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.458982 4719 generic.go:334] "Generic (PLEG): container finished" podID="936ff8ba-7ad7-4796-af1a-4b1cbf75f560" containerID="a251b00040279e44f69da7e525e52b3c57cc34b753966fadd377ad44ac45289d" exitCode=0 Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.458708 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"936ff8ba-7ad7-4796-af1a-4b1cbf75f560","Type":"ContainerDied","Data":"a010c0eed107a2550b83adc1060a991b99bc6bfc506ebd0f7dbc9e1b3b961505"} Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.459045 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"936ff8ba-7ad7-4796-af1a-4b1cbf75f560","Type":"ContainerDied","Data":"68ee72b70100c95f49d5c64a588f6a9104cf4ddfe72e3dd2e724807481956bd4"} Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.459072 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"936ff8ba-7ad7-4796-af1a-4b1cbf75f560","Type":"ContainerDied","Data":"a251b00040279e44f69da7e525e52b3c57cc34b753966fadd377ad44ac45289d"} Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.735191 4719 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.848242 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-thanos-prometheus-http-client-file\") pod \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.848287 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-prometheus-metric-storage-rulefiles-0\") pod \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.848881 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1\") pod \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.848919 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-tls-assets\") pod \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.848988 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-config-out\") pod \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 
16:05:54.849063 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.849166 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkpd8\" (UniqueName: \"kubernetes.io/projected/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-kube-api-access-fkpd8\") pod \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.849193 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.849218 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-config\") pod \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.849270 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-secret-combined-ca-bundle\") pod \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.849317 4719 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-web-config\") pod \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\" (UID: \"936ff8ba-7ad7-4796-af1a-4b1cbf75f560\") " Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.849689 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "936ff8ba-7ad7-4796-af1a-4b1cbf75f560" (UID: "936ff8ba-7ad7-4796-af1a-4b1cbf75f560"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.850249 4719 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.855671 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "936ff8ba-7ad7-4796-af1a-4b1cbf75f560" (UID: "936ff8ba-7ad7-4796-af1a-4b1cbf75f560"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.857205 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "936ff8ba-7ad7-4796-af1a-4b1cbf75f560" (UID: "936ff8ba-7ad7-4796-af1a-4b1cbf75f560"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.857207 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-config-out" (OuterVolumeSpecName: "config-out") pod "936ff8ba-7ad7-4796-af1a-4b1cbf75f560" (UID: "936ff8ba-7ad7-4796-af1a-4b1cbf75f560"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.858073 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-config" (OuterVolumeSpecName: "config") pod "936ff8ba-7ad7-4796-af1a-4b1cbf75f560" (UID: "936ff8ba-7ad7-4796-af1a-4b1cbf75f560"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.858127 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "936ff8ba-7ad7-4796-af1a-4b1cbf75f560" (UID: "936ff8ba-7ad7-4796-af1a-4b1cbf75f560"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.858455 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "936ff8ba-7ad7-4796-af1a-4b1cbf75f560" (UID: "936ff8ba-7ad7-4796-af1a-4b1cbf75f560"). InnerVolumeSpecName "secret-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.858704 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "936ff8ba-7ad7-4796-af1a-4b1cbf75f560" (UID: "936ff8ba-7ad7-4796-af1a-4b1cbf75f560"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.858791 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-kube-api-access-fkpd8" (OuterVolumeSpecName: "kube-api-access-fkpd8") pod "936ff8ba-7ad7-4796-af1a-4b1cbf75f560" (UID: "936ff8ba-7ad7-4796-af1a-4b1cbf75f560"). InnerVolumeSpecName "kube-api-access-fkpd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.883953 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "936ff8ba-7ad7-4796-af1a-4b1cbf75f560" (UID: "936ff8ba-7ad7-4796-af1a-4b1cbf75f560"). InnerVolumeSpecName "pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.938929 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-web-config" (OuterVolumeSpecName: "web-config") pod "936ff8ba-7ad7-4796-af1a-4b1cbf75f560" (UID: "936ff8ba-7ad7-4796-af1a-4b1cbf75f560"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.951905 4719 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-config-out\") on node \"crc\" DevicePath \"\"" Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.951950 4719 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.951967 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkpd8\" (UniqueName: \"kubernetes.io/projected/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-kube-api-access-fkpd8\") on node \"crc\" DevicePath \"\"" Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.951982 4719 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.951996 4719 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-config\") on node \"crc\" DevicePath \"\"" Oct 09 
16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.952080 4719 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.952093 4719 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-web-config\") on node \"crc\" DevicePath \"\"" Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.952107 4719 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.952178 4719 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1\") on node \"crc\" " Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.952197 4719 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/936ff8ba-7ad7-4796-af1a-4b1cbf75f560-tls-assets\") on node \"crc\" DevicePath \"\"" Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.981985 4719 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Oct 09 16:05:54 crc kubenswrapper[4719]: I1009 16:05:54.983882 4719 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1") on node "crc" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.054908 4719 reconciler_common.go:293] "Volume detached for volume \"pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1\") on node \"crc\" DevicePath \"\"" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.479815 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"936ff8ba-7ad7-4796-af1a-4b1cbf75f560","Type":"ContainerDied","Data":"145fbdf04b1f275a452075b05971b0e2bd5e559cddecce92628d9397dd991031"} Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.479876 4719 scope.go:117] "RemoveContainer" containerID="a010c0eed107a2550b83adc1060a991b99bc6bfc506ebd0f7dbc9e1b3b961505" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.479929 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.530415 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.545176 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.545949 4719 scope.go:117] "RemoveContainer" containerID="68ee72b70100c95f49d5c64a588f6a9104cf4ddfe72e3dd2e724807481956bd4" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.572581 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 09 16:05:55 crc kubenswrapper[4719]: E1009 16:05:55.573094 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="936ff8ba-7ad7-4796-af1a-4b1cbf75f560" containerName="prometheus" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.573113 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="936ff8ba-7ad7-4796-af1a-4b1cbf75f560" containerName="prometheus" Oct 09 16:05:55 crc kubenswrapper[4719]: E1009 16:05:55.573131 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="936ff8ba-7ad7-4796-af1a-4b1cbf75f560" containerName="config-reloader" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.573141 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="936ff8ba-7ad7-4796-af1a-4b1cbf75f560" containerName="config-reloader" Oct 09 16:05:55 crc kubenswrapper[4719]: E1009 16:05:55.573163 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="936ff8ba-7ad7-4796-af1a-4b1cbf75f560" containerName="thanos-sidecar" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.573170 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="936ff8ba-7ad7-4796-af1a-4b1cbf75f560" containerName="thanos-sidecar" Oct 09 16:05:55 crc kubenswrapper[4719]: E1009 16:05:55.573190 4719 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="936ff8ba-7ad7-4796-af1a-4b1cbf75f560" containerName="init-config-reloader" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.573199 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="936ff8ba-7ad7-4796-af1a-4b1cbf75f560" containerName="init-config-reloader" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.573458 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="936ff8ba-7ad7-4796-af1a-4b1cbf75f560" containerName="thanos-sidecar" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.573489 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="936ff8ba-7ad7-4796-af1a-4b1cbf75f560" containerName="config-reloader" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.573506 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="936ff8ba-7ad7-4796-af1a-4b1cbf75f560" containerName="prometheus" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.575969 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.586332 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.586619 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.586768 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.586918 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-4npj7" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.587050 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.597556 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.600022 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.606985 4719 scope.go:117] "RemoveContainer" containerID="a251b00040279e44f69da7e525e52b3c57cc34b753966fadd377ad44ac45289d" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.649204 4719 scope.go:117] "RemoveContainer" containerID="338a1afba406d4123764fa09c2c00cb6cc9e088c9b16d3636056291c2b3a2576" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.668171 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/9cebaae5-69d4-4429-a062-aef6cafb9f4a-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"9cebaae5-69d4-4429-a062-aef6cafb9f4a\") " pod="openstack/prometheus-metric-storage-0" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.668259 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9cebaae5-69d4-4429-a062-aef6cafb9f4a-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"9cebaae5-69d4-4429-a062-aef6cafb9f4a\") " pod="openstack/prometheus-metric-storage-0" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.668868 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9cebaae5-69d4-4429-a062-aef6cafb9f4a-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"9cebaae5-69d4-4429-a062-aef6cafb9f4a\") " pod="openstack/prometheus-metric-storage-0" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.668921 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9cebaae5-69d4-4429-a062-aef6cafb9f4a-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"9cebaae5-69d4-4429-a062-aef6cafb9f4a\") " pod="openstack/prometheus-metric-storage-0" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.669011 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9cebaae5-69d4-4429-a062-aef6cafb9f4a-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"9cebaae5-69d4-4429-a062-aef6cafb9f4a\") " pod="openstack/prometheus-metric-storage-0" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 
16:05:55.669085 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1\") pod \"prometheus-metric-storage-0\" (UID: \"9cebaae5-69d4-4429-a062-aef6cafb9f4a\") " pod="openstack/prometheus-metric-storage-0" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.669246 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/9cebaae5-69d4-4429-a062-aef6cafb9f4a-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"9cebaae5-69d4-4429-a062-aef6cafb9f4a\") " pod="openstack/prometheus-metric-storage-0" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.669308 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9cebaae5-69d4-4429-a062-aef6cafb9f4a-config\") pod \"prometheus-metric-storage-0\" (UID: \"9cebaae5-69d4-4429-a062-aef6cafb9f4a\") " pod="openstack/prometheus-metric-storage-0" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.669341 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cebaae5-69d4-4429-a062-aef6cafb9f4a-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"9cebaae5-69d4-4429-a062-aef6cafb9f4a\") " pod="openstack/prometheus-metric-storage-0" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.669392 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnbv8\" (UniqueName: 
\"kubernetes.io/projected/9cebaae5-69d4-4429-a062-aef6cafb9f4a-kube-api-access-tnbv8\") pod \"prometheus-metric-storage-0\" (UID: \"9cebaae5-69d4-4429-a062-aef6cafb9f4a\") " pod="openstack/prometheus-metric-storage-0" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.669425 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9cebaae5-69d4-4429-a062-aef6cafb9f4a-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"9cebaae5-69d4-4429-a062-aef6cafb9f4a\") " pod="openstack/prometheus-metric-storage-0" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.770958 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/9cebaae5-69d4-4429-a062-aef6cafb9f4a-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"9cebaae5-69d4-4429-a062-aef6cafb9f4a\") " pod="openstack/prometheus-metric-storage-0" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.771013 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9cebaae5-69d4-4429-a062-aef6cafb9f4a-config\") pod \"prometheus-metric-storage-0\" (UID: \"9cebaae5-69d4-4429-a062-aef6cafb9f4a\") " pod="openstack/prometheus-metric-storage-0" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.771039 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cebaae5-69d4-4429-a062-aef6cafb9f4a-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"9cebaae5-69d4-4429-a062-aef6cafb9f4a\") " pod="openstack/prometheus-metric-storage-0" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.771061 4719 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-tnbv8\" (UniqueName: \"kubernetes.io/projected/9cebaae5-69d4-4429-a062-aef6cafb9f4a-kube-api-access-tnbv8\") pod \"prometheus-metric-storage-0\" (UID: \"9cebaae5-69d4-4429-a062-aef6cafb9f4a\") " pod="openstack/prometheus-metric-storage-0" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.771082 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9cebaae5-69d4-4429-a062-aef6cafb9f4a-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"9cebaae5-69d4-4429-a062-aef6cafb9f4a\") " pod="openstack/prometheus-metric-storage-0" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.771159 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/9cebaae5-69d4-4429-a062-aef6cafb9f4a-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"9cebaae5-69d4-4429-a062-aef6cafb9f4a\") " pod="openstack/prometheus-metric-storage-0" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.771186 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9cebaae5-69d4-4429-a062-aef6cafb9f4a-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"9cebaae5-69d4-4429-a062-aef6cafb9f4a\") " pod="openstack/prometheus-metric-storage-0" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.771227 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9cebaae5-69d4-4429-a062-aef6cafb9f4a-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"9cebaae5-69d4-4429-a062-aef6cafb9f4a\") " pod="openstack/prometheus-metric-storage-0" Oct 09 
16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.771250 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9cebaae5-69d4-4429-a062-aef6cafb9f4a-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"9cebaae5-69d4-4429-a062-aef6cafb9f4a\") " pod="openstack/prometheus-metric-storage-0" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.771290 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9cebaae5-69d4-4429-a062-aef6cafb9f4a-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"9cebaae5-69d4-4429-a062-aef6cafb9f4a\") " pod="openstack/prometheus-metric-storage-0" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.771332 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1\") pod \"prometheus-metric-storage-0\" (UID: \"9cebaae5-69d4-4429-a062-aef6cafb9f4a\") " pod="openstack/prometheus-metric-storage-0" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.772257 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9cebaae5-69d4-4429-a062-aef6cafb9f4a-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"9cebaae5-69d4-4429-a062-aef6cafb9f4a\") " pod="openstack/prometheus-metric-storage-0" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.775620 4719 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.775656 4719 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1\") pod \"prometheus-metric-storage-0\" (UID: \"9cebaae5-69d4-4429-a062-aef6cafb9f4a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/37a51f19e15282ab5032b2bf09c91363092e9b48becd8acf5f5419f3d47a69ff/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.779509 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/9cebaae5-69d4-4429-a062-aef6cafb9f4a-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"9cebaae5-69d4-4429-a062-aef6cafb9f4a\") " pod="openstack/prometheus-metric-storage-0" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.779897 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cebaae5-69d4-4429-a062-aef6cafb9f4a-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"9cebaae5-69d4-4429-a062-aef6cafb9f4a\") " pod="openstack/prometheus-metric-storage-0" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.780306 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9cebaae5-69d4-4429-a062-aef6cafb9f4a-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"9cebaae5-69d4-4429-a062-aef6cafb9f4a\") " pod="openstack/prometheus-metric-storage-0" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.780686 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/9cebaae5-69d4-4429-a062-aef6cafb9f4a-config\") pod \"prometheus-metric-storage-0\" (UID: \"9cebaae5-69d4-4429-a062-aef6cafb9f4a\") " pod="openstack/prometheus-metric-storage-0" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.780699 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9cebaae5-69d4-4429-a062-aef6cafb9f4a-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"9cebaae5-69d4-4429-a062-aef6cafb9f4a\") " pod="openstack/prometheus-metric-storage-0" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.783275 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9cebaae5-69d4-4429-a062-aef6cafb9f4a-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"9cebaae5-69d4-4429-a062-aef6cafb9f4a\") " pod="openstack/prometheus-metric-storage-0" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.786632 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9cebaae5-69d4-4429-a062-aef6cafb9f4a-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"9cebaae5-69d4-4429-a062-aef6cafb9f4a\") " pod="openstack/prometheus-metric-storage-0" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.800182 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/9cebaae5-69d4-4429-a062-aef6cafb9f4a-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"9cebaae5-69d4-4429-a062-aef6cafb9f4a\") " pod="openstack/prometheus-metric-storage-0" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.800763 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-tnbv8\" (UniqueName: \"kubernetes.io/projected/9cebaae5-69d4-4429-a062-aef6cafb9f4a-kube-api-access-tnbv8\") pod \"prometheus-metric-storage-0\" (UID: \"9cebaae5-69d4-4429-a062-aef6cafb9f4a\") " pod="openstack/prometheus-metric-storage-0" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.837840 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc309e3-6788-4c9b-a012-eab4f39ddcb1\") pod \"prometheus-metric-storage-0\" (UID: \"9cebaae5-69d4-4429-a062-aef6cafb9f4a\") " pod="openstack/prometheus-metric-storage-0" Oct 09 16:05:55 crc kubenswrapper[4719]: I1009 16:05:55.918335 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 09 16:05:56 crc kubenswrapper[4719]: I1009 16:05:56.429038 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 09 16:05:56 crc kubenswrapper[4719]: I1009 16:05:56.489696 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9cebaae5-69d4-4429-a062-aef6cafb9f4a","Type":"ContainerStarted","Data":"96672727c0e933a63dbac261b3bc57a2251525821918ecd28e733571353315b2"} Oct 09 16:05:57 crc kubenswrapper[4719]: I1009 16:05:57.173657 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="936ff8ba-7ad7-4796-af1a-4b1cbf75f560" path="/var/lib/kubelet/pods/936ff8ba-7ad7-4796-af1a-4b1cbf75f560/volumes" Oct 09 16:05:59 crc kubenswrapper[4719]: I1009 16:05:59.518539 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9cebaae5-69d4-4429-a062-aef6cafb9f4a","Type":"ContainerStarted","Data":"df745e132d3ea9100b0f2fa3c66989b1af1cf1a2d8ff275c6c0606f8639e7db3"} Oct 09 16:06:06 crc kubenswrapper[4719]: I1009 16:06:06.594043 4719 generic.go:334] "Generic 
(PLEG): container finished" podID="9cebaae5-69d4-4429-a062-aef6cafb9f4a" containerID="df745e132d3ea9100b0f2fa3c66989b1af1cf1a2d8ff275c6c0606f8639e7db3" exitCode=0 Oct 09 16:06:06 crc kubenswrapper[4719]: I1009 16:06:06.594264 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9cebaae5-69d4-4429-a062-aef6cafb9f4a","Type":"ContainerDied","Data":"df745e132d3ea9100b0f2fa3c66989b1af1cf1a2d8ff275c6c0606f8639e7db3"} Oct 09 16:06:06 crc kubenswrapper[4719]: I1009 16:06:06.976475 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 16:06:06 crc kubenswrapper[4719]: I1009 16:06:06.977088 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 16:06:07 crc kubenswrapper[4719]: I1009 16:06:07.625645 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9cebaae5-69d4-4429-a062-aef6cafb9f4a","Type":"ContainerStarted","Data":"55c5d67c1e7da3efedfa63b09926bb0d6b5f2cc71325446ba5a0b0c20f2268a8"} Oct 09 16:06:10 crc kubenswrapper[4719]: I1009 16:06:10.657880 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9cebaae5-69d4-4429-a062-aef6cafb9f4a","Type":"ContainerStarted","Data":"824a83a9eaeb00c89b7df16088acc0f4d5b551a5182dbb0b4ad92a9df1084b51"} Oct 09 16:06:10 crc kubenswrapper[4719]: I1009 16:06:10.658505 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/prometheus-metric-storage-0" event={"ID":"9cebaae5-69d4-4429-a062-aef6cafb9f4a","Type":"ContainerStarted","Data":"e0ed9c5daca9e49071602b8e36dd7a2eb670cd3da227cd7e02eae43a689632c1"} Oct 09 16:06:10 crc kubenswrapper[4719]: I1009 16:06:10.683955 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=15.683937194 podStartE2EDuration="15.683937194s" podCreationTimestamp="2025-10-09 16:05:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 16:06:10.683120639 +0000 UTC m=+2876.192831924" watchObservedRunningTime="2025-10-09 16:06:10.683937194 +0000 UTC m=+2876.193648479" Oct 09 16:06:10 crc kubenswrapper[4719]: I1009 16:06:10.919560 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 09 16:06:10 crc kubenswrapper[4719]: I1009 16:06:10.919609 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 09 16:06:10 crc kubenswrapper[4719]: I1009 16:06:10.925742 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 09 16:06:11 crc kubenswrapper[4719]: I1009 16:06:11.670427 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 09 16:06:36 crc kubenswrapper[4719]: I1009 16:06:36.977098 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 16:06:36 crc kubenswrapper[4719]: I1009 16:06:36.978528 4719 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 16:06:36 crc kubenswrapper[4719]: I1009 16:06:36.978609 4719 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" Oct 09 16:06:36 crc kubenswrapper[4719]: I1009 16:06:36.979436 4719 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"920ea73af6d2fdd926cecc482b1dc2188636a4f6e8a6fe0fff4b95bcf354a955"} pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 16:06:36 crc kubenswrapper[4719]: I1009 16:06:36.979506 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" containerID="cri-o://920ea73af6d2fdd926cecc482b1dc2188636a4f6e8a6fe0fff4b95bcf354a955" gracePeriod=600 Oct 09 16:06:37 crc kubenswrapper[4719]: I1009 16:06:37.639609 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Oct 09 16:06:37 crc kubenswrapper[4719]: I1009 16:06:37.642820 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 09 16:06:37 crc kubenswrapper[4719]: I1009 16:06:37.645289 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Oct 09 16:06:37 crc kubenswrapper[4719]: I1009 16:06:37.646167 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-vfn4v" Oct 09 16:06:37 crc kubenswrapper[4719]: I1009 16:06:37.646473 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Oct 09 16:06:37 crc kubenswrapper[4719]: I1009 16:06:37.646696 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 09 16:06:37 crc kubenswrapper[4719]: I1009 16:06:37.659186 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 09 16:06:37 crc kubenswrapper[4719]: I1009 16:06:37.698947 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"8905824a-8f15-4df7-b938-b63f2a5aebb1\") " pod="openstack/tempest-tests-tempest" Oct 09 16:06:37 crc kubenswrapper[4719]: I1009 16:06:37.699007 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/8905824a-8f15-4df7-b938-b63f2a5aebb1-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"8905824a-8f15-4df7-b938-b63f2a5aebb1\") " pod="openstack/tempest-tests-tempest" Oct 09 16:06:37 crc kubenswrapper[4719]: I1009 16:06:37.699232 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8905824a-8f15-4df7-b938-b63f2a5aebb1-ssh-key\") pod 
\"tempest-tests-tempest\" (UID: \"8905824a-8f15-4df7-b938-b63f2a5aebb1\") " pod="openstack/tempest-tests-tempest" Oct 09 16:06:37 crc kubenswrapper[4719]: I1009 16:06:37.699266 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/8905824a-8f15-4df7-b938-b63f2a5aebb1-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"8905824a-8f15-4df7-b938-b63f2a5aebb1\") " pod="openstack/tempest-tests-tempest" Oct 09 16:06:37 crc kubenswrapper[4719]: I1009 16:06:37.699331 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/8905824a-8f15-4df7-b938-b63f2a5aebb1-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"8905824a-8f15-4df7-b938-b63f2a5aebb1\") " pod="openstack/tempest-tests-tempest" Oct 09 16:06:37 crc kubenswrapper[4719]: I1009 16:06:37.699421 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8905824a-8f15-4df7-b938-b63f2a5aebb1-config-data\") pod \"tempest-tests-tempest\" (UID: \"8905824a-8f15-4df7-b938-b63f2a5aebb1\") " pod="openstack/tempest-tests-tempest" Oct 09 16:06:37 crc kubenswrapper[4719]: I1009 16:06:37.699549 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8905824a-8f15-4df7-b938-b63f2a5aebb1-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"8905824a-8f15-4df7-b938-b63f2a5aebb1\") " pod="openstack/tempest-tests-tempest" Oct 09 16:06:37 crc kubenswrapper[4719]: I1009 16:06:37.699642 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/8905824a-8f15-4df7-b938-b63f2a5aebb1-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"8905824a-8f15-4df7-b938-b63f2a5aebb1\") " pod="openstack/tempest-tests-tempest" Oct 09 16:06:37 crc kubenswrapper[4719]: I1009 16:06:37.699740 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ch8x\" (UniqueName: \"kubernetes.io/projected/8905824a-8f15-4df7-b938-b63f2a5aebb1-kube-api-access-7ch8x\") pod \"tempest-tests-tempest\" (UID: \"8905824a-8f15-4df7-b938-b63f2a5aebb1\") " pod="openstack/tempest-tests-tempest" Oct 09 16:06:37 crc kubenswrapper[4719]: I1009 16:06:37.801659 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8905824a-8f15-4df7-b938-b63f2a5aebb1-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"8905824a-8f15-4df7-b938-b63f2a5aebb1\") " pod="openstack/tempest-tests-tempest" Oct 09 16:06:37 crc kubenswrapper[4719]: I1009 16:06:37.802051 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/8905824a-8f15-4df7-b938-b63f2a5aebb1-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"8905824a-8f15-4df7-b938-b63f2a5aebb1\") " pod="openstack/tempest-tests-tempest" Oct 09 16:06:37 crc kubenswrapper[4719]: I1009 16:06:37.802089 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/8905824a-8f15-4df7-b938-b63f2a5aebb1-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"8905824a-8f15-4df7-b938-b63f2a5aebb1\") " pod="openstack/tempest-tests-tempest" Oct 09 16:06:37 crc kubenswrapper[4719]: I1009 16:06:37.802134 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8905824a-8f15-4df7-b938-b63f2a5aebb1-config-data\") pod 
\"tempest-tests-tempest\" (UID: \"8905824a-8f15-4df7-b938-b63f2a5aebb1\") " pod="openstack/tempest-tests-tempest" Oct 09 16:06:37 crc kubenswrapper[4719]: I1009 16:06:37.802170 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8905824a-8f15-4df7-b938-b63f2a5aebb1-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"8905824a-8f15-4df7-b938-b63f2a5aebb1\") " pod="openstack/tempest-tests-tempest" Oct 09 16:06:37 crc kubenswrapper[4719]: I1009 16:06:37.802192 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8905824a-8f15-4df7-b938-b63f2a5aebb1-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"8905824a-8f15-4df7-b938-b63f2a5aebb1\") " pod="openstack/tempest-tests-tempest" Oct 09 16:06:37 crc kubenswrapper[4719]: I1009 16:06:37.802220 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ch8x\" (UniqueName: \"kubernetes.io/projected/8905824a-8f15-4df7-b938-b63f2a5aebb1-kube-api-access-7ch8x\") pod \"tempest-tests-tempest\" (UID: \"8905824a-8f15-4df7-b938-b63f2a5aebb1\") " pod="openstack/tempest-tests-tempest" Oct 09 16:06:37 crc kubenswrapper[4719]: I1009 16:06:37.802278 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"8905824a-8f15-4df7-b938-b63f2a5aebb1\") " pod="openstack/tempest-tests-tempest" Oct 09 16:06:37 crc kubenswrapper[4719]: I1009 16:06:37.802298 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/8905824a-8f15-4df7-b938-b63f2a5aebb1-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"8905824a-8f15-4df7-b938-b63f2a5aebb1\") " 
pod="openstack/tempest-tests-tempest" Oct 09 16:06:37 crc kubenswrapper[4719]: I1009 16:06:37.802646 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/8905824a-8f15-4df7-b938-b63f2a5aebb1-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"8905824a-8f15-4df7-b938-b63f2a5aebb1\") " pod="openstack/tempest-tests-tempest" Oct 09 16:06:37 crc kubenswrapper[4719]: I1009 16:06:37.802799 4719 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"8905824a-8f15-4df7-b938-b63f2a5aebb1\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/tempest-tests-tempest" Oct 09 16:06:37 crc kubenswrapper[4719]: I1009 16:06:37.803435 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8905824a-8f15-4df7-b938-b63f2a5aebb1-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"8905824a-8f15-4df7-b938-b63f2a5aebb1\") " pod="openstack/tempest-tests-tempest" Oct 09 16:06:37 crc kubenswrapper[4719]: I1009 16:06:37.803742 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/8905824a-8f15-4df7-b938-b63f2a5aebb1-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"8905824a-8f15-4df7-b938-b63f2a5aebb1\") " pod="openstack/tempest-tests-tempest" Oct 09 16:06:37 crc kubenswrapper[4719]: I1009 16:06:37.804049 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8905824a-8f15-4df7-b938-b63f2a5aebb1-config-data\") pod \"tempest-tests-tempest\" (UID: \"8905824a-8f15-4df7-b938-b63f2a5aebb1\") " pod="openstack/tempest-tests-tempest" Oct 09 16:06:37 crc 
kubenswrapper[4719]: I1009 16:06:37.808233 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8905824a-8f15-4df7-b938-b63f2a5aebb1-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"8905824a-8f15-4df7-b938-b63f2a5aebb1\") " pod="openstack/tempest-tests-tempest" Oct 09 16:06:37 crc kubenswrapper[4719]: I1009 16:06:37.809265 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/8905824a-8f15-4df7-b938-b63f2a5aebb1-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"8905824a-8f15-4df7-b938-b63f2a5aebb1\") " pod="openstack/tempest-tests-tempest" Oct 09 16:06:37 crc kubenswrapper[4719]: I1009 16:06:37.810099 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8905824a-8f15-4df7-b938-b63f2a5aebb1-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"8905824a-8f15-4df7-b938-b63f2a5aebb1\") " pod="openstack/tempest-tests-tempest" Oct 09 16:06:37 crc kubenswrapper[4719]: I1009 16:06:37.835666 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ch8x\" (UniqueName: \"kubernetes.io/projected/8905824a-8f15-4df7-b938-b63f2a5aebb1-kube-api-access-7ch8x\") pod \"tempest-tests-tempest\" (UID: \"8905824a-8f15-4df7-b938-b63f2a5aebb1\") " pod="openstack/tempest-tests-tempest" Oct 09 16:06:37 crc kubenswrapper[4719]: I1009 16:06:37.849720 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"8905824a-8f15-4df7-b938-b63f2a5aebb1\") " pod="openstack/tempest-tests-tempest" Oct 09 16:06:37 crc kubenswrapper[4719]: I1009 16:06:37.924120 4719 generic.go:334] "Generic (PLEG): container finished" podID="99353559-5b0b-4a9e-b759-0321ef3a8a71" 
containerID="920ea73af6d2fdd926cecc482b1dc2188636a4f6e8a6fe0fff4b95bcf354a955" exitCode=0 Oct 09 16:06:37 crc kubenswrapper[4719]: I1009 16:06:37.924172 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerDied","Data":"920ea73af6d2fdd926cecc482b1dc2188636a4f6e8a6fe0fff4b95bcf354a955"} Oct 09 16:06:37 crc kubenswrapper[4719]: I1009 16:06:37.924203 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerStarted","Data":"e3e34c67e5c773c761040d5a93136e182c5c1d1a01f09e8ee183de43fdb02e6e"} Oct 09 16:06:37 crc kubenswrapper[4719]: I1009 16:06:37.924220 4719 scope.go:117] "RemoveContainer" containerID="ea4fd9c18f02a0999586973814878f55184063f0958e1fb25fd19302f9bb81f9" Oct 09 16:06:37 crc kubenswrapper[4719]: I1009 16:06:37.962274 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 09 16:06:38 crc kubenswrapper[4719]: I1009 16:06:38.426125 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 09 16:06:38 crc kubenswrapper[4719]: W1009 16:06:38.429751 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8905824a_8f15_4df7_b938_b63f2a5aebb1.slice/crio-2f68ed9af2b0741aa503f9e940fa5a4c11805bd4e49ec880fb00e2ec7ed1b8d3 WatchSource:0}: Error finding container 2f68ed9af2b0741aa503f9e940fa5a4c11805bd4e49ec880fb00e2ec7ed1b8d3: Status 404 returned error can't find the container with id 2f68ed9af2b0741aa503f9e940fa5a4c11805bd4e49ec880fb00e2ec7ed1b8d3 Oct 09 16:06:38 crc kubenswrapper[4719]: I1009 16:06:38.939854 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"8905824a-8f15-4df7-b938-b63f2a5aebb1","Type":"ContainerStarted","Data":"2f68ed9af2b0741aa503f9e940fa5a4c11805bd4e49ec880fb00e2ec7ed1b8d3"} Oct 09 16:06:47 crc kubenswrapper[4719]: I1009 16:06:47.818875 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jhxtk"] Oct 09 16:06:47 crc kubenswrapper[4719]: I1009 16:06:47.825405 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jhxtk" Oct 09 16:06:47 crc kubenswrapper[4719]: I1009 16:06:47.847521 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jhxtk"] Oct 09 16:06:47 crc kubenswrapper[4719]: I1009 16:06:47.909763 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c8218d6-7a30-4d12-873d-db54dd2d2217-catalog-content\") pod \"redhat-marketplace-jhxtk\" (UID: \"7c8218d6-7a30-4d12-873d-db54dd2d2217\") " pod="openshift-marketplace/redhat-marketplace-jhxtk" Oct 09 16:06:47 crc kubenswrapper[4719]: I1009 16:06:47.909923 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c8218d6-7a30-4d12-873d-db54dd2d2217-utilities\") pod \"redhat-marketplace-jhxtk\" (UID: \"7c8218d6-7a30-4d12-873d-db54dd2d2217\") " pod="openshift-marketplace/redhat-marketplace-jhxtk" Oct 09 16:06:47 crc kubenswrapper[4719]: I1009 16:06:47.909945 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m79w2\" (UniqueName: \"kubernetes.io/projected/7c8218d6-7a30-4d12-873d-db54dd2d2217-kube-api-access-m79w2\") pod \"redhat-marketplace-jhxtk\" (UID: \"7c8218d6-7a30-4d12-873d-db54dd2d2217\") " pod="openshift-marketplace/redhat-marketplace-jhxtk" Oct 09 16:06:48 crc kubenswrapper[4719]: I1009 16:06:48.011652 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c8218d6-7a30-4d12-873d-db54dd2d2217-catalog-content\") pod \"redhat-marketplace-jhxtk\" (UID: \"7c8218d6-7a30-4d12-873d-db54dd2d2217\") " pod="openshift-marketplace/redhat-marketplace-jhxtk" Oct 09 16:06:48 crc kubenswrapper[4719]: I1009 16:06:48.012372 4719 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c8218d6-7a30-4d12-873d-db54dd2d2217-catalog-content\") pod \"redhat-marketplace-jhxtk\" (UID: \"7c8218d6-7a30-4d12-873d-db54dd2d2217\") " pod="openshift-marketplace/redhat-marketplace-jhxtk" Oct 09 16:06:48 crc kubenswrapper[4719]: I1009 16:06:48.012697 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c8218d6-7a30-4d12-873d-db54dd2d2217-utilities\") pod \"redhat-marketplace-jhxtk\" (UID: \"7c8218d6-7a30-4d12-873d-db54dd2d2217\") " pod="openshift-marketplace/redhat-marketplace-jhxtk" Oct 09 16:06:48 crc kubenswrapper[4719]: I1009 16:06:48.012831 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m79w2\" (UniqueName: \"kubernetes.io/projected/7c8218d6-7a30-4d12-873d-db54dd2d2217-kube-api-access-m79w2\") pod \"redhat-marketplace-jhxtk\" (UID: \"7c8218d6-7a30-4d12-873d-db54dd2d2217\") " pod="openshift-marketplace/redhat-marketplace-jhxtk" Oct 09 16:06:48 crc kubenswrapper[4719]: I1009 16:06:48.013042 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c8218d6-7a30-4d12-873d-db54dd2d2217-utilities\") pod \"redhat-marketplace-jhxtk\" (UID: \"7c8218d6-7a30-4d12-873d-db54dd2d2217\") " pod="openshift-marketplace/redhat-marketplace-jhxtk" Oct 09 16:06:48 crc kubenswrapper[4719]: I1009 16:06:48.044152 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m79w2\" (UniqueName: \"kubernetes.io/projected/7c8218d6-7a30-4d12-873d-db54dd2d2217-kube-api-access-m79w2\") pod \"redhat-marketplace-jhxtk\" (UID: \"7c8218d6-7a30-4d12-873d-db54dd2d2217\") " pod="openshift-marketplace/redhat-marketplace-jhxtk" Oct 09 16:06:48 crc kubenswrapper[4719]: I1009 16:06:48.153398 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jhxtk" Oct 09 16:06:49 crc kubenswrapper[4719]: I1009 16:06:49.640153 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zt6n2"] Oct 09 16:06:49 crc kubenswrapper[4719]: I1009 16:06:49.644543 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zt6n2" Oct 09 16:06:49 crc kubenswrapper[4719]: I1009 16:06:49.657100 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwzj7\" (UniqueName: \"kubernetes.io/projected/24591ba0-cfd8-4607-ba4c-143d2167a104-kube-api-access-nwzj7\") pod \"certified-operators-zt6n2\" (UID: \"24591ba0-cfd8-4607-ba4c-143d2167a104\") " pod="openshift-marketplace/certified-operators-zt6n2" Oct 09 16:06:49 crc kubenswrapper[4719]: I1009 16:06:49.657156 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24591ba0-cfd8-4607-ba4c-143d2167a104-utilities\") pod \"certified-operators-zt6n2\" (UID: \"24591ba0-cfd8-4607-ba4c-143d2167a104\") " pod="openshift-marketplace/certified-operators-zt6n2" Oct 09 16:06:49 crc kubenswrapper[4719]: I1009 16:06:49.657221 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24591ba0-cfd8-4607-ba4c-143d2167a104-catalog-content\") pod \"certified-operators-zt6n2\" (UID: \"24591ba0-cfd8-4607-ba4c-143d2167a104\") " pod="openshift-marketplace/certified-operators-zt6n2" Oct 09 16:06:49 crc kubenswrapper[4719]: I1009 16:06:49.665303 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zt6n2"] Oct 09 16:06:49 crc kubenswrapper[4719]: I1009 16:06:49.761089 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-nwzj7\" (UniqueName: \"kubernetes.io/projected/24591ba0-cfd8-4607-ba4c-143d2167a104-kube-api-access-nwzj7\") pod \"certified-operators-zt6n2\" (UID: \"24591ba0-cfd8-4607-ba4c-143d2167a104\") " pod="openshift-marketplace/certified-operators-zt6n2" Oct 09 16:06:49 crc kubenswrapper[4719]: I1009 16:06:49.761150 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24591ba0-cfd8-4607-ba4c-143d2167a104-utilities\") pod \"certified-operators-zt6n2\" (UID: \"24591ba0-cfd8-4607-ba4c-143d2167a104\") " pod="openshift-marketplace/certified-operators-zt6n2" Oct 09 16:06:49 crc kubenswrapper[4719]: I1009 16:06:49.761238 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24591ba0-cfd8-4607-ba4c-143d2167a104-catalog-content\") pod \"certified-operators-zt6n2\" (UID: \"24591ba0-cfd8-4607-ba4c-143d2167a104\") " pod="openshift-marketplace/certified-operators-zt6n2" Oct 09 16:06:49 crc kubenswrapper[4719]: I1009 16:06:49.762021 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24591ba0-cfd8-4607-ba4c-143d2167a104-catalog-content\") pod \"certified-operators-zt6n2\" (UID: \"24591ba0-cfd8-4607-ba4c-143d2167a104\") " pod="openshift-marketplace/certified-operators-zt6n2" Oct 09 16:06:49 crc kubenswrapper[4719]: I1009 16:06:49.762644 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24591ba0-cfd8-4607-ba4c-143d2167a104-utilities\") pod \"certified-operators-zt6n2\" (UID: \"24591ba0-cfd8-4607-ba4c-143d2167a104\") " pod="openshift-marketplace/certified-operators-zt6n2" Oct 09 16:06:49 crc kubenswrapper[4719]: I1009 16:06:49.778776 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwzj7\" (UniqueName: 
\"kubernetes.io/projected/24591ba0-cfd8-4607-ba4c-143d2167a104-kube-api-access-nwzj7\") pod \"certified-operators-zt6n2\" (UID: \"24591ba0-cfd8-4607-ba4c-143d2167a104\") " pod="openshift-marketplace/certified-operators-zt6n2" Oct 09 16:06:49 crc kubenswrapper[4719]: I1009 16:06:49.830411 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zt6n2" Oct 09 16:06:50 crc kubenswrapper[4719]: I1009 16:06:50.039894 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jhxtk"] Oct 09 16:06:50 crc kubenswrapper[4719]: I1009 16:06:50.097993 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jhxtk" event={"ID":"7c8218d6-7a30-4d12-873d-db54dd2d2217","Type":"ContainerStarted","Data":"0674b759b4b058f3e96183ccc0b5ef2a5d5300ed5bc62e7ad17339d91d9b8c65"} Oct 09 16:06:50 crc kubenswrapper[4719]: I1009 16:06:50.383433 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zt6n2"] Oct 09 16:06:50 crc kubenswrapper[4719]: W1009 16:06:50.413536 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24591ba0_cfd8_4607_ba4c_143d2167a104.slice/crio-61fbad29fe18794c15ea1766156f1bb2279bff8a52117fb1c2b31f259179f361 WatchSource:0}: Error finding container 61fbad29fe18794c15ea1766156f1bb2279bff8a52117fb1c2b31f259179f361: Status 404 returned error can't find the container with id 61fbad29fe18794c15ea1766156f1bb2279bff8a52117fb1c2b31f259179f361 Oct 09 16:06:51 crc kubenswrapper[4719]: I1009 16:06:51.110784 4719 generic.go:334] "Generic (PLEG): container finished" podID="24591ba0-cfd8-4607-ba4c-143d2167a104" containerID="aac4bda0c522f14f861c21050c03d52d4ca516012816524200d93d0c606ab433" exitCode=0 Oct 09 16:06:51 crc kubenswrapper[4719]: I1009 16:06:51.110900 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-zt6n2" event={"ID":"24591ba0-cfd8-4607-ba4c-143d2167a104","Type":"ContainerDied","Data":"aac4bda0c522f14f861c21050c03d52d4ca516012816524200d93d0c606ab433"} Oct 09 16:06:51 crc kubenswrapper[4719]: I1009 16:06:51.111192 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zt6n2" event={"ID":"24591ba0-cfd8-4607-ba4c-143d2167a104","Type":"ContainerStarted","Data":"61fbad29fe18794c15ea1766156f1bb2279bff8a52117fb1c2b31f259179f361"} Oct 09 16:06:51 crc kubenswrapper[4719]: I1009 16:06:51.114976 4719 generic.go:334] "Generic (PLEG): container finished" podID="7c8218d6-7a30-4d12-873d-db54dd2d2217" containerID="9f43fddf3c9e222636a40f9235e502918bca7746335a86eca0f71878596e7a6f" exitCode=0 Oct 09 16:06:51 crc kubenswrapper[4719]: I1009 16:06:51.115020 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jhxtk" event={"ID":"7c8218d6-7a30-4d12-873d-db54dd2d2217","Type":"ContainerDied","Data":"9f43fddf3c9e222636a40f9235e502918bca7746335a86eca0f71878596e7a6f"} Oct 09 16:06:51 crc kubenswrapper[4719]: I1009 16:06:51.117143 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"8905824a-8f15-4df7-b938-b63f2a5aebb1","Type":"ContainerStarted","Data":"53273a4a10d0010e94e9c43e9a4573892691b9349c2b9e5ed482a717c208480c"} Oct 09 16:06:51 crc kubenswrapper[4719]: I1009 16:06:51.181953 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.028848639 podStartE2EDuration="15.181929717s" podCreationTimestamp="2025-10-09 16:06:36 +0000 UTC" firstStartedPulling="2025-10-09 16:06:38.43246179 +0000 UTC m=+2903.942173065" lastFinishedPulling="2025-10-09 16:06:49.585542858 +0000 UTC m=+2915.095254143" observedRunningTime="2025-10-09 16:06:51.176035009 +0000 UTC m=+2916.685746324" watchObservedRunningTime="2025-10-09 
16:06:51.181929717 +0000 UTC m=+2916.691641012" Oct 09 16:06:53 crc kubenswrapper[4719]: I1009 16:06:53.150861 4719 generic.go:334] "Generic (PLEG): container finished" podID="7c8218d6-7a30-4d12-873d-db54dd2d2217" containerID="b0d6bdd28fc8078f0585b16f70df5eaf8649dd280d82dce32016edb8439ffe9a" exitCode=0 Oct 09 16:06:53 crc kubenswrapper[4719]: I1009 16:06:53.150968 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jhxtk" event={"ID":"7c8218d6-7a30-4d12-873d-db54dd2d2217","Type":"ContainerDied","Data":"b0d6bdd28fc8078f0585b16f70df5eaf8649dd280d82dce32016edb8439ffe9a"} Oct 09 16:06:53 crc kubenswrapper[4719]: I1009 16:06:53.155584 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zt6n2" event={"ID":"24591ba0-cfd8-4607-ba4c-143d2167a104","Type":"ContainerStarted","Data":"a83d1e8497630e7f77d1a5a4e4e5e14968907e3bbe5749ab8bf74cb4477c0e9c"} Oct 09 16:06:56 crc kubenswrapper[4719]: I1009 16:06:56.188315 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jhxtk" event={"ID":"7c8218d6-7a30-4d12-873d-db54dd2d2217","Type":"ContainerStarted","Data":"907622a18ff398e4474946f07bc41841eed060830275e3062bed410cb16da5b3"} Oct 09 16:06:56 crc kubenswrapper[4719]: I1009 16:06:56.190428 4719 generic.go:334] "Generic (PLEG): container finished" podID="24591ba0-cfd8-4607-ba4c-143d2167a104" containerID="a83d1e8497630e7f77d1a5a4e4e5e14968907e3bbe5749ab8bf74cb4477c0e9c" exitCode=0 Oct 09 16:06:56 crc kubenswrapper[4719]: I1009 16:06:56.190480 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zt6n2" event={"ID":"24591ba0-cfd8-4607-ba4c-143d2167a104","Type":"ContainerDied","Data":"a83d1e8497630e7f77d1a5a4e4e5e14968907e3bbe5749ab8bf74cb4477c0e9c"} Oct 09 16:06:56 crc kubenswrapper[4719]: I1009 16:06:56.219063 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-jhxtk" podStartSLOduration=5.004297943 podStartE2EDuration="9.219028141s" podCreationTimestamp="2025-10-09 16:06:47 +0000 UTC" firstStartedPulling="2025-10-09 16:06:51.118667556 +0000 UTC m=+2916.628378841" lastFinishedPulling="2025-10-09 16:06:55.333397754 +0000 UTC m=+2920.843109039" observedRunningTime="2025-10-09 16:06:56.205250361 +0000 UTC m=+2921.714961666" watchObservedRunningTime="2025-10-09 16:06:56.219028141 +0000 UTC m=+2921.728739426" Oct 09 16:06:57 crc kubenswrapper[4719]: I1009 16:06:57.201302 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zt6n2" event={"ID":"24591ba0-cfd8-4607-ba4c-143d2167a104","Type":"ContainerStarted","Data":"a2c34ad04281fa98016f48dadd73e70109fb0e75cbd56a601eb0f7c0a3b2656c"} Oct 09 16:06:57 crc kubenswrapper[4719]: I1009 16:06:57.227380 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zt6n2" podStartSLOduration=2.655663747 podStartE2EDuration="8.227362617s" podCreationTimestamp="2025-10-09 16:06:49 +0000 UTC" firstStartedPulling="2025-10-09 16:06:51.11284677 +0000 UTC m=+2916.622558055" lastFinishedPulling="2025-10-09 16:06:56.68454564 +0000 UTC m=+2922.194256925" observedRunningTime="2025-10-09 16:06:57.21930077 +0000 UTC m=+2922.729012055" watchObservedRunningTime="2025-10-09 16:06:57.227362617 +0000 UTC m=+2922.737073902" Oct 09 16:06:58 crc kubenswrapper[4719]: I1009 16:06:58.154560 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jhxtk" Oct 09 16:06:58 crc kubenswrapper[4719]: I1009 16:06:58.155603 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jhxtk" Oct 09 16:06:59 crc kubenswrapper[4719]: I1009 16:06:59.208411 4719 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-marketplace-jhxtk" podUID="7c8218d6-7a30-4d12-873d-db54dd2d2217" containerName="registry-server" probeResult="failure" output=< Oct 09 16:06:59 crc kubenswrapper[4719]: timeout: failed to connect service ":50051" within 1s Oct 09 16:06:59 crc kubenswrapper[4719]: > Oct 09 16:06:59 crc kubenswrapper[4719]: I1009 16:06:59.831420 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zt6n2" Oct 09 16:06:59 crc kubenswrapper[4719]: I1009 16:06:59.832092 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zt6n2" Oct 09 16:06:59 crc kubenswrapper[4719]: I1009 16:06:59.878903 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zt6n2" Oct 09 16:07:08 crc kubenswrapper[4719]: I1009 16:07:08.211282 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jhxtk" Oct 09 16:07:08 crc kubenswrapper[4719]: I1009 16:07:08.259858 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jhxtk" Oct 09 16:07:08 crc kubenswrapper[4719]: I1009 16:07:08.443891 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jhxtk"] Oct 09 16:07:09 crc kubenswrapper[4719]: I1009 16:07:09.317323 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jhxtk" podUID="7c8218d6-7a30-4d12-873d-db54dd2d2217" containerName="registry-server" containerID="cri-o://907622a18ff398e4474946f07bc41841eed060830275e3062bed410cb16da5b3" gracePeriod=2 Oct 09 16:07:09 crc kubenswrapper[4719]: I1009 16:07:09.851002 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jhxtk" Oct 09 16:07:09 crc kubenswrapper[4719]: I1009 16:07:09.891989 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zt6n2" Oct 09 16:07:09 crc kubenswrapper[4719]: I1009 16:07:09.931596 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m79w2\" (UniqueName: \"kubernetes.io/projected/7c8218d6-7a30-4d12-873d-db54dd2d2217-kube-api-access-m79w2\") pod \"7c8218d6-7a30-4d12-873d-db54dd2d2217\" (UID: \"7c8218d6-7a30-4d12-873d-db54dd2d2217\") " Oct 09 16:07:09 crc kubenswrapper[4719]: I1009 16:07:09.931670 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c8218d6-7a30-4d12-873d-db54dd2d2217-utilities\") pod \"7c8218d6-7a30-4d12-873d-db54dd2d2217\" (UID: \"7c8218d6-7a30-4d12-873d-db54dd2d2217\") " Oct 09 16:07:09 crc kubenswrapper[4719]: I1009 16:07:09.932764 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c8218d6-7a30-4d12-873d-db54dd2d2217-catalog-content\") pod \"7c8218d6-7a30-4d12-873d-db54dd2d2217\" (UID: \"7c8218d6-7a30-4d12-873d-db54dd2d2217\") " Oct 09 16:07:09 crc kubenswrapper[4719]: I1009 16:07:09.932985 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c8218d6-7a30-4d12-873d-db54dd2d2217-utilities" (OuterVolumeSpecName: "utilities") pod "7c8218d6-7a30-4d12-873d-db54dd2d2217" (UID: "7c8218d6-7a30-4d12-873d-db54dd2d2217"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:07:09 crc kubenswrapper[4719]: I1009 16:07:09.933549 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c8218d6-7a30-4d12-873d-db54dd2d2217-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 16:07:09 crc kubenswrapper[4719]: I1009 16:07:09.947789 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c8218d6-7a30-4d12-873d-db54dd2d2217-kube-api-access-m79w2" (OuterVolumeSpecName: "kube-api-access-m79w2") pod "7c8218d6-7a30-4d12-873d-db54dd2d2217" (UID: "7c8218d6-7a30-4d12-873d-db54dd2d2217"). InnerVolumeSpecName "kube-api-access-m79w2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 16:07:09 crc kubenswrapper[4719]: I1009 16:07:09.948335 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c8218d6-7a30-4d12-873d-db54dd2d2217-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c8218d6-7a30-4d12-873d-db54dd2d2217" (UID: "7c8218d6-7a30-4d12-873d-db54dd2d2217"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:07:10 crc kubenswrapper[4719]: I1009 16:07:10.035362 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c8218d6-7a30-4d12-873d-db54dd2d2217-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 16:07:10 crc kubenswrapper[4719]: I1009 16:07:10.035396 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m79w2\" (UniqueName: \"kubernetes.io/projected/7c8218d6-7a30-4d12-873d-db54dd2d2217-kube-api-access-m79w2\") on node \"crc\" DevicePath \"\"" Oct 09 16:07:10 crc kubenswrapper[4719]: I1009 16:07:10.332615 4719 generic.go:334] "Generic (PLEG): container finished" podID="7c8218d6-7a30-4d12-873d-db54dd2d2217" containerID="907622a18ff398e4474946f07bc41841eed060830275e3062bed410cb16da5b3" exitCode=0 Oct 09 16:07:10 crc kubenswrapper[4719]: I1009 16:07:10.333211 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jhxtk" event={"ID":"7c8218d6-7a30-4d12-873d-db54dd2d2217","Type":"ContainerDied","Data":"907622a18ff398e4474946f07bc41841eed060830275e3062bed410cb16da5b3"} Oct 09 16:07:10 crc kubenswrapper[4719]: I1009 16:07:10.333245 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jhxtk" event={"ID":"7c8218d6-7a30-4d12-873d-db54dd2d2217","Type":"ContainerDied","Data":"0674b759b4b058f3e96183ccc0b5ef2a5d5300ed5bc62e7ad17339d91d9b8c65"} Oct 09 16:07:10 crc kubenswrapper[4719]: I1009 16:07:10.333266 4719 scope.go:117] "RemoveContainer" containerID="907622a18ff398e4474946f07bc41841eed060830275e3062bed410cb16da5b3" Oct 09 16:07:10 crc kubenswrapper[4719]: I1009 16:07:10.333432 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jhxtk" Oct 09 16:07:10 crc kubenswrapper[4719]: I1009 16:07:10.381876 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jhxtk"] Oct 09 16:07:10 crc kubenswrapper[4719]: I1009 16:07:10.382305 4719 scope.go:117] "RemoveContainer" containerID="b0d6bdd28fc8078f0585b16f70df5eaf8649dd280d82dce32016edb8439ffe9a" Oct 09 16:07:10 crc kubenswrapper[4719]: I1009 16:07:10.394170 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jhxtk"] Oct 09 16:07:10 crc kubenswrapper[4719]: I1009 16:07:10.407271 4719 scope.go:117] "RemoveContainer" containerID="9f43fddf3c9e222636a40f9235e502918bca7746335a86eca0f71878596e7a6f" Oct 09 16:07:10 crc kubenswrapper[4719]: I1009 16:07:10.458500 4719 scope.go:117] "RemoveContainer" containerID="907622a18ff398e4474946f07bc41841eed060830275e3062bed410cb16da5b3" Oct 09 16:07:10 crc kubenswrapper[4719]: E1009 16:07:10.458995 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"907622a18ff398e4474946f07bc41841eed060830275e3062bed410cb16da5b3\": container with ID starting with 907622a18ff398e4474946f07bc41841eed060830275e3062bed410cb16da5b3 not found: ID does not exist" containerID="907622a18ff398e4474946f07bc41841eed060830275e3062bed410cb16da5b3" Oct 09 16:07:10 crc kubenswrapper[4719]: I1009 16:07:10.459041 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"907622a18ff398e4474946f07bc41841eed060830275e3062bed410cb16da5b3"} err="failed to get container status \"907622a18ff398e4474946f07bc41841eed060830275e3062bed410cb16da5b3\": rpc error: code = NotFound desc = could not find container \"907622a18ff398e4474946f07bc41841eed060830275e3062bed410cb16da5b3\": container with ID starting with 907622a18ff398e4474946f07bc41841eed060830275e3062bed410cb16da5b3 not found: 
ID does not exist" Oct 09 16:07:10 crc kubenswrapper[4719]: I1009 16:07:10.459068 4719 scope.go:117] "RemoveContainer" containerID="b0d6bdd28fc8078f0585b16f70df5eaf8649dd280d82dce32016edb8439ffe9a" Oct 09 16:07:10 crc kubenswrapper[4719]: E1009 16:07:10.459525 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0d6bdd28fc8078f0585b16f70df5eaf8649dd280d82dce32016edb8439ffe9a\": container with ID starting with b0d6bdd28fc8078f0585b16f70df5eaf8649dd280d82dce32016edb8439ffe9a not found: ID does not exist" containerID="b0d6bdd28fc8078f0585b16f70df5eaf8649dd280d82dce32016edb8439ffe9a" Oct 09 16:07:10 crc kubenswrapper[4719]: I1009 16:07:10.459559 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0d6bdd28fc8078f0585b16f70df5eaf8649dd280d82dce32016edb8439ffe9a"} err="failed to get container status \"b0d6bdd28fc8078f0585b16f70df5eaf8649dd280d82dce32016edb8439ffe9a\": rpc error: code = NotFound desc = could not find container \"b0d6bdd28fc8078f0585b16f70df5eaf8649dd280d82dce32016edb8439ffe9a\": container with ID starting with b0d6bdd28fc8078f0585b16f70df5eaf8649dd280d82dce32016edb8439ffe9a not found: ID does not exist" Oct 09 16:07:10 crc kubenswrapper[4719]: I1009 16:07:10.459575 4719 scope.go:117] "RemoveContainer" containerID="9f43fddf3c9e222636a40f9235e502918bca7746335a86eca0f71878596e7a6f" Oct 09 16:07:10 crc kubenswrapper[4719]: E1009 16:07:10.459884 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f43fddf3c9e222636a40f9235e502918bca7746335a86eca0f71878596e7a6f\": container with ID starting with 9f43fddf3c9e222636a40f9235e502918bca7746335a86eca0f71878596e7a6f not found: ID does not exist" containerID="9f43fddf3c9e222636a40f9235e502918bca7746335a86eca0f71878596e7a6f" Oct 09 16:07:10 crc kubenswrapper[4719]: I1009 16:07:10.459909 4719 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f43fddf3c9e222636a40f9235e502918bca7746335a86eca0f71878596e7a6f"} err="failed to get container status \"9f43fddf3c9e222636a40f9235e502918bca7746335a86eca0f71878596e7a6f\": rpc error: code = NotFound desc = could not find container \"9f43fddf3c9e222636a40f9235e502918bca7746335a86eca0f71878596e7a6f\": container with ID starting with 9f43fddf3c9e222636a40f9235e502918bca7746335a86eca0f71878596e7a6f not found: ID does not exist" Oct 09 16:07:11 crc kubenswrapper[4719]: I1009 16:07:11.188560 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c8218d6-7a30-4d12-873d-db54dd2d2217" path="/var/lib/kubelet/pods/7c8218d6-7a30-4d12-873d-db54dd2d2217/volumes" Oct 09 16:07:12 crc kubenswrapper[4719]: I1009 16:07:12.245108 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zt6n2"] Oct 09 16:07:12 crc kubenswrapper[4719]: I1009 16:07:12.245679 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zt6n2" podUID="24591ba0-cfd8-4607-ba4c-143d2167a104" containerName="registry-server" containerID="cri-o://a2c34ad04281fa98016f48dadd73e70109fb0e75cbd56a601eb0f7c0a3b2656c" gracePeriod=2 Oct 09 16:07:12 crc kubenswrapper[4719]: I1009 16:07:12.733461 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zt6n2" Oct 09 16:07:12 crc kubenswrapper[4719]: I1009 16:07:12.905533 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwzj7\" (UniqueName: \"kubernetes.io/projected/24591ba0-cfd8-4607-ba4c-143d2167a104-kube-api-access-nwzj7\") pod \"24591ba0-cfd8-4607-ba4c-143d2167a104\" (UID: \"24591ba0-cfd8-4607-ba4c-143d2167a104\") " Oct 09 16:07:12 crc kubenswrapper[4719]: I1009 16:07:12.905657 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24591ba0-cfd8-4607-ba4c-143d2167a104-catalog-content\") pod \"24591ba0-cfd8-4607-ba4c-143d2167a104\" (UID: \"24591ba0-cfd8-4607-ba4c-143d2167a104\") " Oct 09 16:07:12 crc kubenswrapper[4719]: I1009 16:07:12.905768 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24591ba0-cfd8-4607-ba4c-143d2167a104-utilities\") pod \"24591ba0-cfd8-4607-ba4c-143d2167a104\" (UID: \"24591ba0-cfd8-4607-ba4c-143d2167a104\") " Oct 09 16:07:12 crc kubenswrapper[4719]: I1009 16:07:12.906981 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24591ba0-cfd8-4607-ba4c-143d2167a104-utilities" (OuterVolumeSpecName: "utilities") pod "24591ba0-cfd8-4607-ba4c-143d2167a104" (UID: "24591ba0-cfd8-4607-ba4c-143d2167a104"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:07:12 crc kubenswrapper[4719]: I1009 16:07:12.913610 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24591ba0-cfd8-4607-ba4c-143d2167a104-kube-api-access-nwzj7" (OuterVolumeSpecName: "kube-api-access-nwzj7") pod "24591ba0-cfd8-4607-ba4c-143d2167a104" (UID: "24591ba0-cfd8-4607-ba4c-143d2167a104"). InnerVolumeSpecName "kube-api-access-nwzj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 16:07:12 crc kubenswrapper[4719]: I1009 16:07:12.963625 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24591ba0-cfd8-4607-ba4c-143d2167a104-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "24591ba0-cfd8-4607-ba4c-143d2167a104" (UID: "24591ba0-cfd8-4607-ba4c-143d2167a104"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:07:13 crc kubenswrapper[4719]: I1009 16:07:13.008603 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24591ba0-cfd8-4607-ba4c-143d2167a104-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 16:07:13 crc kubenswrapper[4719]: I1009 16:07:13.008637 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24591ba0-cfd8-4607-ba4c-143d2167a104-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 16:07:13 crc kubenswrapper[4719]: I1009 16:07:13.008650 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwzj7\" (UniqueName: \"kubernetes.io/projected/24591ba0-cfd8-4607-ba4c-143d2167a104-kube-api-access-nwzj7\") on node \"crc\" DevicePath \"\"" Oct 09 16:07:13 crc kubenswrapper[4719]: I1009 16:07:13.367144 4719 generic.go:334] "Generic (PLEG): container finished" podID="24591ba0-cfd8-4607-ba4c-143d2167a104" containerID="a2c34ad04281fa98016f48dadd73e70109fb0e75cbd56a601eb0f7c0a3b2656c" exitCode=0 Oct 09 16:07:13 crc kubenswrapper[4719]: I1009 16:07:13.367185 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zt6n2" event={"ID":"24591ba0-cfd8-4607-ba4c-143d2167a104","Type":"ContainerDied","Data":"a2c34ad04281fa98016f48dadd73e70109fb0e75cbd56a601eb0f7c0a3b2656c"} Oct 09 16:07:13 crc kubenswrapper[4719]: I1009 16:07:13.367201 4719 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zt6n2" Oct 09 16:07:13 crc kubenswrapper[4719]: I1009 16:07:13.367223 4719 scope.go:117] "RemoveContainer" containerID="a2c34ad04281fa98016f48dadd73e70109fb0e75cbd56a601eb0f7c0a3b2656c" Oct 09 16:07:13 crc kubenswrapper[4719]: I1009 16:07:13.367210 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zt6n2" event={"ID":"24591ba0-cfd8-4607-ba4c-143d2167a104","Type":"ContainerDied","Data":"61fbad29fe18794c15ea1766156f1bb2279bff8a52117fb1c2b31f259179f361"} Oct 09 16:07:13 crc kubenswrapper[4719]: I1009 16:07:13.393790 4719 scope.go:117] "RemoveContainer" containerID="a83d1e8497630e7f77d1a5a4e4e5e14968907e3bbe5749ab8bf74cb4477c0e9c" Oct 09 16:07:13 crc kubenswrapper[4719]: I1009 16:07:13.395148 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zt6n2"] Oct 09 16:07:13 crc kubenswrapper[4719]: I1009 16:07:13.404254 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zt6n2"] Oct 09 16:07:13 crc kubenswrapper[4719]: I1009 16:07:13.418227 4719 scope.go:117] "RemoveContainer" containerID="aac4bda0c522f14f861c21050c03d52d4ca516012816524200d93d0c606ab433" Oct 09 16:07:13 crc kubenswrapper[4719]: I1009 16:07:13.477664 4719 scope.go:117] "RemoveContainer" containerID="a2c34ad04281fa98016f48dadd73e70109fb0e75cbd56a601eb0f7c0a3b2656c" Oct 09 16:07:13 crc kubenswrapper[4719]: E1009 16:07:13.478309 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2c34ad04281fa98016f48dadd73e70109fb0e75cbd56a601eb0f7c0a3b2656c\": container with ID starting with a2c34ad04281fa98016f48dadd73e70109fb0e75cbd56a601eb0f7c0a3b2656c not found: ID does not exist" containerID="a2c34ad04281fa98016f48dadd73e70109fb0e75cbd56a601eb0f7c0a3b2656c" Oct 09 16:07:13 crc kubenswrapper[4719]: I1009 16:07:13.478519 
4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2c34ad04281fa98016f48dadd73e70109fb0e75cbd56a601eb0f7c0a3b2656c"} err="failed to get container status \"a2c34ad04281fa98016f48dadd73e70109fb0e75cbd56a601eb0f7c0a3b2656c\": rpc error: code = NotFound desc = could not find container \"a2c34ad04281fa98016f48dadd73e70109fb0e75cbd56a601eb0f7c0a3b2656c\": container with ID starting with a2c34ad04281fa98016f48dadd73e70109fb0e75cbd56a601eb0f7c0a3b2656c not found: ID does not exist" Oct 09 16:07:13 crc kubenswrapper[4719]: I1009 16:07:13.478563 4719 scope.go:117] "RemoveContainer" containerID="a83d1e8497630e7f77d1a5a4e4e5e14968907e3bbe5749ab8bf74cb4477c0e9c" Oct 09 16:07:13 crc kubenswrapper[4719]: E1009 16:07:13.480468 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a83d1e8497630e7f77d1a5a4e4e5e14968907e3bbe5749ab8bf74cb4477c0e9c\": container with ID starting with a83d1e8497630e7f77d1a5a4e4e5e14968907e3bbe5749ab8bf74cb4477c0e9c not found: ID does not exist" containerID="a83d1e8497630e7f77d1a5a4e4e5e14968907e3bbe5749ab8bf74cb4477c0e9c" Oct 09 16:07:13 crc kubenswrapper[4719]: I1009 16:07:13.480499 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a83d1e8497630e7f77d1a5a4e4e5e14968907e3bbe5749ab8bf74cb4477c0e9c"} err="failed to get container status \"a83d1e8497630e7f77d1a5a4e4e5e14968907e3bbe5749ab8bf74cb4477c0e9c\": rpc error: code = NotFound desc = could not find container \"a83d1e8497630e7f77d1a5a4e4e5e14968907e3bbe5749ab8bf74cb4477c0e9c\": container with ID starting with a83d1e8497630e7f77d1a5a4e4e5e14968907e3bbe5749ab8bf74cb4477c0e9c not found: ID does not exist" Oct 09 16:07:13 crc kubenswrapper[4719]: I1009 16:07:13.480521 4719 scope.go:117] "RemoveContainer" containerID="aac4bda0c522f14f861c21050c03d52d4ca516012816524200d93d0c606ab433" Oct 09 16:07:13 crc kubenswrapper[4719]: E1009 
16:07:13.480942 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aac4bda0c522f14f861c21050c03d52d4ca516012816524200d93d0c606ab433\": container with ID starting with aac4bda0c522f14f861c21050c03d52d4ca516012816524200d93d0c606ab433 not found: ID does not exist" containerID="aac4bda0c522f14f861c21050c03d52d4ca516012816524200d93d0c606ab433" Oct 09 16:07:13 crc kubenswrapper[4719]: I1009 16:07:13.480974 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aac4bda0c522f14f861c21050c03d52d4ca516012816524200d93d0c606ab433"} err="failed to get container status \"aac4bda0c522f14f861c21050c03d52d4ca516012816524200d93d0c606ab433\": rpc error: code = NotFound desc = could not find container \"aac4bda0c522f14f861c21050c03d52d4ca516012816524200d93d0c606ab433\": container with ID starting with aac4bda0c522f14f861c21050c03d52d4ca516012816524200d93d0c606ab433 not found: ID does not exist" Oct 09 16:07:15 crc kubenswrapper[4719]: I1009 16:07:15.207907 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24591ba0-cfd8-4607-ba4c-143d2167a104" path="/var/lib/kubelet/pods/24591ba0-cfd8-4607-ba4c-143d2167a104/volumes" Oct 09 16:09:06 crc kubenswrapper[4719]: I1009 16:09:06.977133 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 16:09:06 crc kubenswrapper[4719]: I1009 16:09:06.977663 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 09 16:09:36 crc kubenswrapper[4719]: I1009 16:09:36.976994 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 16:09:36 crc kubenswrapper[4719]: I1009 16:09:36.977928 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 16:10:06 crc kubenswrapper[4719]: I1009 16:10:06.977170 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 16:10:06 crc kubenswrapper[4719]: I1009 16:10:06.977989 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 16:10:06 crc kubenswrapper[4719]: I1009 16:10:06.978049 4719 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" Oct 09 16:10:06 crc kubenswrapper[4719]: I1009 16:10:06.978996 4719 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e3e34c67e5c773c761040d5a93136e182c5c1d1a01f09e8ee183de43fdb02e6e"} 
pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 16:10:06 crc kubenswrapper[4719]: I1009 16:10:06.979070 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" containerID="cri-o://e3e34c67e5c773c761040d5a93136e182c5c1d1a01f09e8ee183de43fdb02e6e" gracePeriod=600 Oct 09 16:10:07 crc kubenswrapper[4719]: E1009 16:10:07.107610 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:10:08 crc kubenswrapper[4719]: I1009 16:10:08.081544 4719 generic.go:334] "Generic (PLEG): container finished" podID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerID="e3e34c67e5c773c761040d5a93136e182c5c1d1a01f09e8ee183de43fdb02e6e" exitCode=0 Oct 09 16:10:08 crc kubenswrapper[4719]: I1009 16:10:08.081625 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerDied","Data":"e3e34c67e5c773c761040d5a93136e182c5c1d1a01f09e8ee183de43fdb02e6e"} Oct 09 16:10:08 crc kubenswrapper[4719]: I1009 16:10:08.081664 4719 scope.go:117] "RemoveContainer" containerID="920ea73af6d2fdd926cecc482b1dc2188636a4f6e8a6fe0fff4b95bcf354a955" Oct 09 16:10:08 crc kubenswrapper[4719]: I1009 16:10:08.082644 4719 scope.go:117] "RemoveContainer" containerID="e3e34c67e5c773c761040d5a93136e182c5c1d1a01f09e8ee183de43fdb02e6e" Oct 
09 16:10:08 crc kubenswrapper[4719]: E1009 16:10:08.082901 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:10:22 crc kubenswrapper[4719]: I1009 16:10:22.162049 4719 scope.go:117] "RemoveContainer" containerID="e3e34c67e5c773c761040d5a93136e182c5c1d1a01f09e8ee183de43fdb02e6e" Oct 09 16:10:22 crc kubenswrapper[4719]: E1009 16:10:22.162773 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:10:33 crc kubenswrapper[4719]: I1009 16:10:33.160976 4719 scope.go:117] "RemoveContainer" containerID="e3e34c67e5c773c761040d5a93136e182c5c1d1a01f09e8ee183de43fdb02e6e" Oct 09 16:10:33 crc kubenswrapper[4719]: E1009 16:10:33.161750 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:10:46 crc kubenswrapper[4719]: I1009 16:10:46.160774 4719 scope.go:117] "RemoveContainer" 
containerID="e3e34c67e5c773c761040d5a93136e182c5c1d1a01f09e8ee183de43fdb02e6e" Oct 09 16:10:46 crc kubenswrapper[4719]: E1009 16:10:46.161517 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:11:00 crc kubenswrapper[4719]: I1009 16:11:00.161697 4719 scope.go:117] "RemoveContainer" containerID="e3e34c67e5c773c761040d5a93136e182c5c1d1a01f09e8ee183de43fdb02e6e" Oct 09 16:11:00 crc kubenswrapper[4719]: E1009 16:11:00.162535 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:11:13 crc kubenswrapper[4719]: I1009 16:11:13.163235 4719 scope.go:117] "RemoveContainer" containerID="e3e34c67e5c773c761040d5a93136e182c5c1d1a01f09e8ee183de43fdb02e6e" Oct 09 16:11:13 crc kubenswrapper[4719]: E1009 16:11:13.164631 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:11:28 crc kubenswrapper[4719]: I1009 16:11:28.161813 4719 scope.go:117] 
"RemoveContainer" containerID="e3e34c67e5c773c761040d5a93136e182c5c1d1a01f09e8ee183de43fdb02e6e" Oct 09 16:11:28 crc kubenswrapper[4719]: E1009 16:11:28.163648 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:11:40 crc kubenswrapper[4719]: I1009 16:11:40.162261 4719 scope.go:117] "RemoveContainer" containerID="e3e34c67e5c773c761040d5a93136e182c5c1d1a01f09e8ee183de43fdb02e6e" Oct 09 16:11:40 crc kubenswrapper[4719]: E1009 16:11:40.163691 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:11:54 crc kubenswrapper[4719]: I1009 16:11:54.161516 4719 scope.go:117] "RemoveContainer" containerID="e3e34c67e5c773c761040d5a93136e182c5c1d1a01f09e8ee183de43fdb02e6e" Oct 09 16:11:54 crc kubenswrapper[4719]: E1009 16:11:54.162419 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:12:07 crc kubenswrapper[4719]: I1009 16:12:07.162218 
4719 scope.go:117] "RemoveContainer" containerID="e3e34c67e5c773c761040d5a93136e182c5c1d1a01f09e8ee183de43fdb02e6e" Oct 09 16:12:07 crc kubenswrapper[4719]: E1009 16:12:07.171577 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:12:22 crc kubenswrapper[4719]: I1009 16:12:22.161647 4719 scope.go:117] "RemoveContainer" containerID="e3e34c67e5c773c761040d5a93136e182c5c1d1a01f09e8ee183de43fdb02e6e" Oct 09 16:12:22 crc kubenswrapper[4719]: E1009 16:12:22.162543 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:12:36 crc kubenswrapper[4719]: I1009 16:12:36.161953 4719 scope.go:117] "RemoveContainer" containerID="e3e34c67e5c773c761040d5a93136e182c5c1d1a01f09e8ee183de43fdb02e6e" Oct 09 16:12:36 crc kubenswrapper[4719]: E1009 16:12:36.162875 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:12:51 crc kubenswrapper[4719]: I1009 
16:12:51.161527 4719 scope.go:117] "RemoveContainer" containerID="e3e34c67e5c773c761040d5a93136e182c5c1d1a01f09e8ee183de43fdb02e6e" Oct 09 16:12:51 crc kubenswrapper[4719]: E1009 16:12:51.162786 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:13:06 crc kubenswrapper[4719]: I1009 16:13:06.161427 4719 scope.go:117] "RemoveContainer" containerID="e3e34c67e5c773c761040d5a93136e182c5c1d1a01f09e8ee183de43fdb02e6e" Oct 09 16:13:06 crc kubenswrapper[4719]: E1009 16:13:06.162186 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:13:17 crc kubenswrapper[4719]: I1009 16:13:17.161489 4719 scope.go:117] "RemoveContainer" containerID="e3e34c67e5c773c761040d5a93136e182c5c1d1a01f09e8ee183de43fdb02e6e" Oct 09 16:13:17 crc kubenswrapper[4719]: E1009 16:13:17.162297 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:13:32 crc 
kubenswrapper[4719]: I1009 16:13:32.161935 4719 scope.go:117] "RemoveContainer" containerID="e3e34c67e5c773c761040d5a93136e182c5c1d1a01f09e8ee183de43fdb02e6e" Oct 09 16:13:32 crc kubenswrapper[4719]: E1009 16:13:32.162870 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:13:36 crc kubenswrapper[4719]: I1009 16:13:36.307531 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vkrw2"] Oct 09 16:13:36 crc kubenswrapper[4719]: E1009 16:13:36.308264 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c8218d6-7a30-4d12-873d-db54dd2d2217" containerName="registry-server" Oct 09 16:13:36 crc kubenswrapper[4719]: I1009 16:13:36.308282 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c8218d6-7a30-4d12-873d-db54dd2d2217" containerName="registry-server" Oct 09 16:13:36 crc kubenswrapper[4719]: E1009 16:13:36.308310 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c8218d6-7a30-4d12-873d-db54dd2d2217" containerName="extract-content" Oct 09 16:13:36 crc kubenswrapper[4719]: I1009 16:13:36.308318 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c8218d6-7a30-4d12-873d-db54dd2d2217" containerName="extract-content" Oct 09 16:13:36 crc kubenswrapper[4719]: E1009 16:13:36.308366 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24591ba0-cfd8-4607-ba4c-143d2167a104" containerName="registry-server" Oct 09 16:13:36 crc kubenswrapper[4719]: I1009 16:13:36.308374 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="24591ba0-cfd8-4607-ba4c-143d2167a104" 
containerName="registry-server" Oct 09 16:13:36 crc kubenswrapper[4719]: E1009 16:13:36.308393 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c8218d6-7a30-4d12-873d-db54dd2d2217" containerName="extract-utilities" Oct 09 16:13:36 crc kubenswrapper[4719]: I1009 16:13:36.308401 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c8218d6-7a30-4d12-873d-db54dd2d2217" containerName="extract-utilities" Oct 09 16:13:36 crc kubenswrapper[4719]: E1009 16:13:36.308418 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24591ba0-cfd8-4607-ba4c-143d2167a104" containerName="extract-content" Oct 09 16:13:36 crc kubenswrapper[4719]: I1009 16:13:36.308423 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="24591ba0-cfd8-4607-ba4c-143d2167a104" containerName="extract-content" Oct 09 16:13:36 crc kubenswrapper[4719]: E1009 16:13:36.308436 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24591ba0-cfd8-4607-ba4c-143d2167a104" containerName="extract-utilities" Oct 09 16:13:36 crc kubenswrapper[4719]: I1009 16:13:36.308441 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="24591ba0-cfd8-4607-ba4c-143d2167a104" containerName="extract-utilities" Oct 09 16:13:36 crc kubenswrapper[4719]: I1009 16:13:36.308635 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="24591ba0-cfd8-4607-ba4c-143d2167a104" containerName="registry-server" Oct 09 16:13:36 crc kubenswrapper[4719]: I1009 16:13:36.308650 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c8218d6-7a30-4d12-873d-db54dd2d2217" containerName="registry-server" Oct 09 16:13:36 crc kubenswrapper[4719]: I1009 16:13:36.310222 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vkrw2" Oct 09 16:13:36 crc kubenswrapper[4719]: I1009 16:13:36.325730 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vkrw2"] Oct 09 16:13:36 crc kubenswrapper[4719]: I1009 16:13:36.353240 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pglll\" (UniqueName: \"kubernetes.io/projected/c2f26c10-395e-4f1a-b76a-25c9b10654ab-kube-api-access-pglll\") pod \"community-operators-vkrw2\" (UID: \"c2f26c10-395e-4f1a-b76a-25c9b10654ab\") " pod="openshift-marketplace/community-operators-vkrw2" Oct 09 16:13:36 crc kubenswrapper[4719]: I1009 16:13:36.353371 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2f26c10-395e-4f1a-b76a-25c9b10654ab-catalog-content\") pod \"community-operators-vkrw2\" (UID: \"c2f26c10-395e-4f1a-b76a-25c9b10654ab\") " pod="openshift-marketplace/community-operators-vkrw2" Oct 09 16:13:36 crc kubenswrapper[4719]: I1009 16:13:36.353520 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2f26c10-395e-4f1a-b76a-25c9b10654ab-utilities\") pod \"community-operators-vkrw2\" (UID: \"c2f26c10-395e-4f1a-b76a-25c9b10654ab\") " pod="openshift-marketplace/community-operators-vkrw2" Oct 09 16:13:36 crc kubenswrapper[4719]: I1009 16:13:36.455707 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2f26c10-395e-4f1a-b76a-25c9b10654ab-utilities\") pod \"community-operators-vkrw2\" (UID: \"c2f26c10-395e-4f1a-b76a-25c9b10654ab\") " pod="openshift-marketplace/community-operators-vkrw2" Oct 09 16:13:36 crc kubenswrapper[4719]: I1009 16:13:36.455891 4719 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pglll\" (UniqueName: \"kubernetes.io/projected/c2f26c10-395e-4f1a-b76a-25c9b10654ab-kube-api-access-pglll\") pod \"community-operators-vkrw2\" (UID: \"c2f26c10-395e-4f1a-b76a-25c9b10654ab\") " pod="openshift-marketplace/community-operators-vkrw2" Oct 09 16:13:36 crc kubenswrapper[4719]: I1009 16:13:36.456015 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2f26c10-395e-4f1a-b76a-25c9b10654ab-catalog-content\") pod \"community-operators-vkrw2\" (UID: \"c2f26c10-395e-4f1a-b76a-25c9b10654ab\") " pod="openshift-marketplace/community-operators-vkrw2" Oct 09 16:13:36 crc kubenswrapper[4719]: I1009 16:13:36.456595 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2f26c10-395e-4f1a-b76a-25c9b10654ab-utilities\") pod \"community-operators-vkrw2\" (UID: \"c2f26c10-395e-4f1a-b76a-25c9b10654ab\") " pod="openshift-marketplace/community-operators-vkrw2" Oct 09 16:13:36 crc kubenswrapper[4719]: I1009 16:13:36.456992 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2f26c10-395e-4f1a-b76a-25c9b10654ab-catalog-content\") pod \"community-operators-vkrw2\" (UID: \"c2f26c10-395e-4f1a-b76a-25c9b10654ab\") " pod="openshift-marketplace/community-operators-vkrw2" Oct 09 16:13:36 crc kubenswrapper[4719]: I1009 16:13:36.478012 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pglll\" (UniqueName: \"kubernetes.io/projected/c2f26c10-395e-4f1a-b76a-25c9b10654ab-kube-api-access-pglll\") pod \"community-operators-vkrw2\" (UID: \"c2f26c10-395e-4f1a-b76a-25c9b10654ab\") " pod="openshift-marketplace/community-operators-vkrw2" Oct 09 16:13:36 crc kubenswrapper[4719]: I1009 16:13:36.642408 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vkrw2" Oct 09 16:13:37 crc kubenswrapper[4719]: I1009 16:13:37.233869 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vkrw2"] Oct 09 16:13:38 crc kubenswrapper[4719]: I1009 16:13:38.112487 4719 generic.go:334] "Generic (PLEG): container finished" podID="c2f26c10-395e-4f1a-b76a-25c9b10654ab" containerID="132a6ffcbe958f806a498022fd64bced37e80dd33e854737ff5ab76390a552c5" exitCode=0 Oct 09 16:13:38 crc kubenswrapper[4719]: I1009 16:13:38.112772 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkrw2" event={"ID":"c2f26c10-395e-4f1a-b76a-25c9b10654ab","Type":"ContainerDied","Data":"132a6ffcbe958f806a498022fd64bced37e80dd33e854737ff5ab76390a552c5"} Oct 09 16:13:38 crc kubenswrapper[4719]: I1009 16:13:38.112798 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkrw2" event={"ID":"c2f26c10-395e-4f1a-b76a-25c9b10654ab","Type":"ContainerStarted","Data":"fe37d43ed41c1a985ff8c462bceb170e1eb43a56f33743c0c9d96b01d2e84dc8"} Oct 09 16:13:38 crc kubenswrapper[4719]: I1009 16:13:38.118182 4719 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 16:13:40 crc kubenswrapper[4719]: I1009 16:13:40.130847 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkrw2" event={"ID":"c2f26c10-395e-4f1a-b76a-25c9b10654ab","Type":"ContainerStarted","Data":"1f316860dbef0a9f99e0fd9b64a0e44f76cc171d7da823b5d2e71e7575272cbf"} Oct 09 16:13:40 crc kubenswrapper[4719]: I1009 16:13:40.516860 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fs8pv"] Oct 09 16:13:40 crc kubenswrapper[4719]: I1009 16:13:40.519554 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fs8pv" Oct 09 16:13:40 crc kubenswrapper[4719]: I1009 16:13:40.529318 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fs8pv"] Oct 09 16:13:40 crc kubenswrapper[4719]: I1009 16:13:40.566795 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/740b0123-1951-4062-b412-8f8b669804d0-utilities\") pod \"redhat-operators-fs8pv\" (UID: \"740b0123-1951-4062-b412-8f8b669804d0\") " pod="openshift-marketplace/redhat-operators-fs8pv" Oct 09 16:13:40 crc kubenswrapper[4719]: I1009 16:13:40.566932 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mss6c\" (UniqueName: \"kubernetes.io/projected/740b0123-1951-4062-b412-8f8b669804d0-kube-api-access-mss6c\") pod \"redhat-operators-fs8pv\" (UID: \"740b0123-1951-4062-b412-8f8b669804d0\") " pod="openshift-marketplace/redhat-operators-fs8pv" Oct 09 16:13:40 crc kubenswrapper[4719]: I1009 16:13:40.567190 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/740b0123-1951-4062-b412-8f8b669804d0-catalog-content\") pod \"redhat-operators-fs8pv\" (UID: \"740b0123-1951-4062-b412-8f8b669804d0\") " pod="openshift-marketplace/redhat-operators-fs8pv" Oct 09 16:13:40 crc kubenswrapper[4719]: I1009 16:13:40.670135 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/740b0123-1951-4062-b412-8f8b669804d0-utilities\") pod \"redhat-operators-fs8pv\" (UID: \"740b0123-1951-4062-b412-8f8b669804d0\") " pod="openshift-marketplace/redhat-operators-fs8pv" Oct 09 16:13:40 crc kubenswrapper[4719]: I1009 16:13:40.670285 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-mss6c\" (UniqueName: \"kubernetes.io/projected/740b0123-1951-4062-b412-8f8b669804d0-kube-api-access-mss6c\") pod \"redhat-operators-fs8pv\" (UID: \"740b0123-1951-4062-b412-8f8b669804d0\") " pod="openshift-marketplace/redhat-operators-fs8pv" Oct 09 16:13:40 crc kubenswrapper[4719]: I1009 16:13:40.670382 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/740b0123-1951-4062-b412-8f8b669804d0-catalog-content\") pod \"redhat-operators-fs8pv\" (UID: \"740b0123-1951-4062-b412-8f8b669804d0\") " pod="openshift-marketplace/redhat-operators-fs8pv" Oct 09 16:13:40 crc kubenswrapper[4719]: I1009 16:13:40.670699 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/740b0123-1951-4062-b412-8f8b669804d0-utilities\") pod \"redhat-operators-fs8pv\" (UID: \"740b0123-1951-4062-b412-8f8b669804d0\") " pod="openshift-marketplace/redhat-operators-fs8pv" Oct 09 16:13:40 crc kubenswrapper[4719]: I1009 16:13:40.670759 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/740b0123-1951-4062-b412-8f8b669804d0-catalog-content\") pod \"redhat-operators-fs8pv\" (UID: \"740b0123-1951-4062-b412-8f8b669804d0\") " pod="openshift-marketplace/redhat-operators-fs8pv" Oct 09 16:13:40 crc kubenswrapper[4719]: I1009 16:13:40.690140 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mss6c\" (UniqueName: \"kubernetes.io/projected/740b0123-1951-4062-b412-8f8b669804d0-kube-api-access-mss6c\") pod \"redhat-operators-fs8pv\" (UID: \"740b0123-1951-4062-b412-8f8b669804d0\") " pod="openshift-marketplace/redhat-operators-fs8pv" Oct 09 16:13:40 crc kubenswrapper[4719]: I1009 16:13:40.847901 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fs8pv" Oct 09 16:13:41 crc kubenswrapper[4719]: W1009 16:13:41.376080 4719 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod740b0123_1951_4062_b412_8f8b669804d0.slice/crio-10f85aeb2b737fabdcb65e9ffb494acaa268123146cb74b79ec0b84b679078a2 WatchSource:0}: Error finding container 10f85aeb2b737fabdcb65e9ffb494acaa268123146cb74b79ec0b84b679078a2: Status 404 returned error can't find the container with id 10f85aeb2b737fabdcb65e9ffb494acaa268123146cb74b79ec0b84b679078a2 Oct 09 16:13:41 crc kubenswrapper[4719]: I1009 16:13:41.389046 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fs8pv"] Oct 09 16:13:42 crc kubenswrapper[4719]: I1009 16:13:42.155962 4719 generic.go:334] "Generic (PLEG): container finished" podID="740b0123-1951-4062-b412-8f8b669804d0" containerID="940a9a9f1d79059d7fc816838bafaee8a7247d6f55bfffadd55bd789c128055c" exitCode=0 Oct 09 16:13:42 crc kubenswrapper[4719]: I1009 16:13:42.156076 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fs8pv" event={"ID":"740b0123-1951-4062-b412-8f8b669804d0","Type":"ContainerDied","Data":"940a9a9f1d79059d7fc816838bafaee8a7247d6f55bfffadd55bd789c128055c"} Oct 09 16:13:42 crc kubenswrapper[4719]: I1009 16:13:42.156476 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fs8pv" event={"ID":"740b0123-1951-4062-b412-8f8b669804d0","Type":"ContainerStarted","Data":"10f85aeb2b737fabdcb65e9ffb494acaa268123146cb74b79ec0b84b679078a2"} Oct 09 16:13:45 crc kubenswrapper[4719]: I1009 16:13:45.189188 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fs8pv" 
event={"ID":"740b0123-1951-4062-b412-8f8b669804d0","Type":"ContainerStarted","Data":"4a10700208ec9b1bb84b84ac5eeae3090615925c3c6a21720cae6ee453e4e796"} Oct 09 16:13:45 crc kubenswrapper[4719]: I1009 16:13:45.193413 4719 generic.go:334] "Generic (PLEG): container finished" podID="c2f26c10-395e-4f1a-b76a-25c9b10654ab" containerID="1f316860dbef0a9f99e0fd9b64a0e44f76cc171d7da823b5d2e71e7575272cbf" exitCode=0 Oct 09 16:13:45 crc kubenswrapper[4719]: I1009 16:13:45.193445 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkrw2" event={"ID":"c2f26c10-395e-4f1a-b76a-25c9b10654ab","Type":"ContainerDied","Data":"1f316860dbef0a9f99e0fd9b64a0e44f76cc171d7da823b5d2e71e7575272cbf"} Oct 09 16:13:46 crc kubenswrapper[4719]: I1009 16:13:46.212985 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkrw2" event={"ID":"c2f26c10-395e-4f1a-b76a-25c9b10654ab","Type":"ContainerStarted","Data":"02000df75b59901209b8c78bf27ca3719f8fd90dda21a6b6272993b6bc327d0f"} Oct 09 16:13:46 crc kubenswrapper[4719]: I1009 16:13:46.240580 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vkrw2" podStartSLOduration=2.561368292 podStartE2EDuration="10.240554314s" podCreationTimestamp="2025-10-09 16:13:36 +0000 UTC" firstStartedPulling="2025-10-09 16:13:38.117970889 +0000 UTC m=+3323.627682174" lastFinishedPulling="2025-10-09 16:13:45.797156901 +0000 UTC m=+3331.306868196" observedRunningTime="2025-10-09 16:13:46.231595087 +0000 UTC m=+3331.741306372" watchObservedRunningTime="2025-10-09 16:13:46.240554314 +0000 UTC m=+3331.750265599" Oct 09 16:13:46 crc kubenswrapper[4719]: I1009 16:13:46.642819 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vkrw2" Oct 09 16:13:46 crc kubenswrapper[4719]: I1009 16:13:46.643267 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-vkrw2" Oct 09 16:13:47 crc kubenswrapper[4719]: I1009 16:13:47.161595 4719 scope.go:117] "RemoveContainer" containerID="e3e34c67e5c773c761040d5a93136e182c5c1d1a01f09e8ee183de43fdb02e6e" Oct 09 16:13:47 crc kubenswrapper[4719]: E1009 16:13:47.161910 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:13:47 crc kubenswrapper[4719]: I1009 16:13:47.691890 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-vkrw2" podUID="c2f26c10-395e-4f1a-b76a-25c9b10654ab" containerName="registry-server" probeResult="failure" output=< Oct 09 16:13:47 crc kubenswrapper[4719]: timeout: failed to connect service ":50051" within 1s Oct 09 16:13:47 crc kubenswrapper[4719]: > Oct 09 16:13:55 crc kubenswrapper[4719]: I1009 16:13:55.291907 4719 generic.go:334] "Generic (PLEG): container finished" podID="740b0123-1951-4062-b412-8f8b669804d0" containerID="4a10700208ec9b1bb84b84ac5eeae3090615925c3c6a21720cae6ee453e4e796" exitCode=0 Oct 09 16:13:55 crc kubenswrapper[4719]: I1009 16:13:55.292092 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fs8pv" event={"ID":"740b0123-1951-4062-b412-8f8b669804d0","Type":"ContainerDied","Data":"4a10700208ec9b1bb84b84ac5eeae3090615925c3c6a21720cae6ee453e4e796"} Oct 09 16:13:56 crc kubenswrapper[4719]: I1009 16:13:56.302084 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fs8pv" 
event={"ID":"740b0123-1951-4062-b412-8f8b669804d0","Type":"ContainerStarted","Data":"eb5525ffb5faff4a47c76c3ffdf55ebf13007d75fb264eea24bb3aacb70237db"} Oct 09 16:13:56 crc kubenswrapper[4719]: I1009 16:13:56.319096 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fs8pv" podStartSLOduration=2.700806244 podStartE2EDuration="16.31907658s" podCreationTimestamp="2025-10-09 16:13:40 +0000 UTC" firstStartedPulling="2025-10-09 16:13:42.159585637 +0000 UTC m=+3327.669296922" lastFinishedPulling="2025-10-09 16:13:55.777855973 +0000 UTC m=+3341.287567258" observedRunningTime="2025-10-09 16:13:56.316142696 +0000 UTC m=+3341.825853991" watchObservedRunningTime="2025-10-09 16:13:56.31907658 +0000 UTC m=+3341.828787865" Oct 09 16:13:56 crc kubenswrapper[4719]: I1009 16:13:56.695961 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vkrw2" Oct 09 16:13:56 crc kubenswrapper[4719]: I1009 16:13:56.753505 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vkrw2" Oct 09 16:13:57 crc kubenswrapper[4719]: I1009 16:13:57.535505 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vkrw2"] Oct 09 16:13:58 crc kubenswrapper[4719]: I1009 16:13:58.319082 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vkrw2" podUID="c2f26c10-395e-4f1a-b76a-25c9b10654ab" containerName="registry-server" containerID="cri-o://02000df75b59901209b8c78bf27ca3719f8fd90dda21a6b6272993b6bc327d0f" gracePeriod=2 Oct 09 16:13:58 crc kubenswrapper[4719]: I1009 16:13:58.828901 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vkrw2" Oct 09 16:13:58 crc kubenswrapper[4719]: I1009 16:13:58.959470 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2f26c10-395e-4f1a-b76a-25c9b10654ab-utilities\") pod \"c2f26c10-395e-4f1a-b76a-25c9b10654ab\" (UID: \"c2f26c10-395e-4f1a-b76a-25c9b10654ab\") " Oct 09 16:13:58 crc kubenswrapper[4719]: I1009 16:13:58.959572 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2f26c10-395e-4f1a-b76a-25c9b10654ab-catalog-content\") pod \"c2f26c10-395e-4f1a-b76a-25c9b10654ab\" (UID: \"c2f26c10-395e-4f1a-b76a-25c9b10654ab\") " Oct 09 16:13:58 crc kubenswrapper[4719]: I1009 16:13:58.959643 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pglll\" (UniqueName: \"kubernetes.io/projected/c2f26c10-395e-4f1a-b76a-25c9b10654ab-kube-api-access-pglll\") pod \"c2f26c10-395e-4f1a-b76a-25c9b10654ab\" (UID: \"c2f26c10-395e-4f1a-b76a-25c9b10654ab\") " Oct 09 16:13:58 crc kubenswrapper[4719]: I1009 16:13:58.960197 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2f26c10-395e-4f1a-b76a-25c9b10654ab-utilities" (OuterVolumeSpecName: "utilities") pod "c2f26c10-395e-4f1a-b76a-25c9b10654ab" (UID: "c2f26c10-395e-4f1a-b76a-25c9b10654ab"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:13:58 crc kubenswrapper[4719]: I1009 16:13:58.960496 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2f26c10-395e-4f1a-b76a-25c9b10654ab-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 16:13:58 crc kubenswrapper[4719]: I1009 16:13:58.966746 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2f26c10-395e-4f1a-b76a-25c9b10654ab-kube-api-access-pglll" (OuterVolumeSpecName: "kube-api-access-pglll") pod "c2f26c10-395e-4f1a-b76a-25c9b10654ab" (UID: "c2f26c10-395e-4f1a-b76a-25c9b10654ab"). InnerVolumeSpecName "kube-api-access-pglll". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 16:13:59 crc kubenswrapper[4719]: I1009 16:13:59.002422 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2f26c10-395e-4f1a-b76a-25c9b10654ab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2f26c10-395e-4f1a-b76a-25c9b10654ab" (UID: "c2f26c10-395e-4f1a-b76a-25c9b10654ab"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:13:59 crc kubenswrapper[4719]: I1009 16:13:59.063022 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2f26c10-395e-4f1a-b76a-25c9b10654ab-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 16:13:59 crc kubenswrapper[4719]: I1009 16:13:59.063061 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pglll\" (UniqueName: \"kubernetes.io/projected/c2f26c10-395e-4f1a-b76a-25c9b10654ab-kube-api-access-pglll\") on node \"crc\" DevicePath \"\"" Oct 09 16:13:59 crc kubenswrapper[4719]: I1009 16:13:59.335578 4719 generic.go:334] "Generic (PLEG): container finished" podID="c2f26c10-395e-4f1a-b76a-25c9b10654ab" containerID="02000df75b59901209b8c78bf27ca3719f8fd90dda21a6b6272993b6bc327d0f" exitCode=0 Oct 09 16:13:59 crc kubenswrapper[4719]: I1009 16:13:59.335641 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkrw2" event={"ID":"c2f26c10-395e-4f1a-b76a-25c9b10654ab","Type":"ContainerDied","Data":"02000df75b59901209b8c78bf27ca3719f8fd90dda21a6b6272993b6bc327d0f"} Oct 09 16:13:59 crc kubenswrapper[4719]: I1009 16:13:59.335677 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkrw2" event={"ID":"c2f26c10-395e-4f1a-b76a-25c9b10654ab","Type":"ContainerDied","Data":"fe37d43ed41c1a985ff8c462bceb170e1eb43a56f33743c0c9d96b01d2e84dc8"} Oct 09 16:13:59 crc kubenswrapper[4719]: I1009 16:13:59.335702 4719 scope.go:117] "RemoveContainer" containerID="02000df75b59901209b8c78bf27ca3719f8fd90dda21a6b6272993b6bc327d0f" Oct 09 16:13:59 crc kubenswrapper[4719]: I1009 16:13:59.335912 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vkrw2" Oct 09 16:13:59 crc kubenswrapper[4719]: I1009 16:13:59.365994 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vkrw2"] Oct 09 16:13:59 crc kubenswrapper[4719]: I1009 16:13:59.367525 4719 scope.go:117] "RemoveContainer" containerID="1f316860dbef0a9f99e0fd9b64a0e44f76cc171d7da823b5d2e71e7575272cbf" Oct 09 16:13:59 crc kubenswrapper[4719]: I1009 16:13:59.378990 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vkrw2"] Oct 09 16:13:59 crc kubenswrapper[4719]: I1009 16:13:59.388512 4719 scope.go:117] "RemoveContainer" containerID="132a6ffcbe958f806a498022fd64bced37e80dd33e854737ff5ab76390a552c5" Oct 09 16:13:59 crc kubenswrapper[4719]: I1009 16:13:59.438532 4719 scope.go:117] "RemoveContainer" containerID="02000df75b59901209b8c78bf27ca3719f8fd90dda21a6b6272993b6bc327d0f" Oct 09 16:13:59 crc kubenswrapper[4719]: E1009 16:13:59.439085 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02000df75b59901209b8c78bf27ca3719f8fd90dda21a6b6272993b6bc327d0f\": container with ID starting with 02000df75b59901209b8c78bf27ca3719f8fd90dda21a6b6272993b6bc327d0f not found: ID does not exist" containerID="02000df75b59901209b8c78bf27ca3719f8fd90dda21a6b6272993b6bc327d0f" Oct 09 16:13:59 crc kubenswrapper[4719]: I1009 16:13:59.439153 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02000df75b59901209b8c78bf27ca3719f8fd90dda21a6b6272993b6bc327d0f"} err="failed to get container status \"02000df75b59901209b8c78bf27ca3719f8fd90dda21a6b6272993b6bc327d0f\": rpc error: code = NotFound desc = could not find container \"02000df75b59901209b8c78bf27ca3719f8fd90dda21a6b6272993b6bc327d0f\": container with ID starting with 02000df75b59901209b8c78bf27ca3719f8fd90dda21a6b6272993b6bc327d0f not 
found: ID does not exist" Oct 09 16:13:59 crc kubenswrapper[4719]: I1009 16:13:59.439203 4719 scope.go:117] "RemoveContainer" containerID="1f316860dbef0a9f99e0fd9b64a0e44f76cc171d7da823b5d2e71e7575272cbf" Oct 09 16:13:59 crc kubenswrapper[4719]: E1009 16:13:59.439735 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f316860dbef0a9f99e0fd9b64a0e44f76cc171d7da823b5d2e71e7575272cbf\": container with ID starting with 1f316860dbef0a9f99e0fd9b64a0e44f76cc171d7da823b5d2e71e7575272cbf not found: ID does not exist" containerID="1f316860dbef0a9f99e0fd9b64a0e44f76cc171d7da823b5d2e71e7575272cbf" Oct 09 16:13:59 crc kubenswrapper[4719]: I1009 16:13:59.439779 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f316860dbef0a9f99e0fd9b64a0e44f76cc171d7da823b5d2e71e7575272cbf"} err="failed to get container status \"1f316860dbef0a9f99e0fd9b64a0e44f76cc171d7da823b5d2e71e7575272cbf\": rpc error: code = NotFound desc = could not find container \"1f316860dbef0a9f99e0fd9b64a0e44f76cc171d7da823b5d2e71e7575272cbf\": container with ID starting with 1f316860dbef0a9f99e0fd9b64a0e44f76cc171d7da823b5d2e71e7575272cbf not found: ID does not exist" Oct 09 16:13:59 crc kubenswrapper[4719]: I1009 16:13:59.439797 4719 scope.go:117] "RemoveContainer" containerID="132a6ffcbe958f806a498022fd64bced37e80dd33e854737ff5ab76390a552c5" Oct 09 16:13:59 crc kubenswrapper[4719]: E1009 16:13:59.440144 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"132a6ffcbe958f806a498022fd64bced37e80dd33e854737ff5ab76390a552c5\": container with ID starting with 132a6ffcbe958f806a498022fd64bced37e80dd33e854737ff5ab76390a552c5 not found: ID does not exist" containerID="132a6ffcbe958f806a498022fd64bced37e80dd33e854737ff5ab76390a552c5" Oct 09 16:13:59 crc kubenswrapper[4719]: I1009 16:13:59.440191 4719 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"132a6ffcbe958f806a498022fd64bced37e80dd33e854737ff5ab76390a552c5"} err="failed to get container status \"132a6ffcbe958f806a498022fd64bced37e80dd33e854737ff5ab76390a552c5\": rpc error: code = NotFound desc = could not find container \"132a6ffcbe958f806a498022fd64bced37e80dd33e854737ff5ab76390a552c5\": container with ID starting with 132a6ffcbe958f806a498022fd64bced37e80dd33e854737ff5ab76390a552c5 not found: ID does not exist" Oct 09 16:14:00 crc kubenswrapper[4719]: I1009 16:14:00.161324 4719 scope.go:117] "RemoveContainer" containerID="e3e34c67e5c773c761040d5a93136e182c5c1d1a01f09e8ee183de43fdb02e6e" Oct 09 16:14:00 crc kubenswrapper[4719]: E1009 16:14:00.161649 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:14:00 crc kubenswrapper[4719]: I1009 16:14:00.848789 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fs8pv" Oct 09 16:14:00 crc kubenswrapper[4719]: I1009 16:14:00.849233 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fs8pv" Oct 09 16:14:00 crc kubenswrapper[4719]: I1009 16:14:00.897426 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fs8pv" Oct 09 16:14:01 crc kubenswrapper[4719]: I1009 16:14:01.173418 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2f26c10-395e-4f1a-b76a-25c9b10654ab" 
path="/var/lib/kubelet/pods/c2f26c10-395e-4f1a-b76a-25c9b10654ab/volumes" Oct 09 16:14:01 crc kubenswrapper[4719]: I1009 16:14:01.395500 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fs8pv" Oct 09 16:14:01 crc kubenswrapper[4719]: I1009 16:14:01.936076 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fs8pv"] Oct 09 16:14:03 crc kubenswrapper[4719]: I1009 16:14:03.370434 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fs8pv" podUID="740b0123-1951-4062-b412-8f8b669804d0" containerName="registry-server" containerID="cri-o://eb5525ffb5faff4a47c76c3ffdf55ebf13007d75fb264eea24bb3aacb70237db" gracePeriod=2 Oct 09 16:14:03 crc kubenswrapper[4719]: I1009 16:14:03.994045 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fs8pv" Oct 09 16:14:04 crc kubenswrapper[4719]: I1009 16:14:04.169778 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/740b0123-1951-4062-b412-8f8b669804d0-utilities\") pod \"740b0123-1951-4062-b412-8f8b669804d0\" (UID: \"740b0123-1951-4062-b412-8f8b669804d0\") " Oct 09 16:14:04 crc kubenswrapper[4719]: I1009 16:14:04.169875 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/740b0123-1951-4062-b412-8f8b669804d0-catalog-content\") pod \"740b0123-1951-4062-b412-8f8b669804d0\" (UID: \"740b0123-1951-4062-b412-8f8b669804d0\") " Oct 09 16:14:04 crc kubenswrapper[4719]: I1009 16:14:04.170616 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/740b0123-1951-4062-b412-8f8b669804d0-utilities" (OuterVolumeSpecName: "utilities") pod "740b0123-1951-4062-b412-8f8b669804d0" (UID: 
"740b0123-1951-4062-b412-8f8b669804d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:14:04 crc kubenswrapper[4719]: I1009 16:14:04.171115 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mss6c\" (UniqueName: \"kubernetes.io/projected/740b0123-1951-4062-b412-8f8b669804d0-kube-api-access-mss6c\") pod \"740b0123-1951-4062-b412-8f8b669804d0\" (UID: \"740b0123-1951-4062-b412-8f8b669804d0\") " Oct 09 16:14:04 crc kubenswrapper[4719]: I1009 16:14:04.171784 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/740b0123-1951-4062-b412-8f8b669804d0-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 16:14:04 crc kubenswrapper[4719]: I1009 16:14:04.177723 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/740b0123-1951-4062-b412-8f8b669804d0-kube-api-access-mss6c" (OuterVolumeSpecName: "kube-api-access-mss6c") pod "740b0123-1951-4062-b412-8f8b669804d0" (UID: "740b0123-1951-4062-b412-8f8b669804d0"). InnerVolumeSpecName "kube-api-access-mss6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 16:14:04 crc kubenswrapper[4719]: I1009 16:14:04.262471 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/740b0123-1951-4062-b412-8f8b669804d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "740b0123-1951-4062-b412-8f8b669804d0" (UID: "740b0123-1951-4062-b412-8f8b669804d0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:14:04 crc kubenswrapper[4719]: I1009 16:14:04.276284 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/740b0123-1951-4062-b412-8f8b669804d0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 16:14:04 crc kubenswrapper[4719]: I1009 16:14:04.276318 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mss6c\" (UniqueName: \"kubernetes.io/projected/740b0123-1951-4062-b412-8f8b669804d0-kube-api-access-mss6c\") on node \"crc\" DevicePath \"\"" Oct 09 16:14:04 crc kubenswrapper[4719]: I1009 16:14:04.395873 4719 generic.go:334] "Generic (PLEG): container finished" podID="740b0123-1951-4062-b412-8f8b669804d0" containerID="eb5525ffb5faff4a47c76c3ffdf55ebf13007d75fb264eea24bb3aacb70237db" exitCode=0 Oct 09 16:14:04 crc kubenswrapper[4719]: I1009 16:14:04.395922 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fs8pv" event={"ID":"740b0123-1951-4062-b412-8f8b669804d0","Type":"ContainerDied","Data":"eb5525ffb5faff4a47c76c3ffdf55ebf13007d75fb264eea24bb3aacb70237db"} Oct 09 16:14:04 crc kubenswrapper[4719]: I1009 16:14:04.395952 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fs8pv" event={"ID":"740b0123-1951-4062-b412-8f8b669804d0","Type":"ContainerDied","Data":"10f85aeb2b737fabdcb65e9ffb494acaa268123146cb74b79ec0b84b679078a2"} Oct 09 16:14:04 crc kubenswrapper[4719]: I1009 16:14:04.395969 4719 scope.go:117] "RemoveContainer" containerID="eb5525ffb5faff4a47c76c3ffdf55ebf13007d75fb264eea24bb3aacb70237db" Oct 09 16:14:04 crc kubenswrapper[4719]: I1009 16:14:04.396084 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fs8pv" Oct 09 16:14:04 crc kubenswrapper[4719]: I1009 16:14:04.432460 4719 scope.go:117] "RemoveContainer" containerID="4a10700208ec9b1bb84b84ac5eeae3090615925c3c6a21720cae6ee453e4e796" Oct 09 16:14:04 crc kubenswrapper[4719]: I1009 16:14:04.437103 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fs8pv"] Oct 09 16:14:04 crc kubenswrapper[4719]: I1009 16:14:04.465151 4719 scope.go:117] "RemoveContainer" containerID="940a9a9f1d79059d7fc816838bafaee8a7247d6f55bfffadd55bd789c128055c" Oct 09 16:14:04 crc kubenswrapper[4719]: I1009 16:14:04.469973 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fs8pv"] Oct 09 16:14:04 crc kubenswrapper[4719]: I1009 16:14:04.512278 4719 scope.go:117] "RemoveContainer" containerID="eb5525ffb5faff4a47c76c3ffdf55ebf13007d75fb264eea24bb3aacb70237db" Oct 09 16:14:04 crc kubenswrapper[4719]: E1009 16:14:04.512754 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb5525ffb5faff4a47c76c3ffdf55ebf13007d75fb264eea24bb3aacb70237db\": container with ID starting with eb5525ffb5faff4a47c76c3ffdf55ebf13007d75fb264eea24bb3aacb70237db not found: ID does not exist" containerID="eb5525ffb5faff4a47c76c3ffdf55ebf13007d75fb264eea24bb3aacb70237db" Oct 09 16:14:04 crc kubenswrapper[4719]: I1009 16:14:04.512804 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb5525ffb5faff4a47c76c3ffdf55ebf13007d75fb264eea24bb3aacb70237db"} err="failed to get container status \"eb5525ffb5faff4a47c76c3ffdf55ebf13007d75fb264eea24bb3aacb70237db\": rpc error: code = NotFound desc = could not find container \"eb5525ffb5faff4a47c76c3ffdf55ebf13007d75fb264eea24bb3aacb70237db\": container with ID starting with eb5525ffb5faff4a47c76c3ffdf55ebf13007d75fb264eea24bb3aacb70237db not found: ID does 
not exist" Oct 09 16:14:04 crc kubenswrapper[4719]: I1009 16:14:04.512833 4719 scope.go:117] "RemoveContainer" containerID="4a10700208ec9b1bb84b84ac5eeae3090615925c3c6a21720cae6ee453e4e796" Oct 09 16:14:04 crc kubenswrapper[4719]: E1009 16:14:04.513156 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a10700208ec9b1bb84b84ac5eeae3090615925c3c6a21720cae6ee453e4e796\": container with ID starting with 4a10700208ec9b1bb84b84ac5eeae3090615925c3c6a21720cae6ee453e4e796 not found: ID does not exist" containerID="4a10700208ec9b1bb84b84ac5eeae3090615925c3c6a21720cae6ee453e4e796" Oct 09 16:14:04 crc kubenswrapper[4719]: I1009 16:14:04.513186 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a10700208ec9b1bb84b84ac5eeae3090615925c3c6a21720cae6ee453e4e796"} err="failed to get container status \"4a10700208ec9b1bb84b84ac5eeae3090615925c3c6a21720cae6ee453e4e796\": rpc error: code = NotFound desc = could not find container \"4a10700208ec9b1bb84b84ac5eeae3090615925c3c6a21720cae6ee453e4e796\": container with ID starting with 4a10700208ec9b1bb84b84ac5eeae3090615925c3c6a21720cae6ee453e4e796 not found: ID does not exist" Oct 09 16:14:04 crc kubenswrapper[4719]: I1009 16:14:04.513200 4719 scope.go:117] "RemoveContainer" containerID="940a9a9f1d79059d7fc816838bafaee8a7247d6f55bfffadd55bd789c128055c" Oct 09 16:14:04 crc kubenswrapper[4719]: E1009 16:14:04.513682 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"940a9a9f1d79059d7fc816838bafaee8a7247d6f55bfffadd55bd789c128055c\": container with ID starting with 940a9a9f1d79059d7fc816838bafaee8a7247d6f55bfffadd55bd789c128055c not found: ID does not exist" containerID="940a9a9f1d79059d7fc816838bafaee8a7247d6f55bfffadd55bd789c128055c" Oct 09 16:14:04 crc kubenswrapper[4719]: I1009 16:14:04.513701 4719 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"940a9a9f1d79059d7fc816838bafaee8a7247d6f55bfffadd55bd789c128055c"} err="failed to get container status \"940a9a9f1d79059d7fc816838bafaee8a7247d6f55bfffadd55bd789c128055c\": rpc error: code = NotFound desc = could not find container \"940a9a9f1d79059d7fc816838bafaee8a7247d6f55bfffadd55bd789c128055c\": container with ID starting with 940a9a9f1d79059d7fc816838bafaee8a7247d6f55bfffadd55bd789c128055c not found: ID does not exist" Oct 09 16:14:05 crc kubenswrapper[4719]: I1009 16:14:05.175275 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="740b0123-1951-4062-b412-8f8b669804d0" path="/var/lib/kubelet/pods/740b0123-1951-4062-b412-8f8b669804d0/volumes" Oct 09 16:14:15 crc kubenswrapper[4719]: I1009 16:14:15.168825 4719 scope.go:117] "RemoveContainer" containerID="e3e34c67e5c773c761040d5a93136e182c5c1d1a01f09e8ee183de43fdb02e6e" Oct 09 16:14:15 crc kubenswrapper[4719]: E1009 16:14:15.169813 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:14:30 crc kubenswrapper[4719]: I1009 16:14:30.161837 4719 scope.go:117] "RemoveContainer" containerID="e3e34c67e5c773c761040d5a93136e182c5c1d1a01f09e8ee183de43fdb02e6e" Oct 09 16:14:30 crc kubenswrapper[4719]: E1009 16:14:30.162735 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:14:44 crc kubenswrapper[4719]: I1009 16:14:44.161543 4719 scope.go:117] "RemoveContainer" containerID="e3e34c67e5c773c761040d5a93136e182c5c1d1a01f09e8ee183de43fdb02e6e" Oct 09 16:14:44 crc kubenswrapper[4719]: E1009 16:14:44.162530 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:14:58 crc kubenswrapper[4719]: I1009 16:14:58.162795 4719 scope.go:117] "RemoveContainer" containerID="e3e34c67e5c773c761040d5a93136e182c5c1d1a01f09e8ee183de43fdb02e6e" Oct 09 16:14:58 crc kubenswrapper[4719]: E1009 16:14:58.164162 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:15:00 crc kubenswrapper[4719]: I1009 16:15:00.184281 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333775-ks2hw"] Oct 09 16:15:00 crc kubenswrapper[4719]: E1009 16:15:00.185194 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2f26c10-395e-4f1a-b76a-25c9b10654ab" containerName="extract-utilities" Oct 09 16:15:00 crc kubenswrapper[4719]: I1009 16:15:00.185216 4719 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c2f26c10-395e-4f1a-b76a-25c9b10654ab" containerName="extract-utilities" Oct 09 16:15:00 crc kubenswrapper[4719]: E1009 16:15:00.185233 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="740b0123-1951-4062-b412-8f8b669804d0" containerName="extract-utilities" Oct 09 16:15:00 crc kubenswrapper[4719]: I1009 16:15:00.185240 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="740b0123-1951-4062-b412-8f8b669804d0" containerName="extract-utilities" Oct 09 16:15:00 crc kubenswrapper[4719]: E1009 16:15:00.185265 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2f26c10-395e-4f1a-b76a-25c9b10654ab" containerName="registry-server" Oct 09 16:15:00 crc kubenswrapper[4719]: I1009 16:15:00.185273 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2f26c10-395e-4f1a-b76a-25c9b10654ab" containerName="registry-server" Oct 09 16:15:00 crc kubenswrapper[4719]: E1009 16:15:00.185295 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="740b0123-1951-4062-b412-8f8b669804d0" containerName="extract-content" Oct 09 16:15:00 crc kubenswrapper[4719]: I1009 16:15:00.185302 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="740b0123-1951-4062-b412-8f8b669804d0" containerName="extract-content" Oct 09 16:15:00 crc kubenswrapper[4719]: E1009 16:15:00.185318 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="740b0123-1951-4062-b412-8f8b669804d0" containerName="registry-server" Oct 09 16:15:00 crc kubenswrapper[4719]: I1009 16:15:00.185325 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="740b0123-1951-4062-b412-8f8b669804d0" containerName="registry-server" Oct 09 16:15:00 crc kubenswrapper[4719]: E1009 16:15:00.185371 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2f26c10-395e-4f1a-b76a-25c9b10654ab" containerName="extract-content" Oct 09 16:15:00 crc kubenswrapper[4719]: I1009 16:15:00.185380 4719 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c2f26c10-395e-4f1a-b76a-25c9b10654ab" containerName="extract-content" Oct 09 16:15:00 crc kubenswrapper[4719]: I1009 16:15:00.185628 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2f26c10-395e-4f1a-b76a-25c9b10654ab" containerName="registry-server" Oct 09 16:15:00 crc kubenswrapper[4719]: I1009 16:15:00.185950 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="740b0123-1951-4062-b412-8f8b669804d0" containerName="registry-server" Oct 09 16:15:00 crc kubenswrapper[4719]: I1009 16:15:00.186774 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333775-ks2hw" Oct 09 16:15:00 crc kubenswrapper[4719]: I1009 16:15:00.189897 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 09 16:15:00 crc kubenswrapper[4719]: I1009 16:15:00.190153 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 09 16:15:00 crc kubenswrapper[4719]: I1009 16:15:00.200526 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333775-ks2hw"] Oct 09 16:15:00 crc kubenswrapper[4719]: I1009 16:15:00.291217 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/044dad3e-af09-4af6-bc36-5e0d89f1c7b3-secret-volume\") pod \"collect-profiles-29333775-ks2hw\" (UID: \"044dad3e-af09-4af6-bc36-5e0d89f1c7b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333775-ks2hw" Oct 09 16:15:00 crc kubenswrapper[4719]: I1009 16:15:00.291340 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/044dad3e-af09-4af6-bc36-5e0d89f1c7b3-config-volume\") pod \"collect-profiles-29333775-ks2hw\" (UID: \"044dad3e-af09-4af6-bc36-5e0d89f1c7b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333775-ks2hw" Oct 09 16:15:00 crc kubenswrapper[4719]: I1009 16:15:00.291439 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snbgs\" (UniqueName: \"kubernetes.io/projected/044dad3e-af09-4af6-bc36-5e0d89f1c7b3-kube-api-access-snbgs\") pod \"collect-profiles-29333775-ks2hw\" (UID: \"044dad3e-af09-4af6-bc36-5e0d89f1c7b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333775-ks2hw" Oct 09 16:15:00 crc kubenswrapper[4719]: I1009 16:15:00.392956 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/044dad3e-af09-4af6-bc36-5e0d89f1c7b3-secret-volume\") pod \"collect-profiles-29333775-ks2hw\" (UID: \"044dad3e-af09-4af6-bc36-5e0d89f1c7b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333775-ks2hw" Oct 09 16:15:00 crc kubenswrapper[4719]: I1009 16:15:00.393053 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/044dad3e-af09-4af6-bc36-5e0d89f1c7b3-config-volume\") pod \"collect-profiles-29333775-ks2hw\" (UID: \"044dad3e-af09-4af6-bc36-5e0d89f1c7b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333775-ks2hw" Oct 09 16:15:00 crc kubenswrapper[4719]: I1009 16:15:00.393104 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snbgs\" (UniqueName: \"kubernetes.io/projected/044dad3e-af09-4af6-bc36-5e0d89f1c7b3-kube-api-access-snbgs\") pod \"collect-profiles-29333775-ks2hw\" (UID: \"044dad3e-af09-4af6-bc36-5e0d89f1c7b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333775-ks2hw" Oct 09 16:15:00 crc 
kubenswrapper[4719]: I1009 16:15:00.394409 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/044dad3e-af09-4af6-bc36-5e0d89f1c7b3-config-volume\") pod \"collect-profiles-29333775-ks2hw\" (UID: \"044dad3e-af09-4af6-bc36-5e0d89f1c7b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333775-ks2hw" Oct 09 16:15:00 crc kubenswrapper[4719]: I1009 16:15:00.418620 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/044dad3e-af09-4af6-bc36-5e0d89f1c7b3-secret-volume\") pod \"collect-profiles-29333775-ks2hw\" (UID: \"044dad3e-af09-4af6-bc36-5e0d89f1c7b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333775-ks2hw" Oct 09 16:15:00 crc kubenswrapper[4719]: I1009 16:15:00.450540 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snbgs\" (UniqueName: \"kubernetes.io/projected/044dad3e-af09-4af6-bc36-5e0d89f1c7b3-kube-api-access-snbgs\") pod \"collect-profiles-29333775-ks2hw\" (UID: \"044dad3e-af09-4af6-bc36-5e0d89f1c7b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333775-ks2hw" Oct 09 16:15:00 crc kubenswrapper[4719]: I1009 16:15:00.517509 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333775-ks2hw" Oct 09 16:15:01 crc kubenswrapper[4719]: I1009 16:15:01.038990 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333775-ks2hw"] Oct 09 16:15:01 crc kubenswrapper[4719]: I1009 16:15:01.935091 4719 generic.go:334] "Generic (PLEG): container finished" podID="044dad3e-af09-4af6-bc36-5e0d89f1c7b3" containerID="1c114b175c938b52c7de9d20c67d12f5490dcbf54941ccdd8b69aa032233db68" exitCode=0 Oct 09 16:15:01 crc kubenswrapper[4719]: I1009 16:15:01.935140 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333775-ks2hw" event={"ID":"044dad3e-af09-4af6-bc36-5e0d89f1c7b3","Type":"ContainerDied","Data":"1c114b175c938b52c7de9d20c67d12f5490dcbf54941ccdd8b69aa032233db68"} Oct 09 16:15:01 crc kubenswrapper[4719]: I1009 16:15:01.935404 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333775-ks2hw" event={"ID":"044dad3e-af09-4af6-bc36-5e0d89f1c7b3","Type":"ContainerStarted","Data":"21c451288da00628ef6597c252a84ef710f56798cac631ad5f23ed6e19a78a79"} Oct 09 16:15:03 crc kubenswrapper[4719]: I1009 16:15:03.330866 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333775-ks2hw" Oct 09 16:15:03 crc kubenswrapper[4719]: I1009 16:15:03.460067 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/044dad3e-af09-4af6-bc36-5e0d89f1c7b3-config-volume\") pod \"044dad3e-af09-4af6-bc36-5e0d89f1c7b3\" (UID: \"044dad3e-af09-4af6-bc36-5e0d89f1c7b3\") " Oct 09 16:15:03 crc kubenswrapper[4719]: I1009 16:15:03.460182 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/044dad3e-af09-4af6-bc36-5e0d89f1c7b3-secret-volume\") pod \"044dad3e-af09-4af6-bc36-5e0d89f1c7b3\" (UID: \"044dad3e-af09-4af6-bc36-5e0d89f1c7b3\") " Oct 09 16:15:03 crc kubenswrapper[4719]: I1009 16:15:03.460249 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snbgs\" (UniqueName: \"kubernetes.io/projected/044dad3e-af09-4af6-bc36-5e0d89f1c7b3-kube-api-access-snbgs\") pod \"044dad3e-af09-4af6-bc36-5e0d89f1c7b3\" (UID: \"044dad3e-af09-4af6-bc36-5e0d89f1c7b3\") " Oct 09 16:15:03 crc kubenswrapper[4719]: I1009 16:15:03.460666 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/044dad3e-af09-4af6-bc36-5e0d89f1c7b3-config-volume" (OuterVolumeSpecName: "config-volume") pod "044dad3e-af09-4af6-bc36-5e0d89f1c7b3" (UID: "044dad3e-af09-4af6-bc36-5e0d89f1c7b3"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 16:15:03 crc kubenswrapper[4719]: I1009 16:15:03.462031 4719 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/044dad3e-af09-4af6-bc36-5e0d89f1c7b3-config-volume\") on node \"crc\" DevicePath \"\"" Oct 09 16:15:03 crc kubenswrapper[4719]: I1009 16:15:03.465801 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/044dad3e-af09-4af6-bc36-5e0d89f1c7b3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "044dad3e-af09-4af6-bc36-5e0d89f1c7b3" (UID: "044dad3e-af09-4af6-bc36-5e0d89f1c7b3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 16:15:03 crc kubenswrapper[4719]: I1009 16:15:03.468624 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/044dad3e-af09-4af6-bc36-5e0d89f1c7b3-kube-api-access-snbgs" (OuterVolumeSpecName: "kube-api-access-snbgs") pod "044dad3e-af09-4af6-bc36-5e0d89f1c7b3" (UID: "044dad3e-af09-4af6-bc36-5e0d89f1c7b3"). InnerVolumeSpecName "kube-api-access-snbgs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 16:15:03 crc kubenswrapper[4719]: I1009 16:15:03.564251 4719 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/044dad3e-af09-4af6-bc36-5e0d89f1c7b3-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 09 16:15:03 crc kubenswrapper[4719]: I1009 16:15:03.564565 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snbgs\" (UniqueName: \"kubernetes.io/projected/044dad3e-af09-4af6-bc36-5e0d89f1c7b3-kube-api-access-snbgs\") on node \"crc\" DevicePath \"\"" Oct 09 16:15:03 crc kubenswrapper[4719]: I1009 16:15:03.959476 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333775-ks2hw" event={"ID":"044dad3e-af09-4af6-bc36-5e0d89f1c7b3","Type":"ContainerDied","Data":"21c451288da00628ef6597c252a84ef710f56798cac631ad5f23ed6e19a78a79"} Oct 09 16:15:03 crc kubenswrapper[4719]: I1009 16:15:03.959519 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21c451288da00628ef6597c252a84ef710f56798cac631ad5f23ed6e19a78a79" Oct 09 16:15:03 crc kubenswrapper[4719]: I1009 16:15:03.959644 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333775-ks2hw" Oct 09 16:15:04 crc kubenswrapper[4719]: I1009 16:15:04.413204 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333730-5vlr5"] Oct 09 16:15:04 crc kubenswrapper[4719]: I1009 16:15:04.422325 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333730-5vlr5"] Oct 09 16:15:05 crc kubenswrapper[4719]: I1009 16:15:05.190293 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fd57e39-ad27-4ca6-89c0-01c9278f6c86" path="/var/lib/kubelet/pods/8fd57e39-ad27-4ca6-89c0-01c9278f6c86/volumes" Oct 09 16:15:13 crc kubenswrapper[4719]: I1009 16:15:13.161825 4719 scope.go:117] "RemoveContainer" containerID="e3e34c67e5c773c761040d5a93136e182c5c1d1a01f09e8ee183de43fdb02e6e" Oct 09 16:15:14 crc kubenswrapper[4719]: I1009 16:15:14.055837 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerStarted","Data":"0d408dab1638d77830ba4de58904c6a39c353da082d525becd504533d9c11701"} Oct 09 16:15:27 crc kubenswrapper[4719]: I1009 16:15:27.312623 4719 scope.go:117] "RemoveContainer" containerID="e55d433f1889fccde78745ae4d13a88764e8e2361557eb5a7a815e3990b280a2" Oct 09 16:17:00 crc kubenswrapper[4719]: I1009 16:17:00.285622 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x9fhk"] Oct 09 16:17:00 crc kubenswrapper[4719]: E1009 16:17:00.286688 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="044dad3e-af09-4af6-bc36-5e0d89f1c7b3" containerName="collect-profiles" Oct 09 16:17:00 crc kubenswrapper[4719]: I1009 16:17:00.286703 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="044dad3e-af09-4af6-bc36-5e0d89f1c7b3" 
containerName="collect-profiles" Oct 09 16:17:00 crc kubenswrapper[4719]: I1009 16:17:00.286974 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="044dad3e-af09-4af6-bc36-5e0d89f1c7b3" containerName="collect-profiles" Oct 09 16:17:00 crc kubenswrapper[4719]: I1009 16:17:00.289160 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x9fhk" Oct 09 16:17:00 crc kubenswrapper[4719]: I1009 16:17:00.348214 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x9fhk"] Oct 09 16:17:00 crc kubenswrapper[4719]: I1009 16:17:00.420393 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdab02b0-5ee8-46ee-945d-71b221ed280f-catalog-content\") pod \"redhat-marketplace-x9fhk\" (UID: \"cdab02b0-5ee8-46ee-945d-71b221ed280f\") " pod="openshift-marketplace/redhat-marketplace-x9fhk" Oct 09 16:17:00 crc kubenswrapper[4719]: I1009 16:17:00.420458 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdab02b0-5ee8-46ee-945d-71b221ed280f-utilities\") pod \"redhat-marketplace-x9fhk\" (UID: \"cdab02b0-5ee8-46ee-945d-71b221ed280f\") " pod="openshift-marketplace/redhat-marketplace-x9fhk" Oct 09 16:17:00 crc kubenswrapper[4719]: I1009 16:17:00.420874 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4h6x\" (UniqueName: \"kubernetes.io/projected/cdab02b0-5ee8-46ee-945d-71b221ed280f-kube-api-access-z4h6x\") pod \"redhat-marketplace-x9fhk\" (UID: \"cdab02b0-5ee8-46ee-945d-71b221ed280f\") " pod="openshift-marketplace/redhat-marketplace-x9fhk" Oct 09 16:17:00 crc kubenswrapper[4719]: I1009 16:17:00.523045 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdab02b0-5ee8-46ee-945d-71b221ed280f-catalog-content\") pod \"redhat-marketplace-x9fhk\" (UID: \"cdab02b0-5ee8-46ee-945d-71b221ed280f\") " pod="openshift-marketplace/redhat-marketplace-x9fhk" Oct 09 16:17:00 crc kubenswrapper[4719]: I1009 16:17:00.523127 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdab02b0-5ee8-46ee-945d-71b221ed280f-utilities\") pod \"redhat-marketplace-x9fhk\" (UID: \"cdab02b0-5ee8-46ee-945d-71b221ed280f\") " pod="openshift-marketplace/redhat-marketplace-x9fhk" Oct 09 16:17:00 crc kubenswrapper[4719]: I1009 16:17:00.523306 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4h6x\" (UniqueName: \"kubernetes.io/projected/cdab02b0-5ee8-46ee-945d-71b221ed280f-kube-api-access-z4h6x\") pod \"redhat-marketplace-x9fhk\" (UID: \"cdab02b0-5ee8-46ee-945d-71b221ed280f\") " pod="openshift-marketplace/redhat-marketplace-x9fhk" Oct 09 16:17:00 crc kubenswrapper[4719]: I1009 16:17:00.523633 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdab02b0-5ee8-46ee-945d-71b221ed280f-catalog-content\") pod \"redhat-marketplace-x9fhk\" (UID: \"cdab02b0-5ee8-46ee-945d-71b221ed280f\") " pod="openshift-marketplace/redhat-marketplace-x9fhk" Oct 09 16:17:00 crc kubenswrapper[4719]: I1009 16:17:00.523904 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdab02b0-5ee8-46ee-945d-71b221ed280f-utilities\") pod \"redhat-marketplace-x9fhk\" (UID: \"cdab02b0-5ee8-46ee-945d-71b221ed280f\") " pod="openshift-marketplace/redhat-marketplace-x9fhk" Oct 09 16:17:00 crc kubenswrapper[4719]: I1009 16:17:00.548131 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4h6x\" (UniqueName: 
\"kubernetes.io/projected/cdab02b0-5ee8-46ee-945d-71b221ed280f-kube-api-access-z4h6x\") pod \"redhat-marketplace-x9fhk\" (UID: \"cdab02b0-5ee8-46ee-945d-71b221ed280f\") " pod="openshift-marketplace/redhat-marketplace-x9fhk" Oct 09 16:17:00 crc kubenswrapper[4719]: I1009 16:17:00.660807 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x9fhk" Oct 09 16:17:01 crc kubenswrapper[4719]: I1009 16:17:01.103811 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x9fhk"] Oct 09 16:17:02 crc kubenswrapper[4719]: I1009 16:17:02.062340 4719 generic.go:334] "Generic (PLEG): container finished" podID="cdab02b0-5ee8-46ee-945d-71b221ed280f" containerID="e153a1ab51ba9d6f30cb5f315c7fa7fe43b8f7fba35a38d7188d391fd3a4824e" exitCode=0 Oct 09 16:17:02 crc kubenswrapper[4719]: I1009 16:17:02.062555 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x9fhk" event={"ID":"cdab02b0-5ee8-46ee-945d-71b221ed280f","Type":"ContainerDied","Data":"e153a1ab51ba9d6f30cb5f315c7fa7fe43b8f7fba35a38d7188d391fd3a4824e"} Oct 09 16:17:02 crc kubenswrapper[4719]: I1009 16:17:02.062676 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x9fhk" event={"ID":"cdab02b0-5ee8-46ee-945d-71b221ed280f","Type":"ContainerStarted","Data":"e3c650d0b700196385fcc8913de88b266d32222ed75ac4b57af2c05a68ff696b"} Oct 09 16:17:03 crc kubenswrapper[4719]: E1009 16:17:03.784325 4719 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdab02b0_5ee8_46ee_945d_71b221ed280f.slice/crio-conmon-7711d40b242bd8dd49bb1c36558fc9594efe73c29040f3f64da380c7585ca55f.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdab02b0_5ee8_46ee_945d_71b221ed280f.slice/crio-7711d40b242bd8dd49bb1c36558fc9594efe73c29040f3f64da380c7585ca55f.scope\": RecentStats: unable to find data in memory cache]" Oct 09 16:17:04 crc kubenswrapper[4719]: I1009 16:17:04.079801 4719 generic.go:334] "Generic (PLEG): container finished" podID="cdab02b0-5ee8-46ee-945d-71b221ed280f" containerID="7711d40b242bd8dd49bb1c36558fc9594efe73c29040f3f64da380c7585ca55f" exitCode=0 Oct 09 16:17:04 crc kubenswrapper[4719]: I1009 16:17:04.079866 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x9fhk" event={"ID":"cdab02b0-5ee8-46ee-945d-71b221ed280f","Type":"ContainerDied","Data":"7711d40b242bd8dd49bb1c36558fc9594efe73c29040f3f64da380c7585ca55f"} Oct 09 16:17:05 crc kubenswrapper[4719]: I1009 16:17:05.092125 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x9fhk" event={"ID":"cdab02b0-5ee8-46ee-945d-71b221ed280f","Type":"ContainerStarted","Data":"d479a332a24bc4ad8fb3c99a5f4c8bc99987e8581cc3f841daa78c24584b8fcb"} Oct 09 16:17:05 crc kubenswrapper[4719]: I1009 16:17:05.117679 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x9fhk" podStartSLOduration=2.45518532 podStartE2EDuration="5.117657479s" podCreationTimestamp="2025-10-09 16:17:00 +0000 UTC" firstStartedPulling="2025-10-09 16:17:02.064170991 +0000 UTC m=+3527.573882276" lastFinishedPulling="2025-10-09 16:17:04.72664315 +0000 UTC m=+3530.236354435" observedRunningTime="2025-10-09 16:17:05.107447173 +0000 UTC m=+3530.617158468" watchObservedRunningTime="2025-10-09 16:17:05.117657479 +0000 UTC m=+3530.627368784" Oct 09 16:17:09 crc kubenswrapper[4719]: I1009 16:17:09.813627 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-64gb7"] Oct 09 16:17:09 crc kubenswrapper[4719]: I1009 
16:17:09.816045 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-64gb7" Oct 09 16:17:09 crc kubenswrapper[4719]: I1009 16:17:09.840325 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-64gb7"] Oct 09 16:17:09 crc kubenswrapper[4719]: I1009 16:17:09.913517 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0811dbeb-b3a8-4d50-a4b3-0e85290f57f4-utilities\") pod \"certified-operators-64gb7\" (UID: \"0811dbeb-b3a8-4d50-a4b3-0e85290f57f4\") " pod="openshift-marketplace/certified-operators-64gb7" Oct 09 16:17:09 crc kubenswrapper[4719]: I1009 16:17:09.914155 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99r2s\" (UniqueName: \"kubernetes.io/projected/0811dbeb-b3a8-4d50-a4b3-0e85290f57f4-kube-api-access-99r2s\") pod \"certified-operators-64gb7\" (UID: \"0811dbeb-b3a8-4d50-a4b3-0e85290f57f4\") " pod="openshift-marketplace/certified-operators-64gb7" Oct 09 16:17:09 crc kubenswrapper[4719]: I1009 16:17:09.914281 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0811dbeb-b3a8-4d50-a4b3-0e85290f57f4-catalog-content\") pod \"certified-operators-64gb7\" (UID: \"0811dbeb-b3a8-4d50-a4b3-0e85290f57f4\") " pod="openshift-marketplace/certified-operators-64gb7" Oct 09 16:17:10 crc kubenswrapper[4719]: I1009 16:17:10.016196 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0811dbeb-b3a8-4d50-a4b3-0e85290f57f4-utilities\") pod \"certified-operators-64gb7\" (UID: \"0811dbeb-b3a8-4d50-a4b3-0e85290f57f4\") " pod="openshift-marketplace/certified-operators-64gb7" Oct 09 16:17:10 crc kubenswrapper[4719]: I1009 
16:17:10.016538 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99r2s\" (UniqueName: \"kubernetes.io/projected/0811dbeb-b3a8-4d50-a4b3-0e85290f57f4-kube-api-access-99r2s\") pod \"certified-operators-64gb7\" (UID: \"0811dbeb-b3a8-4d50-a4b3-0e85290f57f4\") " pod="openshift-marketplace/certified-operators-64gb7" Oct 09 16:17:10 crc kubenswrapper[4719]: I1009 16:17:10.016662 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0811dbeb-b3a8-4d50-a4b3-0e85290f57f4-catalog-content\") pod \"certified-operators-64gb7\" (UID: \"0811dbeb-b3a8-4d50-a4b3-0e85290f57f4\") " pod="openshift-marketplace/certified-operators-64gb7" Oct 09 16:17:10 crc kubenswrapper[4719]: I1009 16:17:10.016722 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0811dbeb-b3a8-4d50-a4b3-0e85290f57f4-utilities\") pod \"certified-operators-64gb7\" (UID: \"0811dbeb-b3a8-4d50-a4b3-0e85290f57f4\") " pod="openshift-marketplace/certified-operators-64gb7" Oct 09 16:17:10 crc kubenswrapper[4719]: I1009 16:17:10.017099 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0811dbeb-b3a8-4d50-a4b3-0e85290f57f4-catalog-content\") pod \"certified-operators-64gb7\" (UID: \"0811dbeb-b3a8-4d50-a4b3-0e85290f57f4\") " pod="openshift-marketplace/certified-operators-64gb7" Oct 09 16:17:10 crc kubenswrapper[4719]: I1009 16:17:10.035372 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99r2s\" (UniqueName: \"kubernetes.io/projected/0811dbeb-b3a8-4d50-a4b3-0e85290f57f4-kube-api-access-99r2s\") pod \"certified-operators-64gb7\" (UID: \"0811dbeb-b3a8-4d50-a4b3-0e85290f57f4\") " pod="openshift-marketplace/certified-operators-64gb7" Oct 09 16:17:10 crc kubenswrapper[4719]: I1009 16:17:10.135170 4719 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-64gb7" Oct 09 16:17:10 crc kubenswrapper[4719]: I1009 16:17:10.661721 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x9fhk" Oct 09 16:17:10 crc kubenswrapper[4719]: I1009 16:17:10.662138 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x9fhk" Oct 09 16:17:10 crc kubenswrapper[4719]: I1009 16:17:10.710104 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x9fhk" Oct 09 16:17:10 crc kubenswrapper[4719]: I1009 16:17:10.732018 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-64gb7"] Oct 09 16:17:11 crc kubenswrapper[4719]: I1009 16:17:11.168235 4719 generic.go:334] "Generic (PLEG): container finished" podID="0811dbeb-b3a8-4d50-a4b3-0e85290f57f4" containerID="4b1f6998221606ac8e1347b13395b174c57227040a8db72937c44961662e066d" exitCode=0 Oct 09 16:17:11 crc kubenswrapper[4719]: I1009 16:17:11.174299 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-64gb7" event={"ID":"0811dbeb-b3a8-4d50-a4b3-0e85290f57f4","Type":"ContainerDied","Data":"4b1f6998221606ac8e1347b13395b174c57227040a8db72937c44961662e066d"} Oct 09 16:17:11 crc kubenswrapper[4719]: I1009 16:17:11.174334 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-64gb7" event={"ID":"0811dbeb-b3a8-4d50-a4b3-0e85290f57f4","Type":"ContainerStarted","Data":"d87f8ee6ae0d45d38adda7dd11892e4f0dbf7e6c2932ff6fefad198180a4088d"} Oct 09 16:17:11 crc kubenswrapper[4719]: I1009 16:17:11.226615 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x9fhk" Oct 09 16:17:12 crc kubenswrapper[4719]: 
I1009 16:17:12.989411 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x9fhk"] Oct 09 16:17:13 crc kubenswrapper[4719]: I1009 16:17:13.199907 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-64gb7" event={"ID":"0811dbeb-b3a8-4d50-a4b3-0e85290f57f4","Type":"ContainerStarted","Data":"96bde09a865cccbe6e52c60ec023d2cd49acb5dbd7640d1a362b2e53a957a269"} Oct 09 16:17:14 crc kubenswrapper[4719]: I1009 16:17:14.211200 4719 generic.go:334] "Generic (PLEG): container finished" podID="0811dbeb-b3a8-4d50-a4b3-0e85290f57f4" containerID="96bde09a865cccbe6e52c60ec023d2cd49acb5dbd7640d1a362b2e53a957a269" exitCode=0 Oct 09 16:17:14 crc kubenswrapper[4719]: I1009 16:17:14.211274 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-64gb7" event={"ID":"0811dbeb-b3a8-4d50-a4b3-0e85290f57f4","Type":"ContainerDied","Data":"96bde09a865cccbe6e52c60ec023d2cd49acb5dbd7640d1a362b2e53a957a269"} Oct 09 16:17:14 crc kubenswrapper[4719]: I1009 16:17:14.212512 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x9fhk" podUID="cdab02b0-5ee8-46ee-945d-71b221ed280f" containerName="registry-server" containerID="cri-o://d479a332a24bc4ad8fb3c99a5f4c8bc99987e8581cc3f841daa78c24584b8fcb" gracePeriod=2 Oct 09 16:17:14 crc kubenswrapper[4719]: I1009 16:17:14.690299 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x9fhk" Oct 09 16:17:14 crc kubenswrapper[4719]: I1009 16:17:14.829228 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdab02b0-5ee8-46ee-945d-71b221ed280f-catalog-content\") pod \"cdab02b0-5ee8-46ee-945d-71b221ed280f\" (UID: \"cdab02b0-5ee8-46ee-945d-71b221ed280f\") " Oct 09 16:17:14 crc kubenswrapper[4719]: I1009 16:17:14.829289 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdab02b0-5ee8-46ee-945d-71b221ed280f-utilities\") pod \"cdab02b0-5ee8-46ee-945d-71b221ed280f\" (UID: \"cdab02b0-5ee8-46ee-945d-71b221ed280f\") " Oct 09 16:17:14 crc kubenswrapper[4719]: I1009 16:17:14.829463 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4h6x\" (UniqueName: \"kubernetes.io/projected/cdab02b0-5ee8-46ee-945d-71b221ed280f-kube-api-access-z4h6x\") pod \"cdab02b0-5ee8-46ee-945d-71b221ed280f\" (UID: \"cdab02b0-5ee8-46ee-945d-71b221ed280f\") " Oct 09 16:17:14 crc kubenswrapper[4719]: I1009 16:17:14.831915 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdab02b0-5ee8-46ee-945d-71b221ed280f-utilities" (OuterVolumeSpecName: "utilities") pod "cdab02b0-5ee8-46ee-945d-71b221ed280f" (UID: "cdab02b0-5ee8-46ee-945d-71b221ed280f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:17:14 crc kubenswrapper[4719]: I1009 16:17:14.840283 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdab02b0-5ee8-46ee-945d-71b221ed280f-kube-api-access-z4h6x" (OuterVolumeSpecName: "kube-api-access-z4h6x") pod "cdab02b0-5ee8-46ee-945d-71b221ed280f" (UID: "cdab02b0-5ee8-46ee-945d-71b221ed280f"). InnerVolumeSpecName "kube-api-access-z4h6x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 16:17:14 crc kubenswrapper[4719]: I1009 16:17:14.846620 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdab02b0-5ee8-46ee-945d-71b221ed280f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cdab02b0-5ee8-46ee-945d-71b221ed280f" (UID: "cdab02b0-5ee8-46ee-945d-71b221ed280f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:17:14 crc kubenswrapper[4719]: I1009 16:17:14.946610 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4h6x\" (UniqueName: \"kubernetes.io/projected/cdab02b0-5ee8-46ee-945d-71b221ed280f-kube-api-access-z4h6x\") on node \"crc\" DevicePath \"\"" Oct 09 16:17:14 crc kubenswrapper[4719]: I1009 16:17:14.946955 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdab02b0-5ee8-46ee-945d-71b221ed280f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 16:17:14 crc kubenswrapper[4719]: I1009 16:17:14.946978 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdab02b0-5ee8-46ee-945d-71b221ed280f-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 16:17:15 crc kubenswrapper[4719]: I1009 16:17:15.222338 4719 generic.go:334] "Generic (PLEG): container finished" podID="cdab02b0-5ee8-46ee-945d-71b221ed280f" containerID="d479a332a24bc4ad8fb3c99a5f4c8bc99987e8581cc3f841daa78c24584b8fcb" exitCode=0 Oct 09 16:17:15 crc kubenswrapper[4719]: I1009 16:17:15.222419 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x9fhk" event={"ID":"cdab02b0-5ee8-46ee-945d-71b221ed280f","Type":"ContainerDied","Data":"d479a332a24bc4ad8fb3c99a5f4c8bc99987e8581cc3f841daa78c24584b8fcb"} Oct 09 16:17:15 crc kubenswrapper[4719]: I1009 16:17:15.222449 4719 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-x9fhk" event={"ID":"cdab02b0-5ee8-46ee-945d-71b221ed280f","Type":"ContainerDied","Data":"e3c650d0b700196385fcc8913de88b266d32222ed75ac4b57af2c05a68ff696b"} Oct 09 16:17:15 crc kubenswrapper[4719]: I1009 16:17:15.222470 4719 scope.go:117] "RemoveContainer" containerID="d479a332a24bc4ad8fb3c99a5f4c8bc99987e8581cc3f841daa78c24584b8fcb" Oct 09 16:17:15 crc kubenswrapper[4719]: I1009 16:17:15.222621 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x9fhk" Oct 09 16:17:15 crc kubenswrapper[4719]: I1009 16:17:15.226710 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-64gb7" event={"ID":"0811dbeb-b3a8-4d50-a4b3-0e85290f57f4","Type":"ContainerStarted","Data":"9fa2dee753f86a9d3ca943514f246dfb7060a4f35ce4f5e779721c7a5f900f52"} Oct 09 16:17:15 crc kubenswrapper[4719]: I1009 16:17:15.254629 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x9fhk"] Oct 09 16:17:15 crc kubenswrapper[4719]: I1009 16:17:15.257797 4719 scope.go:117] "RemoveContainer" containerID="7711d40b242bd8dd49bb1c36558fc9594efe73c29040f3f64da380c7585ca55f" Oct 09 16:17:15 crc kubenswrapper[4719]: I1009 16:17:15.264477 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x9fhk"] Oct 09 16:17:15 crc kubenswrapper[4719]: I1009 16:17:15.268772 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-64gb7" podStartSLOduration=2.803789662 podStartE2EDuration="6.268747313s" podCreationTimestamp="2025-10-09 16:17:09 +0000 UTC" firstStartedPulling="2025-10-09 16:17:11.170672762 +0000 UTC m=+3536.680384047" lastFinishedPulling="2025-10-09 16:17:14.635630413 +0000 UTC m=+3540.145341698" observedRunningTime="2025-10-09 16:17:15.260105908 +0000 UTC m=+3540.769817193" 
watchObservedRunningTime="2025-10-09 16:17:15.268747313 +0000 UTC m=+3540.778458598" Oct 09 16:17:15 crc kubenswrapper[4719]: I1009 16:17:15.284034 4719 scope.go:117] "RemoveContainer" containerID="e153a1ab51ba9d6f30cb5f315c7fa7fe43b8f7fba35a38d7188d391fd3a4824e" Oct 09 16:17:15 crc kubenswrapper[4719]: I1009 16:17:15.305316 4719 scope.go:117] "RemoveContainer" containerID="d479a332a24bc4ad8fb3c99a5f4c8bc99987e8581cc3f841daa78c24584b8fcb" Oct 09 16:17:15 crc kubenswrapper[4719]: E1009 16:17:15.305837 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d479a332a24bc4ad8fb3c99a5f4c8bc99987e8581cc3f841daa78c24584b8fcb\": container with ID starting with d479a332a24bc4ad8fb3c99a5f4c8bc99987e8581cc3f841daa78c24584b8fcb not found: ID does not exist" containerID="d479a332a24bc4ad8fb3c99a5f4c8bc99987e8581cc3f841daa78c24584b8fcb" Oct 09 16:17:15 crc kubenswrapper[4719]: I1009 16:17:15.305869 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d479a332a24bc4ad8fb3c99a5f4c8bc99987e8581cc3f841daa78c24584b8fcb"} err="failed to get container status \"d479a332a24bc4ad8fb3c99a5f4c8bc99987e8581cc3f841daa78c24584b8fcb\": rpc error: code = NotFound desc = could not find container \"d479a332a24bc4ad8fb3c99a5f4c8bc99987e8581cc3f841daa78c24584b8fcb\": container with ID starting with d479a332a24bc4ad8fb3c99a5f4c8bc99987e8581cc3f841daa78c24584b8fcb not found: ID does not exist" Oct 09 16:17:15 crc kubenswrapper[4719]: I1009 16:17:15.305890 4719 scope.go:117] "RemoveContainer" containerID="7711d40b242bd8dd49bb1c36558fc9594efe73c29040f3f64da380c7585ca55f" Oct 09 16:17:15 crc kubenswrapper[4719]: E1009 16:17:15.306258 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7711d40b242bd8dd49bb1c36558fc9594efe73c29040f3f64da380c7585ca55f\": container with ID starting with 
7711d40b242bd8dd49bb1c36558fc9594efe73c29040f3f64da380c7585ca55f not found: ID does not exist" containerID="7711d40b242bd8dd49bb1c36558fc9594efe73c29040f3f64da380c7585ca55f" Oct 09 16:17:15 crc kubenswrapper[4719]: I1009 16:17:15.306310 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7711d40b242bd8dd49bb1c36558fc9594efe73c29040f3f64da380c7585ca55f"} err="failed to get container status \"7711d40b242bd8dd49bb1c36558fc9594efe73c29040f3f64da380c7585ca55f\": rpc error: code = NotFound desc = could not find container \"7711d40b242bd8dd49bb1c36558fc9594efe73c29040f3f64da380c7585ca55f\": container with ID starting with 7711d40b242bd8dd49bb1c36558fc9594efe73c29040f3f64da380c7585ca55f not found: ID does not exist" Oct 09 16:17:15 crc kubenswrapper[4719]: I1009 16:17:15.306344 4719 scope.go:117] "RemoveContainer" containerID="e153a1ab51ba9d6f30cb5f315c7fa7fe43b8f7fba35a38d7188d391fd3a4824e" Oct 09 16:17:15 crc kubenswrapper[4719]: E1009 16:17:15.306662 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e153a1ab51ba9d6f30cb5f315c7fa7fe43b8f7fba35a38d7188d391fd3a4824e\": container with ID starting with e153a1ab51ba9d6f30cb5f315c7fa7fe43b8f7fba35a38d7188d391fd3a4824e not found: ID does not exist" containerID="e153a1ab51ba9d6f30cb5f315c7fa7fe43b8f7fba35a38d7188d391fd3a4824e" Oct 09 16:17:15 crc kubenswrapper[4719]: I1009 16:17:15.306689 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e153a1ab51ba9d6f30cb5f315c7fa7fe43b8f7fba35a38d7188d391fd3a4824e"} err="failed to get container status \"e153a1ab51ba9d6f30cb5f315c7fa7fe43b8f7fba35a38d7188d391fd3a4824e\": rpc error: code = NotFound desc = could not find container \"e153a1ab51ba9d6f30cb5f315c7fa7fe43b8f7fba35a38d7188d391fd3a4824e\": container with ID starting with e153a1ab51ba9d6f30cb5f315c7fa7fe43b8f7fba35a38d7188d391fd3a4824e not found: ID does not 
exist" Oct 09 16:17:17 crc kubenswrapper[4719]: I1009 16:17:17.172908 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdab02b0-5ee8-46ee-945d-71b221ed280f" path="/var/lib/kubelet/pods/cdab02b0-5ee8-46ee-945d-71b221ed280f/volumes" Oct 09 16:17:20 crc kubenswrapper[4719]: I1009 16:17:20.135846 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-64gb7" Oct 09 16:17:20 crc kubenswrapper[4719]: I1009 16:17:20.136313 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-64gb7" Oct 09 16:17:20 crc kubenswrapper[4719]: I1009 16:17:20.189141 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-64gb7" Oct 09 16:17:20 crc kubenswrapper[4719]: I1009 16:17:20.317584 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-64gb7" Oct 09 16:17:21 crc kubenswrapper[4719]: I1009 16:17:21.390132 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-64gb7"] Oct 09 16:17:22 crc kubenswrapper[4719]: I1009 16:17:22.289819 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-64gb7" podUID="0811dbeb-b3a8-4d50-a4b3-0e85290f57f4" containerName="registry-server" containerID="cri-o://9fa2dee753f86a9d3ca943514f246dfb7060a4f35ce4f5e779721c7a5f900f52" gracePeriod=2 Oct 09 16:17:22 crc kubenswrapper[4719]: I1009 16:17:22.823268 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-64gb7" Oct 09 16:17:22 crc kubenswrapper[4719]: I1009 16:17:22.903588 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99r2s\" (UniqueName: \"kubernetes.io/projected/0811dbeb-b3a8-4d50-a4b3-0e85290f57f4-kube-api-access-99r2s\") pod \"0811dbeb-b3a8-4d50-a4b3-0e85290f57f4\" (UID: \"0811dbeb-b3a8-4d50-a4b3-0e85290f57f4\") " Oct 09 16:17:22 crc kubenswrapper[4719]: I1009 16:17:22.903903 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0811dbeb-b3a8-4d50-a4b3-0e85290f57f4-catalog-content\") pod \"0811dbeb-b3a8-4d50-a4b3-0e85290f57f4\" (UID: \"0811dbeb-b3a8-4d50-a4b3-0e85290f57f4\") " Oct 09 16:17:22 crc kubenswrapper[4719]: I1009 16:17:22.903961 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0811dbeb-b3a8-4d50-a4b3-0e85290f57f4-utilities\") pod \"0811dbeb-b3a8-4d50-a4b3-0e85290f57f4\" (UID: \"0811dbeb-b3a8-4d50-a4b3-0e85290f57f4\") " Oct 09 16:17:22 crc kubenswrapper[4719]: I1009 16:17:22.904843 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0811dbeb-b3a8-4d50-a4b3-0e85290f57f4-utilities" (OuterVolumeSpecName: "utilities") pod "0811dbeb-b3a8-4d50-a4b3-0e85290f57f4" (UID: "0811dbeb-b3a8-4d50-a4b3-0e85290f57f4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:17:22 crc kubenswrapper[4719]: I1009 16:17:22.918628 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0811dbeb-b3a8-4d50-a4b3-0e85290f57f4-kube-api-access-99r2s" (OuterVolumeSpecName: "kube-api-access-99r2s") pod "0811dbeb-b3a8-4d50-a4b3-0e85290f57f4" (UID: "0811dbeb-b3a8-4d50-a4b3-0e85290f57f4"). InnerVolumeSpecName "kube-api-access-99r2s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 16:17:22 crc kubenswrapper[4719]: I1009 16:17:22.947852 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0811dbeb-b3a8-4d50-a4b3-0e85290f57f4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0811dbeb-b3a8-4d50-a4b3-0e85290f57f4" (UID: "0811dbeb-b3a8-4d50-a4b3-0e85290f57f4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:17:23 crc kubenswrapper[4719]: I1009 16:17:23.006040 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99r2s\" (UniqueName: \"kubernetes.io/projected/0811dbeb-b3a8-4d50-a4b3-0e85290f57f4-kube-api-access-99r2s\") on node \"crc\" DevicePath \"\"" Oct 09 16:17:23 crc kubenswrapper[4719]: I1009 16:17:23.006081 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0811dbeb-b3a8-4d50-a4b3-0e85290f57f4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 16:17:23 crc kubenswrapper[4719]: I1009 16:17:23.006093 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0811dbeb-b3a8-4d50-a4b3-0e85290f57f4-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 16:17:23 crc kubenswrapper[4719]: I1009 16:17:23.312794 4719 generic.go:334] "Generic (PLEG): container finished" podID="0811dbeb-b3a8-4d50-a4b3-0e85290f57f4" containerID="9fa2dee753f86a9d3ca943514f246dfb7060a4f35ce4f5e779721c7a5f900f52" exitCode=0 Oct 09 16:17:23 crc kubenswrapper[4719]: I1009 16:17:23.312845 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-64gb7" event={"ID":"0811dbeb-b3a8-4d50-a4b3-0e85290f57f4","Type":"ContainerDied","Data":"9fa2dee753f86a9d3ca943514f246dfb7060a4f35ce4f5e779721c7a5f900f52"} Oct 09 16:17:23 crc kubenswrapper[4719]: I1009 16:17:23.312867 4719 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-64gb7" Oct 09 16:17:23 crc kubenswrapper[4719]: I1009 16:17:23.312909 4719 scope.go:117] "RemoveContainer" containerID="9fa2dee753f86a9d3ca943514f246dfb7060a4f35ce4f5e779721c7a5f900f52" Oct 09 16:17:23 crc kubenswrapper[4719]: I1009 16:17:23.312877 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-64gb7" event={"ID":"0811dbeb-b3a8-4d50-a4b3-0e85290f57f4","Type":"ContainerDied","Data":"d87f8ee6ae0d45d38adda7dd11892e4f0dbf7e6c2932ff6fefad198180a4088d"} Oct 09 16:17:23 crc kubenswrapper[4719]: I1009 16:17:23.340617 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-64gb7"] Oct 09 16:17:23 crc kubenswrapper[4719]: I1009 16:17:23.348867 4719 scope.go:117] "RemoveContainer" containerID="96bde09a865cccbe6e52c60ec023d2cd49acb5dbd7640d1a362b2e53a957a269" Oct 09 16:17:23 crc kubenswrapper[4719]: I1009 16:17:23.372652 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-64gb7"] Oct 09 16:17:23 crc kubenswrapper[4719]: I1009 16:17:23.393197 4719 scope.go:117] "RemoveContainer" containerID="4b1f6998221606ac8e1347b13395b174c57227040a8db72937c44961662e066d" Oct 09 16:17:23 crc kubenswrapper[4719]: I1009 16:17:23.431618 4719 scope.go:117] "RemoveContainer" containerID="9fa2dee753f86a9d3ca943514f246dfb7060a4f35ce4f5e779721c7a5f900f52" Oct 09 16:17:23 crc kubenswrapper[4719]: E1009 16:17:23.432222 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fa2dee753f86a9d3ca943514f246dfb7060a4f35ce4f5e779721c7a5f900f52\": container with ID starting with 9fa2dee753f86a9d3ca943514f246dfb7060a4f35ce4f5e779721c7a5f900f52 not found: ID does not exist" containerID="9fa2dee753f86a9d3ca943514f246dfb7060a4f35ce4f5e779721c7a5f900f52" Oct 09 16:17:23 crc kubenswrapper[4719]: I1009 16:17:23.432313 
4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fa2dee753f86a9d3ca943514f246dfb7060a4f35ce4f5e779721c7a5f900f52"} err="failed to get container status \"9fa2dee753f86a9d3ca943514f246dfb7060a4f35ce4f5e779721c7a5f900f52\": rpc error: code = NotFound desc = could not find container \"9fa2dee753f86a9d3ca943514f246dfb7060a4f35ce4f5e779721c7a5f900f52\": container with ID starting with 9fa2dee753f86a9d3ca943514f246dfb7060a4f35ce4f5e779721c7a5f900f52 not found: ID does not exist" Oct 09 16:17:23 crc kubenswrapper[4719]: I1009 16:17:23.432380 4719 scope.go:117] "RemoveContainer" containerID="96bde09a865cccbe6e52c60ec023d2cd49acb5dbd7640d1a362b2e53a957a269" Oct 09 16:17:23 crc kubenswrapper[4719]: E1009 16:17:23.433006 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96bde09a865cccbe6e52c60ec023d2cd49acb5dbd7640d1a362b2e53a957a269\": container with ID starting with 96bde09a865cccbe6e52c60ec023d2cd49acb5dbd7640d1a362b2e53a957a269 not found: ID does not exist" containerID="96bde09a865cccbe6e52c60ec023d2cd49acb5dbd7640d1a362b2e53a957a269" Oct 09 16:17:23 crc kubenswrapper[4719]: I1009 16:17:23.433104 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96bde09a865cccbe6e52c60ec023d2cd49acb5dbd7640d1a362b2e53a957a269"} err="failed to get container status \"96bde09a865cccbe6e52c60ec023d2cd49acb5dbd7640d1a362b2e53a957a269\": rpc error: code = NotFound desc = could not find container \"96bde09a865cccbe6e52c60ec023d2cd49acb5dbd7640d1a362b2e53a957a269\": container with ID starting with 96bde09a865cccbe6e52c60ec023d2cd49acb5dbd7640d1a362b2e53a957a269 not found: ID does not exist" Oct 09 16:17:23 crc kubenswrapper[4719]: I1009 16:17:23.433185 4719 scope.go:117] "RemoveContainer" containerID="4b1f6998221606ac8e1347b13395b174c57227040a8db72937c44961662e066d" Oct 09 16:17:23 crc kubenswrapper[4719]: E1009 
16:17:23.433588 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b1f6998221606ac8e1347b13395b174c57227040a8db72937c44961662e066d\": container with ID starting with 4b1f6998221606ac8e1347b13395b174c57227040a8db72937c44961662e066d not found: ID does not exist" containerID="4b1f6998221606ac8e1347b13395b174c57227040a8db72937c44961662e066d" Oct 09 16:17:23 crc kubenswrapper[4719]: I1009 16:17:23.433624 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b1f6998221606ac8e1347b13395b174c57227040a8db72937c44961662e066d"} err="failed to get container status \"4b1f6998221606ac8e1347b13395b174c57227040a8db72937c44961662e066d\": rpc error: code = NotFound desc = could not find container \"4b1f6998221606ac8e1347b13395b174c57227040a8db72937c44961662e066d\": container with ID starting with 4b1f6998221606ac8e1347b13395b174c57227040a8db72937c44961662e066d not found: ID does not exist" Oct 09 16:17:24 crc kubenswrapper[4719]: E1009 16:17:24.329510 4719 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdab02b0_5ee8_46ee_945d_71b221ed280f.slice/crio-e3c650d0b700196385fcc8913de88b266d32222ed75ac4b57af2c05a68ff696b\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdab02b0_5ee8_46ee_945d_71b221ed280f.slice\": RecentStats: unable to find data in memory cache]" Oct 09 16:17:25 crc kubenswrapper[4719]: I1009 16:17:25.173748 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0811dbeb-b3a8-4d50-a4b3-0e85290f57f4" path="/var/lib/kubelet/pods/0811dbeb-b3a8-4d50-a4b3-0e85290f57f4/volumes" Oct 09 16:17:34 crc kubenswrapper[4719]: E1009 16:17:34.584947 4719 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial 
failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdab02b0_5ee8_46ee_945d_71b221ed280f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdab02b0_5ee8_46ee_945d_71b221ed280f.slice/crio-e3c650d0b700196385fcc8913de88b266d32222ed75ac4b57af2c05a68ff696b\": RecentStats: unable to find data in memory cache]" Oct 09 16:17:36 crc kubenswrapper[4719]: I1009 16:17:36.976594 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 16:17:36 crc kubenswrapper[4719]: I1009 16:17:36.977144 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 16:17:44 crc kubenswrapper[4719]: E1009 16:17:44.833964 4719 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdab02b0_5ee8_46ee_945d_71b221ed280f.slice/crio-e3c650d0b700196385fcc8913de88b266d32222ed75ac4b57af2c05a68ff696b\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdab02b0_5ee8_46ee_945d_71b221ed280f.slice\": RecentStats: unable to find data in memory cache]" Oct 09 16:17:55 crc kubenswrapper[4719]: E1009 16:17:55.077793 4719 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdab02b0_5ee8_46ee_945d_71b221ed280f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdab02b0_5ee8_46ee_945d_71b221ed280f.slice/crio-e3c650d0b700196385fcc8913de88b266d32222ed75ac4b57af2c05a68ff696b\": RecentStats: unable to find data in memory cache]" Oct 09 16:18:05 crc kubenswrapper[4719]: E1009 16:18:05.415765 4719 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdab02b0_5ee8_46ee_945d_71b221ed280f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdab02b0_5ee8_46ee_945d_71b221ed280f.slice/crio-e3c650d0b700196385fcc8913de88b266d32222ed75ac4b57af2c05a68ff696b\": RecentStats: unable to find data in memory cache]" Oct 09 16:18:06 crc kubenswrapper[4719]: I1009 16:18:06.976951 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 16:18:06 crc kubenswrapper[4719]: I1009 16:18:06.977821 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 16:18:36 crc kubenswrapper[4719]: I1009 16:18:36.977095 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 16:18:36 crc kubenswrapper[4719]: I1009 16:18:36.977713 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 16:18:36 crc kubenswrapper[4719]: I1009 16:18:36.977764 4719 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" Oct 09 16:18:36 crc kubenswrapper[4719]: I1009 16:18:36.978613 4719 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0d408dab1638d77830ba4de58904c6a39c353da082d525becd504533d9c11701"} pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 16:18:36 crc kubenswrapper[4719]: I1009 16:18:36.978673 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" containerID="cri-o://0d408dab1638d77830ba4de58904c6a39c353da082d525becd504533d9c11701" gracePeriod=600 Oct 09 16:18:37 crc kubenswrapper[4719]: I1009 16:18:37.999446 4719 generic.go:334] "Generic (PLEG): container finished" podID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerID="0d408dab1638d77830ba4de58904c6a39c353da082d525becd504533d9c11701" exitCode=0 Oct 09 16:18:37 crc kubenswrapper[4719]: I1009 16:18:37.999534 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" 
event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerDied","Data":"0d408dab1638d77830ba4de58904c6a39c353da082d525becd504533d9c11701"} Oct 09 16:18:37 crc kubenswrapper[4719]: I1009 16:18:37.999993 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerStarted","Data":"31f3269a588ea5effde5d0917de9295e18690433f2f29283415e4ee95b65702d"} Oct 09 16:18:38 crc kubenswrapper[4719]: I1009 16:18:38.000015 4719 scope.go:117] "RemoveContainer" containerID="e3e34c67e5c773c761040d5a93136e182c5c1d1a01f09e8ee183de43fdb02e6e" Oct 09 16:21:06 crc kubenswrapper[4719]: I1009 16:21:06.976545 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 16:21:06 crc kubenswrapper[4719]: I1009 16:21:06.977143 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 16:21:36 crc kubenswrapper[4719]: I1009 16:21:36.976909 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 16:21:36 crc kubenswrapper[4719]: I1009 16:21:36.979228 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 16:22:06 crc kubenswrapper[4719]: I1009 16:22:06.976716 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 16:22:06 crc kubenswrapper[4719]: I1009 16:22:06.977607 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 16:22:06 crc kubenswrapper[4719]: I1009 16:22:06.977702 4719 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" Oct 09 16:22:06 crc kubenswrapper[4719]: I1009 16:22:06.979413 4719 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"31f3269a588ea5effde5d0917de9295e18690433f2f29283415e4ee95b65702d"} pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 16:22:06 crc kubenswrapper[4719]: I1009 16:22:06.979523 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" containerID="cri-o://31f3269a588ea5effde5d0917de9295e18690433f2f29283415e4ee95b65702d" gracePeriod=600 Oct 09 16:22:07 crc kubenswrapper[4719]: E1009 
16:22:07.109469 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:22:08 crc kubenswrapper[4719]: I1009 16:22:08.021657 4719 generic.go:334] "Generic (PLEG): container finished" podID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerID="31f3269a588ea5effde5d0917de9295e18690433f2f29283415e4ee95b65702d" exitCode=0 Oct 09 16:22:08 crc kubenswrapper[4719]: I1009 16:22:08.021715 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerDied","Data":"31f3269a588ea5effde5d0917de9295e18690433f2f29283415e4ee95b65702d"} Oct 09 16:22:08 crc kubenswrapper[4719]: I1009 16:22:08.022029 4719 scope.go:117] "RemoveContainer" containerID="0d408dab1638d77830ba4de58904c6a39c353da082d525becd504533d9c11701" Oct 09 16:22:08 crc kubenswrapper[4719]: I1009 16:22:08.022769 4719 scope.go:117] "RemoveContainer" containerID="31f3269a588ea5effde5d0917de9295e18690433f2f29283415e4ee95b65702d" Oct 09 16:22:08 crc kubenswrapper[4719]: E1009 16:22:08.023096 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:22:20 crc kubenswrapper[4719]: I1009 16:22:20.162225 4719 scope.go:117] "RemoveContainer" 
containerID="31f3269a588ea5effde5d0917de9295e18690433f2f29283415e4ee95b65702d" Oct 09 16:22:20 crc kubenswrapper[4719]: E1009 16:22:20.163650 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:22:33 crc kubenswrapper[4719]: I1009 16:22:33.161240 4719 scope.go:117] "RemoveContainer" containerID="31f3269a588ea5effde5d0917de9295e18690433f2f29283415e4ee95b65702d" Oct 09 16:22:33 crc kubenswrapper[4719]: E1009 16:22:33.162038 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:22:48 crc kubenswrapper[4719]: I1009 16:22:48.161191 4719 scope.go:117] "RemoveContainer" containerID="31f3269a588ea5effde5d0917de9295e18690433f2f29283415e4ee95b65702d" Oct 09 16:22:48 crc kubenswrapper[4719]: E1009 16:22:48.162290 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:23:02 crc kubenswrapper[4719]: I1009 16:23:02.161085 4719 scope.go:117] 
"RemoveContainer" containerID="31f3269a588ea5effde5d0917de9295e18690433f2f29283415e4ee95b65702d" Oct 09 16:23:02 crc kubenswrapper[4719]: E1009 16:23:02.161835 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:23:17 crc kubenswrapper[4719]: I1009 16:23:17.161282 4719 scope.go:117] "RemoveContainer" containerID="31f3269a588ea5effde5d0917de9295e18690433f2f29283415e4ee95b65702d" Oct 09 16:23:17 crc kubenswrapper[4719]: E1009 16:23:17.162054 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:23:28 crc kubenswrapper[4719]: I1009 16:23:28.163064 4719 scope.go:117] "RemoveContainer" containerID="31f3269a588ea5effde5d0917de9295e18690433f2f29283415e4ee95b65702d" Oct 09 16:23:28 crc kubenswrapper[4719]: E1009 16:23:28.163929 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:23:41 crc kubenswrapper[4719]: I1009 16:23:41.161948 
4719 scope.go:117] "RemoveContainer" containerID="31f3269a588ea5effde5d0917de9295e18690433f2f29283415e4ee95b65702d" Oct 09 16:23:41 crc kubenswrapper[4719]: E1009 16:23:41.162666 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:23:49 crc kubenswrapper[4719]: I1009 16:23:49.888375 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c6rmz"] Oct 09 16:23:49 crc kubenswrapper[4719]: E1009 16:23:49.890793 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0811dbeb-b3a8-4d50-a4b3-0e85290f57f4" containerName="registry-server" Oct 09 16:23:49 crc kubenswrapper[4719]: I1009 16:23:49.890898 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="0811dbeb-b3a8-4d50-a4b3-0e85290f57f4" containerName="registry-server" Oct 09 16:23:49 crc kubenswrapper[4719]: E1009 16:23:49.890993 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdab02b0-5ee8-46ee-945d-71b221ed280f" containerName="registry-server" Oct 09 16:23:49 crc kubenswrapper[4719]: I1009 16:23:49.891062 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdab02b0-5ee8-46ee-945d-71b221ed280f" containerName="registry-server" Oct 09 16:23:49 crc kubenswrapper[4719]: E1009 16:23:49.891132 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0811dbeb-b3a8-4d50-a4b3-0e85290f57f4" containerName="extract-content" Oct 09 16:23:49 crc kubenswrapper[4719]: I1009 16:23:49.891208 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="0811dbeb-b3a8-4d50-a4b3-0e85290f57f4" containerName="extract-content" Oct 09 16:23:49 crc 
kubenswrapper[4719]: E1009 16:23:49.891286 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0811dbeb-b3a8-4d50-a4b3-0e85290f57f4" containerName="extract-utilities" Oct 09 16:23:49 crc kubenswrapper[4719]: I1009 16:23:49.891377 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="0811dbeb-b3a8-4d50-a4b3-0e85290f57f4" containerName="extract-utilities" Oct 09 16:23:49 crc kubenswrapper[4719]: E1009 16:23:49.891468 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdab02b0-5ee8-46ee-945d-71b221ed280f" containerName="extract-utilities" Oct 09 16:23:49 crc kubenswrapper[4719]: I1009 16:23:49.891537 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdab02b0-5ee8-46ee-945d-71b221ed280f" containerName="extract-utilities" Oct 09 16:23:49 crc kubenswrapper[4719]: E1009 16:23:49.891606 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdab02b0-5ee8-46ee-945d-71b221ed280f" containerName="extract-content" Oct 09 16:23:49 crc kubenswrapper[4719]: I1009 16:23:49.891671 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdab02b0-5ee8-46ee-945d-71b221ed280f" containerName="extract-content" Oct 09 16:23:49 crc kubenswrapper[4719]: I1009 16:23:49.892009 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="0811dbeb-b3a8-4d50-a4b3-0e85290f57f4" containerName="registry-server" Oct 09 16:23:49 crc kubenswrapper[4719]: I1009 16:23:49.892106 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdab02b0-5ee8-46ee-945d-71b221ed280f" containerName="registry-server" Oct 09 16:23:49 crc kubenswrapper[4719]: I1009 16:23:49.894063 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c6rmz" Oct 09 16:23:49 crc kubenswrapper[4719]: I1009 16:23:49.899868 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c6rmz"] Oct 09 16:23:49 crc kubenswrapper[4719]: I1009 16:23:49.972055 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7fa8082-4819-4eff-aa72-50dc990d7cae-utilities\") pod \"redhat-operators-c6rmz\" (UID: \"a7fa8082-4819-4eff-aa72-50dc990d7cae\") " pod="openshift-marketplace/redhat-operators-c6rmz" Oct 09 16:23:49 crc kubenswrapper[4719]: I1009 16:23:49.972184 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tpj7\" (UniqueName: \"kubernetes.io/projected/a7fa8082-4819-4eff-aa72-50dc990d7cae-kube-api-access-5tpj7\") pod \"redhat-operators-c6rmz\" (UID: \"a7fa8082-4819-4eff-aa72-50dc990d7cae\") " pod="openshift-marketplace/redhat-operators-c6rmz" Oct 09 16:23:49 crc kubenswrapper[4719]: I1009 16:23:49.972312 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7fa8082-4819-4eff-aa72-50dc990d7cae-catalog-content\") pod \"redhat-operators-c6rmz\" (UID: \"a7fa8082-4819-4eff-aa72-50dc990d7cae\") " pod="openshift-marketplace/redhat-operators-c6rmz" Oct 09 16:23:50 crc kubenswrapper[4719]: I1009 16:23:50.074661 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7fa8082-4819-4eff-aa72-50dc990d7cae-utilities\") pod \"redhat-operators-c6rmz\" (UID: \"a7fa8082-4819-4eff-aa72-50dc990d7cae\") " pod="openshift-marketplace/redhat-operators-c6rmz" Oct 09 16:23:50 crc kubenswrapper[4719]: I1009 16:23:50.074729 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-5tpj7\" (UniqueName: \"kubernetes.io/projected/a7fa8082-4819-4eff-aa72-50dc990d7cae-kube-api-access-5tpj7\") pod \"redhat-operators-c6rmz\" (UID: \"a7fa8082-4819-4eff-aa72-50dc990d7cae\") " pod="openshift-marketplace/redhat-operators-c6rmz" Oct 09 16:23:50 crc kubenswrapper[4719]: I1009 16:23:50.074811 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7fa8082-4819-4eff-aa72-50dc990d7cae-catalog-content\") pod \"redhat-operators-c6rmz\" (UID: \"a7fa8082-4819-4eff-aa72-50dc990d7cae\") " pod="openshift-marketplace/redhat-operators-c6rmz" Oct 09 16:23:50 crc kubenswrapper[4719]: I1009 16:23:50.075516 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7fa8082-4819-4eff-aa72-50dc990d7cae-utilities\") pod \"redhat-operators-c6rmz\" (UID: \"a7fa8082-4819-4eff-aa72-50dc990d7cae\") " pod="openshift-marketplace/redhat-operators-c6rmz" Oct 09 16:23:50 crc kubenswrapper[4719]: I1009 16:23:50.075556 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7fa8082-4819-4eff-aa72-50dc990d7cae-catalog-content\") pod \"redhat-operators-c6rmz\" (UID: \"a7fa8082-4819-4eff-aa72-50dc990d7cae\") " pod="openshift-marketplace/redhat-operators-c6rmz" Oct 09 16:23:50 crc kubenswrapper[4719]: I1009 16:23:50.083041 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fxtl7"] Oct 09 16:23:50 crc kubenswrapper[4719]: I1009 16:23:50.085609 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fxtl7" Oct 09 16:23:50 crc kubenswrapper[4719]: I1009 16:23:50.100071 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fxtl7"] Oct 09 16:23:50 crc kubenswrapper[4719]: I1009 16:23:50.101205 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tpj7\" (UniqueName: \"kubernetes.io/projected/a7fa8082-4819-4eff-aa72-50dc990d7cae-kube-api-access-5tpj7\") pod \"redhat-operators-c6rmz\" (UID: \"a7fa8082-4819-4eff-aa72-50dc990d7cae\") " pod="openshift-marketplace/redhat-operators-c6rmz" Oct 09 16:23:50 crc kubenswrapper[4719]: I1009 16:23:50.177950 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gg8k\" (UniqueName: \"kubernetes.io/projected/627f3657-5b07-4cd7-bb54-7df64de2cd96-kube-api-access-2gg8k\") pod \"community-operators-fxtl7\" (UID: \"627f3657-5b07-4cd7-bb54-7df64de2cd96\") " pod="openshift-marketplace/community-operators-fxtl7" Oct 09 16:23:50 crc kubenswrapper[4719]: I1009 16:23:50.180398 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/627f3657-5b07-4cd7-bb54-7df64de2cd96-utilities\") pod \"community-operators-fxtl7\" (UID: \"627f3657-5b07-4cd7-bb54-7df64de2cd96\") " pod="openshift-marketplace/community-operators-fxtl7" Oct 09 16:23:50 crc kubenswrapper[4719]: I1009 16:23:50.180625 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/627f3657-5b07-4cd7-bb54-7df64de2cd96-catalog-content\") pod \"community-operators-fxtl7\" (UID: \"627f3657-5b07-4cd7-bb54-7df64de2cd96\") " pod="openshift-marketplace/community-operators-fxtl7" Oct 09 16:23:50 crc kubenswrapper[4719]: I1009 16:23:50.237024 4719 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c6rmz" Oct 09 16:23:50 crc kubenswrapper[4719]: I1009 16:23:50.283000 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gg8k\" (UniqueName: \"kubernetes.io/projected/627f3657-5b07-4cd7-bb54-7df64de2cd96-kube-api-access-2gg8k\") pod \"community-operators-fxtl7\" (UID: \"627f3657-5b07-4cd7-bb54-7df64de2cd96\") " pod="openshift-marketplace/community-operators-fxtl7" Oct 09 16:23:50 crc kubenswrapper[4719]: I1009 16:23:50.283060 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/627f3657-5b07-4cd7-bb54-7df64de2cd96-utilities\") pod \"community-operators-fxtl7\" (UID: \"627f3657-5b07-4cd7-bb54-7df64de2cd96\") " pod="openshift-marketplace/community-operators-fxtl7" Oct 09 16:23:50 crc kubenswrapper[4719]: I1009 16:23:50.283095 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/627f3657-5b07-4cd7-bb54-7df64de2cd96-catalog-content\") pod \"community-operators-fxtl7\" (UID: \"627f3657-5b07-4cd7-bb54-7df64de2cd96\") " pod="openshift-marketplace/community-operators-fxtl7" Oct 09 16:23:50 crc kubenswrapper[4719]: I1009 16:23:50.283748 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/627f3657-5b07-4cd7-bb54-7df64de2cd96-utilities\") pod \"community-operators-fxtl7\" (UID: \"627f3657-5b07-4cd7-bb54-7df64de2cd96\") " pod="openshift-marketplace/community-operators-fxtl7" Oct 09 16:23:50 crc kubenswrapper[4719]: I1009 16:23:50.284071 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/627f3657-5b07-4cd7-bb54-7df64de2cd96-catalog-content\") pod \"community-operators-fxtl7\" (UID: \"627f3657-5b07-4cd7-bb54-7df64de2cd96\") " 
pod="openshift-marketplace/community-operators-fxtl7" Oct 09 16:23:50 crc kubenswrapper[4719]: I1009 16:23:50.302706 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gg8k\" (UniqueName: \"kubernetes.io/projected/627f3657-5b07-4cd7-bb54-7df64de2cd96-kube-api-access-2gg8k\") pod \"community-operators-fxtl7\" (UID: \"627f3657-5b07-4cd7-bb54-7df64de2cd96\") " pod="openshift-marketplace/community-operators-fxtl7" Oct 09 16:23:50 crc kubenswrapper[4719]: I1009 16:23:50.505836 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fxtl7" Oct 09 16:23:50 crc kubenswrapper[4719]: I1009 16:23:50.856049 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c6rmz"] Oct 09 16:23:51 crc kubenswrapper[4719]: I1009 16:23:51.032089 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c6rmz" event={"ID":"a7fa8082-4819-4eff-aa72-50dc990d7cae","Type":"ContainerStarted","Data":"ac62507acd5b36eae716f261733fce21e64d0d243571bb0bb2ff3ab836a1e9fd"} Oct 09 16:23:51 crc kubenswrapper[4719]: I1009 16:23:51.159268 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fxtl7"] Oct 09 16:23:52 crc kubenswrapper[4719]: I1009 16:23:52.045419 4719 generic.go:334] "Generic (PLEG): container finished" podID="627f3657-5b07-4cd7-bb54-7df64de2cd96" containerID="7ee60b7380a8d1033c9111a0e5226ebc6b742959586face7f167405a0f351b62" exitCode=0 Oct 09 16:23:52 crc kubenswrapper[4719]: I1009 16:23:52.045494 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fxtl7" event={"ID":"627f3657-5b07-4cd7-bb54-7df64de2cd96","Type":"ContainerDied","Data":"7ee60b7380a8d1033c9111a0e5226ebc6b742959586face7f167405a0f351b62"} Oct 09 16:23:52 crc kubenswrapper[4719]: I1009 16:23:52.045786 4719 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-fxtl7" event={"ID":"627f3657-5b07-4cd7-bb54-7df64de2cd96","Type":"ContainerStarted","Data":"8e9b5ad9b9bf27271892b3cf58d6b061f6a0184ed0a6acbaba777008a25da347"} Oct 09 16:23:52 crc kubenswrapper[4719]: I1009 16:23:52.048190 4719 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 16:23:52 crc kubenswrapper[4719]: I1009 16:23:52.048867 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c6rmz" event={"ID":"a7fa8082-4819-4eff-aa72-50dc990d7cae","Type":"ContainerDied","Data":"868ce66dc060e753de901adaad0cbb2e4ab28c74d9c53c293b26896af63f80c2"} Oct 09 16:23:52 crc kubenswrapper[4719]: I1009 16:23:52.048457 4719 generic.go:334] "Generic (PLEG): container finished" podID="a7fa8082-4819-4eff-aa72-50dc990d7cae" containerID="868ce66dc060e753de901adaad0cbb2e4ab28c74d9c53c293b26896af63f80c2" exitCode=0 Oct 09 16:23:53 crc kubenswrapper[4719]: I1009 16:23:53.062243 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c6rmz" event={"ID":"a7fa8082-4819-4eff-aa72-50dc990d7cae","Type":"ContainerStarted","Data":"7b1148e8de0f00297e93bcce4c5688e1639f1d7e3b70a2cad7c5f094a818cd5d"} Oct 09 16:23:53 crc kubenswrapper[4719]: I1009 16:23:53.065072 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fxtl7" event={"ID":"627f3657-5b07-4cd7-bb54-7df64de2cd96","Type":"ContainerStarted","Data":"75bdd38d829c14f6bc0426ee1f6ef8d4731a71aad362f322f18fcf2181c44f3a"} Oct 09 16:23:54 crc kubenswrapper[4719]: I1009 16:23:54.162590 4719 scope.go:117] "RemoveContainer" containerID="31f3269a588ea5effde5d0917de9295e18690433f2f29283415e4ee95b65702d" Oct 09 16:23:54 crc kubenswrapper[4719]: E1009 16:23:54.163439 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:23:56 crc kubenswrapper[4719]: I1009 16:23:56.094278 4719 generic.go:334] "Generic (PLEG): container finished" podID="627f3657-5b07-4cd7-bb54-7df64de2cd96" containerID="75bdd38d829c14f6bc0426ee1f6ef8d4731a71aad362f322f18fcf2181c44f3a" exitCode=0 Oct 09 16:23:56 crc kubenswrapper[4719]: I1009 16:23:56.094340 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fxtl7" event={"ID":"627f3657-5b07-4cd7-bb54-7df64de2cd96","Type":"ContainerDied","Data":"75bdd38d829c14f6bc0426ee1f6ef8d4731a71aad362f322f18fcf2181c44f3a"} Oct 09 16:23:57 crc kubenswrapper[4719]: I1009 16:23:57.108504 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fxtl7" event={"ID":"627f3657-5b07-4cd7-bb54-7df64de2cd96","Type":"ContainerStarted","Data":"6a078255db3208c4dfccda0923f1c1327f5da9350e93f28db52873553ad881c4"} Oct 09 16:23:57 crc kubenswrapper[4719]: I1009 16:23:57.135518 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fxtl7" podStartSLOduration=2.433332456 podStartE2EDuration="7.135502511s" podCreationTimestamp="2025-10-09 16:23:50 +0000 UTC" firstStartedPulling="2025-10-09 16:23:52.047978512 +0000 UTC m=+3937.557689797" lastFinishedPulling="2025-10-09 16:23:56.750148577 +0000 UTC m=+3942.259859852" observedRunningTime="2025-10-09 16:23:57.134097525 +0000 UTC m=+3942.643808820" watchObservedRunningTime="2025-10-09 16:23:57.135502511 +0000 UTC m=+3942.645213786" Oct 09 16:23:59 crc kubenswrapper[4719]: I1009 16:23:59.130452 4719 generic.go:334] "Generic (PLEG): container finished" podID="a7fa8082-4819-4eff-aa72-50dc990d7cae" 
containerID="7b1148e8de0f00297e93bcce4c5688e1639f1d7e3b70a2cad7c5f094a818cd5d" exitCode=0 Oct 09 16:23:59 crc kubenswrapper[4719]: I1009 16:23:59.130549 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c6rmz" event={"ID":"a7fa8082-4819-4eff-aa72-50dc990d7cae","Type":"ContainerDied","Data":"7b1148e8de0f00297e93bcce4c5688e1639f1d7e3b70a2cad7c5f094a818cd5d"} Oct 09 16:24:00 crc kubenswrapper[4719]: I1009 16:24:00.148936 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c6rmz" event={"ID":"a7fa8082-4819-4eff-aa72-50dc990d7cae","Type":"ContainerStarted","Data":"eac75bde19db701016137e0586e692564e78db076ac1f3196d4c0771b783de74"} Oct 09 16:24:00 crc kubenswrapper[4719]: I1009 16:24:00.178440 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-c6rmz" podStartSLOduration=3.602897046 podStartE2EDuration="11.178409512s" podCreationTimestamp="2025-10-09 16:23:49 +0000 UTC" firstStartedPulling="2025-10-09 16:23:52.049759788 +0000 UTC m=+3937.559471073" lastFinishedPulling="2025-10-09 16:23:59.625272244 +0000 UTC m=+3945.134983539" observedRunningTime="2025-10-09 16:24:00.172148992 +0000 UTC m=+3945.681860277" watchObservedRunningTime="2025-10-09 16:24:00.178409512 +0000 UTC m=+3945.688120807" Oct 09 16:24:00 crc kubenswrapper[4719]: I1009 16:24:00.238068 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-c6rmz" Oct 09 16:24:00 crc kubenswrapper[4719]: I1009 16:24:00.238126 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-c6rmz" Oct 09 16:24:00 crc kubenswrapper[4719]: I1009 16:24:00.507727 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fxtl7" Oct 09 16:24:00 crc kubenswrapper[4719]: I1009 16:24:00.507789 4719 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fxtl7" Oct 09 16:24:01 crc kubenswrapper[4719]: I1009 16:24:01.288338 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-c6rmz" podUID="a7fa8082-4819-4eff-aa72-50dc990d7cae" containerName="registry-server" probeResult="failure" output=< Oct 09 16:24:01 crc kubenswrapper[4719]: timeout: failed to connect service ":50051" within 1s Oct 09 16:24:01 crc kubenswrapper[4719]: > Oct 09 16:24:01 crc kubenswrapper[4719]: I1009 16:24:01.561296 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-fxtl7" podUID="627f3657-5b07-4cd7-bb54-7df64de2cd96" containerName="registry-server" probeResult="failure" output=< Oct 09 16:24:01 crc kubenswrapper[4719]: timeout: failed to connect service ":50051" within 1s Oct 09 16:24:01 crc kubenswrapper[4719]: > Oct 09 16:24:06 crc kubenswrapper[4719]: I1009 16:24:06.160982 4719 scope.go:117] "RemoveContainer" containerID="31f3269a588ea5effde5d0917de9295e18690433f2f29283415e4ee95b65702d" Oct 09 16:24:06 crc kubenswrapper[4719]: E1009 16:24:06.161815 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:24:10 crc kubenswrapper[4719]: I1009 16:24:10.558788 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fxtl7" Oct 09 16:24:10 crc kubenswrapper[4719]: I1009 16:24:10.625914 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-fxtl7" Oct 09 16:24:10 crc kubenswrapper[4719]: I1009 16:24:10.805479 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fxtl7"] Oct 09 16:24:11 crc kubenswrapper[4719]: I1009 16:24:11.285014 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-c6rmz" podUID="a7fa8082-4819-4eff-aa72-50dc990d7cae" containerName="registry-server" probeResult="failure" output=< Oct 09 16:24:11 crc kubenswrapper[4719]: timeout: failed to connect service ":50051" within 1s Oct 09 16:24:11 crc kubenswrapper[4719]: > Oct 09 16:24:12 crc kubenswrapper[4719]: I1009 16:24:12.294231 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fxtl7" podUID="627f3657-5b07-4cd7-bb54-7df64de2cd96" containerName="registry-server" containerID="cri-o://6a078255db3208c4dfccda0923f1c1327f5da9350e93f28db52873553ad881c4" gracePeriod=2 Oct 09 16:24:12 crc kubenswrapper[4719]: I1009 16:24:12.736865 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fxtl7" Oct 09 16:24:12 crc kubenswrapper[4719]: I1009 16:24:12.855679 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/627f3657-5b07-4cd7-bb54-7df64de2cd96-catalog-content\") pod \"627f3657-5b07-4cd7-bb54-7df64de2cd96\" (UID: \"627f3657-5b07-4cd7-bb54-7df64de2cd96\") " Oct 09 16:24:12 crc kubenswrapper[4719]: I1009 16:24:12.855917 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/627f3657-5b07-4cd7-bb54-7df64de2cd96-utilities\") pod \"627f3657-5b07-4cd7-bb54-7df64de2cd96\" (UID: \"627f3657-5b07-4cd7-bb54-7df64de2cd96\") " Oct 09 16:24:12 crc kubenswrapper[4719]: I1009 16:24:12.856035 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gg8k\" (UniqueName: \"kubernetes.io/projected/627f3657-5b07-4cd7-bb54-7df64de2cd96-kube-api-access-2gg8k\") pod \"627f3657-5b07-4cd7-bb54-7df64de2cd96\" (UID: \"627f3657-5b07-4cd7-bb54-7df64de2cd96\") " Oct 09 16:24:12 crc kubenswrapper[4719]: I1009 16:24:12.856605 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/627f3657-5b07-4cd7-bb54-7df64de2cd96-utilities" (OuterVolumeSpecName: "utilities") pod "627f3657-5b07-4cd7-bb54-7df64de2cd96" (UID: "627f3657-5b07-4cd7-bb54-7df64de2cd96"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:24:12 crc kubenswrapper[4719]: I1009 16:24:12.856950 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/627f3657-5b07-4cd7-bb54-7df64de2cd96-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 16:24:12 crc kubenswrapper[4719]: I1009 16:24:12.864614 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/627f3657-5b07-4cd7-bb54-7df64de2cd96-kube-api-access-2gg8k" (OuterVolumeSpecName: "kube-api-access-2gg8k") pod "627f3657-5b07-4cd7-bb54-7df64de2cd96" (UID: "627f3657-5b07-4cd7-bb54-7df64de2cd96"). InnerVolumeSpecName "kube-api-access-2gg8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 16:24:12 crc kubenswrapper[4719]: I1009 16:24:12.912369 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/627f3657-5b07-4cd7-bb54-7df64de2cd96-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "627f3657-5b07-4cd7-bb54-7df64de2cd96" (UID: "627f3657-5b07-4cd7-bb54-7df64de2cd96"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:24:12 crc kubenswrapper[4719]: I1009 16:24:12.961938 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gg8k\" (UniqueName: \"kubernetes.io/projected/627f3657-5b07-4cd7-bb54-7df64de2cd96-kube-api-access-2gg8k\") on node \"crc\" DevicePath \"\"" Oct 09 16:24:12 crc kubenswrapper[4719]: I1009 16:24:12.961992 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/627f3657-5b07-4cd7-bb54-7df64de2cd96-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 16:24:13 crc kubenswrapper[4719]: I1009 16:24:13.306784 4719 generic.go:334] "Generic (PLEG): container finished" podID="627f3657-5b07-4cd7-bb54-7df64de2cd96" containerID="6a078255db3208c4dfccda0923f1c1327f5da9350e93f28db52873553ad881c4" exitCode=0 Oct 09 16:24:13 crc kubenswrapper[4719]: I1009 16:24:13.306830 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fxtl7" event={"ID":"627f3657-5b07-4cd7-bb54-7df64de2cd96","Type":"ContainerDied","Data":"6a078255db3208c4dfccda0923f1c1327f5da9350e93f28db52873553ad881c4"} Oct 09 16:24:13 crc kubenswrapper[4719]: I1009 16:24:13.307160 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fxtl7" event={"ID":"627f3657-5b07-4cd7-bb54-7df64de2cd96","Type":"ContainerDied","Data":"8e9b5ad9b9bf27271892b3cf58d6b061f6a0184ed0a6acbaba777008a25da347"} Oct 09 16:24:13 crc kubenswrapper[4719]: I1009 16:24:13.306889 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fxtl7" Oct 09 16:24:13 crc kubenswrapper[4719]: I1009 16:24:13.307422 4719 scope.go:117] "RemoveContainer" containerID="6a078255db3208c4dfccda0923f1c1327f5da9350e93f28db52873553ad881c4" Oct 09 16:24:13 crc kubenswrapper[4719]: I1009 16:24:13.335292 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fxtl7"] Oct 09 16:24:13 crc kubenswrapper[4719]: I1009 16:24:13.347063 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fxtl7"] Oct 09 16:24:13 crc kubenswrapper[4719]: I1009 16:24:13.353326 4719 scope.go:117] "RemoveContainer" containerID="75bdd38d829c14f6bc0426ee1f6ef8d4731a71aad362f322f18fcf2181c44f3a" Oct 09 16:24:13 crc kubenswrapper[4719]: I1009 16:24:13.382667 4719 scope.go:117] "RemoveContainer" containerID="7ee60b7380a8d1033c9111a0e5226ebc6b742959586face7f167405a0f351b62" Oct 09 16:24:13 crc kubenswrapper[4719]: I1009 16:24:13.904593 4719 scope.go:117] "RemoveContainer" containerID="6a078255db3208c4dfccda0923f1c1327f5da9350e93f28db52873553ad881c4" Oct 09 16:24:13 crc kubenswrapper[4719]: E1009 16:24:13.905364 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a078255db3208c4dfccda0923f1c1327f5da9350e93f28db52873553ad881c4\": container with ID starting with 6a078255db3208c4dfccda0923f1c1327f5da9350e93f28db52873553ad881c4 not found: ID does not exist" containerID="6a078255db3208c4dfccda0923f1c1327f5da9350e93f28db52873553ad881c4" Oct 09 16:24:13 crc kubenswrapper[4719]: I1009 16:24:13.905429 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a078255db3208c4dfccda0923f1c1327f5da9350e93f28db52873553ad881c4"} err="failed to get container status \"6a078255db3208c4dfccda0923f1c1327f5da9350e93f28db52873553ad881c4\": rpc error: code = NotFound desc = could not find 
container \"6a078255db3208c4dfccda0923f1c1327f5da9350e93f28db52873553ad881c4\": container with ID starting with 6a078255db3208c4dfccda0923f1c1327f5da9350e93f28db52873553ad881c4 not found: ID does not exist" Oct 09 16:24:13 crc kubenswrapper[4719]: I1009 16:24:13.905463 4719 scope.go:117] "RemoveContainer" containerID="75bdd38d829c14f6bc0426ee1f6ef8d4731a71aad362f322f18fcf2181c44f3a" Oct 09 16:24:13 crc kubenswrapper[4719]: E1009 16:24:13.905822 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75bdd38d829c14f6bc0426ee1f6ef8d4731a71aad362f322f18fcf2181c44f3a\": container with ID starting with 75bdd38d829c14f6bc0426ee1f6ef8d4731a71aad362f322f18fcf2181c44f3a not found: ID does not exist" containerID="75bdd38d829c14f6bc0426ee1f6ef8d4731a71aad362f322f18fcf2181c44f3a" Oct 09 16:24:13 crc kubenswrapper[4719]: I1009 16:24:13.905930 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75bdd38d829c14f6bc0426ee1f6ef8d4731a71aad362f322f18fcf2181c44f3a"} err="failed to get container status \"75bdd38d829c14f6bc0426ee1f6ef8d4731a71aad362f322f18fcf2181c44f3a\": rpc error: code = NotFound desc = could not find container \"75bdd38d829c14f6bc0426ee1f6ef8d4731a71aad362f322f18fcf2181c44f3a\": container with ID starting with 75bdd38d829c14f6bc0426ee1f6ef8d4731a71aad362f322f18fcf2181c44f3a not found: ID does not exist" Oct 09 16:24:13 crc kubenswrapper[4719]: I1009 16:24:13.906219 4719 scope.go:117] "RemoveContainer" containerID="7ee60b7380a8d1033c9111a0e5226ebc6b742959586face7f167405a0f351b62" Oct 09 16:24:13 crc kubenswrapper[4719]: E1009 16:24:13.906676 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ee60b7380a8d1033c9111a0e5226ebc6b742959586face7f167405a0f351b62\": container with ID starting with 7ee60b7380a8d1033c9111a0e5226ebc6b742959586face7f167405a0f351b62 not found: ID does 
not exist" containerID="7ee60b7380a8d1033c9111a0e5226ebc6b742959586face7f167405a0f351b62" Oct 09 16:24:13 crc kubenswrapper[4719]: I1009 16:24:13.906705 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ee60b7380a8d1033c9111a0e5226ebc6b742959586face7f167405a0f351b62"} err="failed to get container status \"7ee60b7380a8d1033c9111a0e5226ebc6b742959586face7f167405a0f351b62\": rpc error: code = NotFound desc = could not find container \"7ee60b7380a8d1033c9111a0e5226ebc6b742959586face7f167405a0f351b62\": container with ID starting with 7ee60b7380a8d1033c9111a0e5226ebc6b742959586face7f167405a0f351b62 not found: ID does not exist" Oct 09 16:24:15 crc kubenswrapper[4719]: I1009 16:24:15.176018 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="627f3657-5b07-4cd7-bb54-7df64de2cd96" path="/var/lib/kubelet/pods/627f3657-5b07-4cd7-bb54-7df64de2cd96/volumes" Oct 09 16:24:20 crc kubenswrapper[4719]: I1009 16:24:20.162610 4719 scope.go:117] "RemoveContainer" containerID="31f3269a588ea5effde5d0917de9295e18690433f2f29283415e4ee95b65702d" Oct 09 16:24:20 crc kubenswrapper[4719]: E1009 16:24:20.164226 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:24:20 crc kubenswrapper[4719]: I1009 16:24:20.295839 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-c6rmz" Oct 09 16:24:20 crc kubenswrapper[4719]: I1009 16:24:20.346042 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-c6rmz" Oct 09 16:24:21 
crc kubenswrapper[4719]: I1009 16:24:21.277458 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c6rmz"] Oct 09 16:24:21 crc kubenswrapper[4719]: I1009 16:24:21.384187 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-c6rmz" podUID="a7fa8082-4819-4eff-aa72-50dc990d7cae" containerName="registry-server" containerID="cri-o://eac75bde19db701016137e0586e692564e78db076ac1f3196d4c0771b783de74" gracePeriod=2 Oct 09 16:24:21 crc kubenswrapper[4719]: I1009 16:24:21.873555 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c6rmz" Oct 09 16:24:21 crc kubenswrapper[4719]: I1009 16:24:21.977206 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7fa8082-4819-4eff-aa72-50dc990d7cae-catalog-content\") pod \"a7fa8082-4819-4eff-aa72-50dc990d7cae\" (UID: \"a7fa8082-4819-4eff-aa72-50dc990d7cae\") " Oct 09 16:24:21 crc kubenswrapper[4719]: I1009 16:24:21.977333 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7fa8082-4819-4eff-aa72-50dc990d7cae-utilities\") pod \"a7fa8082-4819-4eff-aa72-50dc990d7cae\" (UID: \"a7fa8082-4819-4eff-aa72-50dc990d7cae\") " Oct 09 16:24:21 crc kubenswrapper[4719]: I1009 16:24:21.977420 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tpj7\" (UniqueName: \"kubernetes.io/projected/a7fa8082-4819-4eff-aa72-50dc990d7cae-kube-api-access-5tpj7\") pod \"a7fa8082-4819-4eff-aa72-50dc990d7cae\" (UID: \"a7fa8082-4819-4eff-aa72-50dc990d7cae\") " Oct 09 16:24:21 crc kubenswrapper[4719]: I1009 16:24:21.978046 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7fa8082-4819-4eff-aa72-50dc990d7cae-utilities" 
(OuterVolumeSpecName: "utilities") pod "a7fa8082-4819-4eff-aa72-50dc990d7cae" (UID: "a7fa8082-4819-4eff-aa72-50dc990d7cae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:24:21 crc kubenswrapper[4719]: I1009 16:24:21.988221 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7fa8082-4819-4eff-aa72-50dc990d7cae-kube-api-access-5tpj7" (OuterVolumeSpecName: "kube-api-access-5tpj7") pod "a7fa8082-4819-4eff-aa72-50dc990d7cae" (UID: "a7fa8082-4819-4eff-aa72-50dc990d7cae"). InnerVolumeSpecName "kube-api-access-5tpj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 16:24:22 crc kubenswrapper[4719]: I1009 16:24:22.068315 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7fa8082-4819-4eff-aa72-50dc990d7cae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a7fa8082-4819-4eff-aa72-50dc990d7cae" (UID: "a7fa8082-4819-4eff-aa72-50dc990d7cae"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:24:22 crc kubenswrapper[4719]: I1009 16:24:22.079404 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7fa8082-4819-4eff-aa72-50dc990d7cae-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 16:24:22 crc kubenswrapper[4719]: I1009 16:24:22.079434 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7fa8082-4819-4eff-aa72-50dc990d7cae-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 16:24:22 crc kubenswrapper[4719]: I1009 16:24:22.079444 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tpj7\" (UniqueName: \"kubernetes.io/projected/a7fa8082-4819-4eff-aa72-50dc990d7cae-kube-api-access-5tpj7\") on node \"crc\" DevicePath \"\"" Oct 09 16:24:22 crc kubenswrapper[4719]: I1009 16:24:22.401204 4719 generic.go:334] "Generic (PLEG): container finished" podID="a7fa8082-4819-4eff-aa72-50dc990d7cae" containerID="eac75bde19db701016137e0586e692564e78db076ac1f3196d4c0771b783de74" exitCode=0 Oct 09 16:24:22 crc kubenswrapper[4719]: I1009 16:24:22.401250 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c6rmz" event={"ID":"a7fa8082-4819-4eff-aa72-50dc990d7cae","Type":"ContainerDied","Data":"eac75bde19db701016137e0586e692564e78db076ac1f3196d4c0771b783de74"} Oct 09 16:24:22 crc kubenswrapper[4719]: I1009 16:24:22.401279 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c6rmz" event={"ID":"a7fa8082-4819-4eff-aa72-50dc990d7cae","Type":"ContainerDied","Data":"ac62507acd5b36eae716f261733fce21e64d0d243571bb0bb2ff3ab836a1e9fd"} Oct 09 16:24:22 crc kubenswrapper[4719]: I1009 16:24:22.401298 4719 scope.go:117] "RemoveContainer" containerID="eac75bde19db701016137e0586e692564e78db076ac1f3196d4c0771b783de74" Oct 09 16:24:22 crc kubenswrapper[4719]: I1009 16:24:22.403125 
4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c6rmz" Oct 09 16:24:22 crc kubenswrapper[4719]: I1009 16:24:22.443125 4719 scope.go:117] "RemoveContainer" containerID="7b1148e8de0f00297e93bcce4c5688e1639f1d7e3b70a2cad7c5f094a818cd5d" Oct 09 16:24:22 crc kubenswrapper[4719]: I1009 16:24:22.447532 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c6rmz"] Oct 09 16:24:22 crc kubenswrapper[4719]: I1009 16:24:22.456597 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-c6rmz"] Oct 09 16:24:22 crc kubenswrapper[4719]: I1009 16:24:22.469744 4719 scope.go:117] "RemoveContainer" containerID="868ce66dc060e753de901adaad0cbb2e4ab28c74d9c53c293b26896af63f80c2" Oct 09 16:24:22 crc kubenswrapper[4719]: I1009 16:24:22.535960 4719 scope.go:117] "RemoveContainer" containerID="eac75bde19db701016137e0586e692564e78db076ac1f3196d4c0771b783de74" Oct 09 16:24:22 crc kubenswrapper[4719]: E1009 16:24:22.536713 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eac75bde19db701016137e0586e692564e78db076ac1f3196d4c0771b783de74\": container with ID starting with eac75bde19db701016137e0586e692564e78db076ac1f3196d4c0771b783de74 not found: ID does not exist" containerID="eac75bde19db701016137e0586e692564e78db076ac1f3196d4c0771b783de74" Oct 09 16:24:22 crc kubenswrapper[4719]: I1009 16:24:22.536761 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eac75bde19db701016137e0586e692564e78db076ac1f3196d4c0771b783de74"} err="failed to get container status \"eac75bde19db701016137e0586e692564e78db076ac1f3196d4c0771b783de74\": rpc error: code = NotFound desc = could not find container \"eac75bde19db701016137e0586e692564e78db076ac1f3196d4c0771b783de74\": container with ID starting with 
eac75bde19db701016137e0586e692564e78db076ac1f3196d4c0771b783de74 not found: ID does not exist" Oct 09 16:24:22 crc kubenswrapper[4719]: I1009 16:24:22.536791 4719 scope.go:117] "RemoveContainer" containerID="7b1148e8de0f00297e93bcce4c5688e1639f1d7e3b70a2cad7c5f094a818cd5d" Oct 09 16:24:22 crc kubenswrapper[4719]: E1009 16:24:22.537562 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b1148e8de0f00297e93bcce4c5688e1639f1d7e3b70a2cad7c5f094a818cd5d\": container with ID starting with 7b1148e8de0f00297e93bcce4c5688e1639f1d7e3b70a2cad7c5f094a818cd5d not found: ID does not exist" containerID="7b1148e8de0f00297e93bcce4c5688e1639f1d7e3b70a2cad7c5f094a818cd5d" Oct 09 16:24:22 crc kubenswrapper[4719]: I1009 16:24:22.537621 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b1148e8de0f00297e93bcce4c5688e1639f1d7e3b70a2cad7c5f094a818cd5d"} err="failed to get container status \"7b1148e8de0f00297e93bcce4c5688e1639f1d7e3b70a2cad7c5f094a818cd5d\": rpc error: code = NotFound desc = could not find container \"7b1148e8de0f00297e93bcce4c5688e1639f1d7e3b70a2cad7c5f094a818cd5d\": container with ID starting with 7b1148e8de0f00297e93bcce4c5688e1639f1d7e3b70a2cad7c5f094a818cd5d not found: ID does not exist" Oct 09 16:24:22 crc kubenswrapper[4719]: I1009 16:24:22.537662 4719 scope.go:117] "RemoveContainer" containerID="868ce66dc060e753de901adaad0cbb2e4ab28c74d9c53c293b26896af63f80c2" Oct 09 16:24:22 crc kubenswrapper[4719]: E1009 16:24:22.538174 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"868ce66dc060e753de901adaad0cbb2e4ab28c74d9c53c293b26896af63f80c2\": container with ID starting with 868ce66dc060e753de901adaad0cbb2e4ab28c74d9c53c293b26896af63f80c2 not found: ID does not exist" containerID="868ce66dc060e753de901adaad0cbb2e4ab28c74d9c53c293b26896af63f80c2" Oct 09 16:24:22 crc 
kubenswrapper[4719]: I1009 16:24:22.538235 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"868ce66dc060e753de901adaad0cbb2e4ab28c74d9c53c293b26896af63f80c2"} err="failed to get container status \"868ce66dc060e753de901adaad0cbb2e4ab28c74d9c53c293b26896af63f80c2\": rpc error: code = NotFound desc = could not find container \"868ce66dc060e753de901adaad0cbb2e4ab28c74d9c53c293b26896af63f80c2\": container with ID starting with 868ce66dc060e753de901adaad0cbb2e4ab28c74d9c53c293b26896af63f80c2 not found: ID does not exist" Oct 09 16:24:23 crc kubenswrapper[4719]: I1009 16:24:23.171929 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7fa8082-4819-4eff-aa72-50dc990d7cae" path="/var/lib/kubelet/pods/a7fa8082-4819-4eff-aa72-50dc990d7cae/volumes" Oct 09 16:24:35 crc kubenswrapper[4719]: I1009 16:24:35.170840 4719 scope.go:117] "RemoveContainer" containerID="31f3269a588ea5effde5d0917de9295e18690433f2f29283415e4ee95b65702d" Oct 09 16:24:35 crc kubenswrapper[4719]: E1009 16:24:35.172611 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:24:48 crc kubenswrapper[4719]: I1009 16:24:48.161524 4719 scope.go:117] "RemoveContainer" containerID="31f3269a588ea5effde5d0917de9295e18690433f2f29283415e4ee95b65702d" Oct 09 16:24:48 crc kubenswrapper[4719]: E1009 16:24:48.163324 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:25:02 crc kubenswrapper[4719]: I1009 16:25:02.161894 4719 scope.go:117] "RemoveContainer" containerID="31f3269a588ea5effde5d0917de9295e18690433f2f29283415e4ee95b65702d" Oct 09 16:25:02 crc kubenswrapper[4719]: E1009 16:25:02.163326 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:25:17 crc kubenswrapper[4719]: I1009 16:25:17.161586 4719 scope.go:117] "RemoveContainer" containerID="31f3269a588ea5effde5d0917de9295e18690433f2f29283415e4ee95b65702d" Oct 09 16:25:17 crc kubenswrapper[4719]: E1009 16:25:17.162227 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:25:29 crc kubenswrapper[4719]: I1009 16:25:29.161881 4719 scope.go:117] "RemoveContainer" containerID="31f3269a588ea5effde5d0917de9295e18690433f2f29283415e4ee95b65702d" Oct 09 16:25:29 crc kubenswrapper[4719]: E1009 16:25:29.162660 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:25:41 crc kubenswrapper[4719]: I1009 16:25:41.161819 4719 scope.go:117] "RemoveContainer" containerID="31f3269a588ea5effde5d0917de9295e18690433f2f29283415e4ee95b65702d" Oct 09 16:25:41 crc kubenswrapper[4719]: E1009 16:25:41.162803 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:25:54 crc kubenswrapper[4719]: I1009 16:25:54.162384 4719 scope.go:117] "RemoveContainer" containerID="31f3269a588ea5effde5d0917de9295e18690433f2f29283415e4ee95b65702d" Oct 09 16:25:54 crc kubenswrapper[4719]: E1009 16:25:54.163738 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:26:08 crc kubenswrapper[4719]: I1009 16:26:08.161791 4719 scope.go:117] "RemoveContainer" containerID="31f3269a588ea5effde5d0917de9295e18690433f2f29283415e4ee95b65702d" Oct 09 16:26:08 crc kubenswrapper[4719]: E1009 16:26:08.162889 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:26:20 crc kubenswrapper[4719]: I1009 16:26:20.162443 4719 scope.go:117] "RemoveContainer" containerID="31f3269a588ea5effde5d0917de9295e18690433f2f29283415e4ee95b65702d" Oct 09 16:26:20 crc kubenswrapper[4719]: E1009 16:26:20.163387 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:26:32 crc kubenswrapper[4719]: I1009 16:26:32.162118 4719 scope.go:117] "RemoveContainer" containerID="31f3269a588ea5effde5d0917de9295e18690433f2f29283415e4ee95b65702d" Oct 09 16:26:32 crc kubenswrapper[4719]: E1009 16:26:32.163029 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:26:44 crc kubenswrapper[4719]: I1009 16:26:44.161515 4719 scope.go:117] "RemoveContainer" containerID="31f3269a588ea5effde5d0917de9295e18690433f2f29283415e4ee95b65702d" Oct 09 16:26:44 crc kubenswrapper[4719]: E1009 16:26:44.162368 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:26:57 crc kubenswrapper[4719]: I1009 16:26:57.161890 4719 scope.go:117] "RemoveContainer" containerID="31f3269a588ea5effde5d0917de9295e18690433f2f29283415e4ee95b65702d" Oct 09 16:26:57 crc kubenswrapper[4719]: E1009 16:26:57.162951 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:27:11 crc kubenswrapper[4719]: I1009 16:27:11.161941 4719 scope.go:117] "RemoveContainer" containerID="31f3269a588ea5effde5d0917de9295e18690433f2f29283415e4ee95b65702d" Oct 09 16:27:12 crc kubenswrapper[4719]: I1009 16:27:12.129819 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerStarted","Data":"5c33815915bf14db268853cbe5a0e13208e21886bddc6ff5675bb91c3610b017"} Oct 09 16:27:21 crc kubenswrapper[4719]: I1009 16:27:21.396590 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xlkll"] Oct 09 16:27:21 crc kubenswrapper[4719]: E1009 16:27:21.397626 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7fa8082-4819-4eff-aa72-50dc990d7cae" containerName="extract-utilities" Oct 09 16:27:21 crc kubenswrapper[4719]: I1009 16:27:21.397643 4719 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a7fa8082-4819-4eff-aa72-50dc990d7cae" containerName="extract-utilities" Oct 09 16:27:21 crc kubenswrapper[4719]: E1009 16:27:21.397672 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="627f3657-5b07-4cd7-bb54-7df64de2cd96" containerName="extract-content" Oct 09 16:27:21 crc kubenswrapper[4719]: I1009 16:27:21.397678 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="627f3657-5b07-4cd7-bb54-7df64de2cd96" containerName="extract-content" Oct 09 16:27:21 crc kubenswrapper[4719]: E1009 16:27:21.397696 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7fa8082-4819-4eff-aa72-50dc990d7cae" containerName="registry-server" Oct 09 16:27:21 crc kubenswrapper[4719]: I1009 16:27:21.397703 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7fa8082-4819-4eff-aa72-50dc990d7cae" containerName="registry-server" Oct 09 16:27:21 crc kubenswrapper[4719]: E1009 16:27:21.397711 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="627f3657-5b07-4cd7-bb54-7df64de2cd96" containerName="registry-server" Oct 09 16:27:21 crc kubenswrapper[4719]: I1009 16:27:21.397718 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="627f3657-5b07-4cd7-bb54-7df64de2cd96" containerName="registry-server" Oct 09 16:27:21 crc kubenswrapper[4719]: E1009 16:27:21.397733 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="627f3657-5b07-4cd7-bb54-7df64de2cd96" containerName="extract-utilities" Oct 09 16:27:21 crc kubenswrapper[4719]: I1009 16:27:21.397740 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="627f3657-5b07-4cd7-bb54-7df64de2cd96" containerName="extract-utilities" Oct 09 16:27:21 crc kubenswrapper[4719]: E1009 16:27:21.397754 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7fa8082-4819-4eff-aa72-50dc990d7cae" containerName="extract-content" Oct 09 16:27:21 crc kubenswrapper[4719]: I1009 16:27:21.397761 4719 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a7fa8082-4819-4eff-aa72-50dc990d7cae" containerName="extract-content" Oct 09 16:27:21 crc kubenswrapper[4719]: I1009 16:27:21.398002 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="627f3657-5b07-4cd7-bb54-7df64de2cd96" containerName="registry-server" Oct 09 16:27:21 crc kubenswrapper[4719]: I1009 16:27:21.398033 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7fa8082-4819-4eff-aa72-50dc990d7cae" containerName="registry-server" Oct 09 16:27:21 crc kubenswrapper[4719]: I1009 16:27:21.399507 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xlkll" Oct 09 16:27:21 crc kubenswrapper[4719]: I1009 16:27:21.409623 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xlkll"] Oct 09 16:27:21 crc kubenswrapper[4719]: I1009 16:27:21.462559 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bqch\" (UniqueName: \"kubernetes.io/projected/203fbd21-ed42-449f-94e8-938408210166-kube-api-access-2bqch\") pod \"certified-operators-xlkll\" (UID: \"203fbd21-ed42-449f-94e8-938408210166\") " pod="openshift-marketplace/certified-operators-xlkll" Oct 09 16:27:21 crc kubenswrapper[4719]: I1009 16:27:21.462772 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/203fbd21-ed42-449f-94e8-938408210166-catalog-content\") pod \"certified-operators-xlkll\" (UID: \"203fbd21-ed42-449f-94e8-938408210166\") " pod="openshift-marketplace/certified-operators-xlkll" Oct 09 16:27:21 crc kubenswrapper[4719]: I1009 16:27:21.462832 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/203fbd21-ed42-449f-94e8-938408210166-utilities\") pod \"certified-operators-xlkll\" (UID: 
\"203fbd21-ed42-449f-94e8-938408210166\") " pod="openshift-marketplace/certified-operators-xlkll" Oct 09 16:27:21 crc kubenswrapper[4719]: I1009 16:27:21.565081 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/203fbd21-ed42-449f-94e8-938408210166-catalog-content\") pod \"certified-operators-xlkll\" (UID: \"203fbd21-ed42-449f-94e8-938408210166\") " pod="openshift-marketplace/certified-operators-xlkll" Oct 09 16:27:21 crc kubenswrapper[4719]: I1009 16:27:21.565138 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/203fbd21-ed42-449f-94e8-938408210166-utilities\") pod \"certified-operators-xlkll\" (UID: \"203fbd21-ed42-449f-94e8-938408210166\") " pod="openshift-marketplace/certified-operators-xlkll" Oct 09 16:27:21 crc kubenswrapper[4719]: I1009 16:27:21.565265 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bqch\" (UniqueName: \"kubernetes.io/projected/203fbd21-ed42-449f-94e8-938408210166-kube-api-access-2bqch\") pod \"certified-operators-xlkll\" (UID: \"203fbd21-ed42-449f-94e8-938408210166\") " pod="openshift-marketplace/certified-operators-xlkll" Oct 09 16:27:21 crc kubenswrapper[4719]: I1009 16:27:21.566026 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/203fbd21-ed42-449f-94e8-938408210166-catalog-content\") pod \"certified-operators-xlkll\" (UID: \"203fbd21-ed42-449f-94e8-938408210166\") " pod="openshift-marketplace/certified-operators-xlkll" Oct 09 16:27:21 crc kubenswrapper[4719]: I1009 16:27:21.566251 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/203fbd21-ed42-449f-94e8-938408210166-utilities\") pod \"certified-operators-xlkll\" (UID: \"203fbd21-ed42-449f-94e8-938408210166\") 
" pod="openshift-marketplace/certified-operators-xlkll" Oct 09 16:27:21 crc kubenswrapper[4719]: I1009 16:27:21.587667 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bqch\" (UniqueName: \"kubernetes.io/projected/203fbd21-ed42-449f-94e8-938408210166-kube-api-access-2bqch\") pod \"certified-operators-xlkll\" (UID: \"203fbd21-ed42-449f-94e8-938408210166\") " pod="openshift-marketplace/certified-operators-xlkll" Oct 09 16:27:21 crc kubenswrapper[4719]: I1009 16:27:21.726118 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xlkll" Oct 09 16:27:22 crc kubenswrapper[4719]: I1009 16:27:22.231971 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xlkll"] Oct 09 16:27:23 crc kubenswrapper[4719]: I1009 16:27:23.268951 4719 generic.go:334] "Generic (PLEG): container finished" podID="203fbd21-ed42-449f-94e8-938408210166" containerID="e3fdbb1d7e3e8fd284b87cc8f05303a6c1248bbc946b0ae5835297fe049f1b02" exitCode=0 Oct 09 16:27:23 crc kubenswrapper[4719]: I1009 16:27:23.269048 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xlkll" event={"ID":"203fbd21-ed42-449f-94e8-938408210166","Type":"ContainerDied","Data":"e3fdbb1d7e3e8fd284b87cc8f05303a6c1248bbc946b0ae5835297fe049f1b02"} Oct 09 16:27:23 crc kubenswrapper[4719]: I1009 16:27:23.269493 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xlkll" event={"ID":"203fbd21-ed42-449f-94e8-938408210166","Type":"ContainerStarted","Data":"f857c78f410fad3f53b352652d3dcf33b10428eb442c74afb3f5bd16aeb7b334"} Oct 09 16:27:24 crc kubenswrapper[4719]: I1009 16:27:24.282305 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xlkll" 
event={"ID":"203fbd21-ed42-449f-94e8-938408210166","Type":"ContainerStarted","Data":"8cc82d93f337de34ed3f156e4f93766080c1ad509d019ac708a0336f85c3b1c5"} Oct 09 16:27:25 crc kubenswrapper[4719]: I1009 16:27:25.292799 4719 generic.go:334] "Generic (PLEG): container finished" podID="203fbd21-ed42-449f-94e8-938408210166" containerID="8cc82d93f337de34ed3f156e4f93766080c1ad509d019ac708a0336f85c3b1c5" exitCode=0 Oct 09 16:27:25 crc kubenswrapper[4719]: I1009 16:27:25.292872 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xlkll" event={"ID":"203fbd21-ed42-449f-94e8-938408210166","Type":"ContainerDied","Data":"8cc82d93f337de34ed3f156e4f93766080c1ad509d019ac708a0336f85c3b1c5"} Oct 09 16:27:26 crc kubenswrapper[4719]: I1009 16:27:26.306989 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xlkll" event={"ID":"203fbd21-ed42-449f-94e8-938408210166","Type":"ContainerStarted","Data":"c7b5bba9c8b52e01c3794fa52341b8ffed36753165000cb3ce1db417f88840db"} Oct 09 16:27:26 crc kubenswrapper[4719]: I1009 16:27:26.334957 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xlkll" podStartSLOduration=2.836782912 podStartE2EDuration="5.334937474s" podCreationTimestamp="2025-10-09 16:27:21 +0000 UTC" firstStartedPulling="2025-10-09 16:27:23.271413606 +0000 UTC m=+4148.781124891" lastFinishedPulling="2025-10-09 16:27:25.769568168 +0000 UTC m=+4151.279279453" observedRunningTime="2025-10-09 16:27:26.329051247 +0000 UTC m=+4151.838762532" watchObservedRunningTime="2025-10-09 16:27:26.334937474 +0000 UTC m=+4151.844648769" Oct 09 16:27:31 crc kubenswrapper[4719]: I1009 16:27:31.727104 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xlkll" Oct 09 16:27:31 crc kubenswrapper[4719]: I1009 16:27:31.728164 4719 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-xlkll" Oct 09 16:27:31 crc kubenswrapper[4719]: I1009 16:27:31.786201 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xlkll" Oct 09 16:27:32 crc kubenswrapper[4719]: I1009 16:27:32.487293 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xlkll" Oct 09 16:27:32 crc kubenswrapper[4719]: I1009 16:27:32.541005 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xlkll"] Oct 09 16:27:34 crc kubenswrapper[4719]: I1009 16:27:34.407600 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xlkll" podUID="203fbd21-ed42-449f-94e8-938408210166" containerName="registry-server" containerID="cri-o://c7b5bba9c8b52e01c3794fa52341b8ffed36753165000cb3ce1db417f88840db" gracePeriod=2 Oct 09 16:27:34 crc kubenswrapper[4719]: I1009 16:27:34.913099 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xlkll" Oct 09 16:27:35 crc kubenswrapper[4719]: I1009 16:27:35.023815 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/203fbd21-ed42-449f-94e8-938408210166-catalog-content\") pod \"203fbd21-ed42-449f-94e8-938408210166\" (UID: \"203fbd21-ed42-449f-94e8-938408210166\") " Oct 09 16:27:35 crc kubenswrapper[4719]: I1009 16:27:35.023920 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/203fbd21-ed42-449f-94e8-938408210166-utilities\") pod \"203fbd21-ed42-449f-94e8-938408210166\" (UID: \"203fbd21-ed42-449f-94e8-938408210166\") " Oct 09 16:27:35 crc kubenswrapper[4719]: I1009 16:27:35.024010 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bqch\" (UniqueName: \"kubernetes.io/projected/203fbd21-ed42-449f-94e8-938408210166-kube-api-access-2bqch\") pod \"203fbd21-ed42-449f-94e8-938408210166\" (UID: \"203fbd21-ed42-449f-94e8-938408210166\") " Oct 09 16:27:35 crc kubenswrapper[4719]: I1009 16:27:35.025108 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/203fbd21-ed42-449f-94e8-938408210166-utilities" (OuterVolumeSpecName: "utilities") pod "203fbd21-ed42-449f-94e8-938408210166" (UID: "203fbd21-ed42-449f-94e8-938408210166"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:27:35 crc kubenswrapper[4719]: I1009 16:27:35.032571 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/203fbd21-ed42-449f-94e8-938408210166-kube-api-access-2bqch" (OuterVolumeSpecName: "kube-api-access-2bqch") pod "203fbd21-ed42-449f-94e8-938408210166" (UID: "203fbd21-ed42-449f-94e8-938408210166"). InnerVolumeSpecName "kube-api-access-2bqch". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 16:27:35 crc kubenswrapper[4719]: I1009 16:27:35.076588 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/203fbd21-ed42-449f-94e8-938408210166-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "203fbd21-ed42-449f-94e8-938408210166" (UID: "203fbd21-ed42-449f-94e8-938408210166"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:27:35 crc kubenswrapper[4719]: I1009 16:27:35.127200 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/203fbd21-ed42-449f-94e8-938408210166-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 16:27:35 crc kubenswrapper[4719]: I1009 16:27:35.127230 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/203fbd21-ed42-449f-94e8-938408210166-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 16:27:35 crc kubenswrapper[4719]: I1009 16:27:35.127243 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bqch\" (UniqueName: \"kubernetes.io/projected/203fbd21-ed42-449f-94e8-938408210166-kube-api-access-2bqch\") on node \"crc\" DevicePath \"\"" Oct 09 16:27:35 crc kubenswrapper[4719]: I1009 16:27:35.417738 4719 generic.go:334] "Generic (PLEG): container finished" podID="203fbd21-ed42-449f-94e8-938408210166" containerID="c7b5bba9c8b52e01c3794fa52341b8ffed36753165000cb3ce1db417f88840db" exitCode=0 Oct 09 16:27:35 crc kubenswrapper[4719]: I1009 16:27:35.417811 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xlkll" Oct 09 16:27:35 crc kubenswrapper[4719]: I1009 16:27:35.417827 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xlkll" event={"ID":"203fbd21-ed42-449f-94e8-938408210166","Type":"ContainerDied","Data":"c7b5bba9c8b52e01c3794fa52341b8ffed36753165000cb3ce1db417f88840db"} Oct 09 16:27:35 crc kubenswrapper[4719]: I1009 16:27:35.418167 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xlkll" event={"ID":"203fbd21-ed42-449f-94e8-938408210166","Type":"ContainerDied","Data":"f857c78f410fad3f53b352652d3dcf33b10428eb442c74afb3f5bd16aeb7b334"} Oct 09 16:27:35 crc kubenswrapper[4719]: I1009 16:27:35.418191 4719 scope.go:117] "RemoveContainer" containerID="c7b5bba9c8b52e01c3794fa52341b8ffed36753165000cb3ce1db417f88840db" Oct 09 16:27:35 crc kubenswrapper[4719]: I1009 16:27:35.440687 4719 scope.go:117] "RemoveContainer" containerID="8cc82d93f337de34ed3f156e4f93766080c1ad509d019ac708a0336f85c3b1c5" Oct 09 16:27:35 crc kubenswrapper[4719]: I1009 16:27:35.446076 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xlkll"] Oct 09 16:27:35 crc kubenswrapper[4719]: I1009 16:27:35.457834 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xlkll"] Oct 09 16:27:35 crc kubenswrapper[4719]: I1009 16:27:35.470091 4719 scope.go:117] "RemoveContainer" containerID="e3fdbb1d7e3e8fd284b87cc8f05303a6c1248bbc946b0ae5835297fe049f1b02" Oct 09 16:27:35 crc kubenswrapper[4719]: I1009 16:27:35.514117 4719 scope.go:117] "RemoveContainer" containerID="c7b5bba9c8b52e01c3794fa52341b8ffed36753165000cb3ce1db417f88840db" Oct 09 16:27:35 crc kubenswrapper[4719]: E1009 16:27:35.514595 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c7b5bba9c8b52e01c3794fa52341b8ffed36753165000cb3ce1db417f88840db\": container with ID starting with c7b5bba9c8b52e01c3794fa52341b8ffed36753165000cb3ce1db417f88840db not found: ID does not exist" containerID="c7b5bba9c8b52e01c3794fa52341b8ffed36753165000cb3ce1db417f88840db" Oct 09 16:27:35 crc kubenswrapper[4719]: I1009 16:27:35.514647 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7b5bba9c8b52e01c3794fa52341b8ffed36753165000cb3ce1db417f88840db"} err="failed to get container status \"c7b5bba9c8b52e01c3794fa52341b8ffed36753165000cb3ce1db417f88840db\": rpc error: code = NotFound desc = could not find container \"c7b5bba9c8b52e01c3794fa52341b8ffed36753165000cb3ce1db417f88840db\": container with ID starting with c7b5bba9c8b52e01c3794fa52341b8ffed36753165000cb3ce1db417f88840db not found: ID does not exist" Oct 09 16:27:35 crc kubenswrapper[4719]: I1009 16:27:35.514678 4719 scope.go:117] "RemoveContainer" containerID="8cc82d93f337de34ed3f156e4f93766080c1ad509d019ac708a0336f85c3b1c5" Oct 09 16:27:35 crc kubenswrapper[4719]: E1009 16:27:35.515034 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cc82d93f337de34ed3f156e4f93766080c1ad509d019ac708a0336f85c3b1c5\": container with ID starting with 8cc82d93f337de34ed3f156e4f93766080c1ad509d019ac708a0336f85c3b1c5 not found: ID does not exist" containerID="8cc82d93f337de34ed3f156e4f93766080c1ad509d019ac708a0336f85c3b1c5" Oct 09 16:27:35 crc kubenswrapper[4719]: I1009 16:27:35.515124 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cc82d93f337de34ed3f156e4f93766080c1ad509d019ac708a0336f85c3b1c5"} err="failed to get container status \"8cc82d93f337de34ed3f156e4f93766080c1ad509d019ac708a0336f85c3b1c5\": rpc error: code = NotFound desc = could not find container \"8cc82d93f337de34ed3f156e4f93766080c1ad509d019ac708a0336f85c3b1c5\": container with ID 
starting with 8cc82d93f337de34ed3f156e4f93766080c1ad509d019ac708a0336f85c3b1c5 not found: ID does not exist" Oct 09 16:27:35 crc kubenswrapper[4719]: I1009 16:27:35.515172 4719 scope.go:117] "RemoveContainer" containerID="e3fdbb1d7e3e8fd284b87cc8f05303a6c1248bbc946b0ae5835297fe049f1b02" Oct 09 16:27:35 crc kubenswrapper[4719]: E1009 16:27:35.515905 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3fdbb1d7e3e8fd284b87cc8f05303a6c1248bbc946b0ae5835297fe049f1b02\": container with ID starting with e3fdbb1d7e3e8fd284b87cc8f05303a6c1248bbc946b0ae5835297fe049f1b02 not found: ID does not exist" containerID="e3fdbb1d7e3e8fd284b87cc8f05303a6c1248bbc946b0ae5835297fe049f1b02" Oct 09 16:27:35 crc kubenswrapper[4719]: I1009 16:27:35.515946 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3fdbb1d7e3e8fd284b87cc8f05303a6c1248bbc946b0ae5835297fe049f1b02"} err="failed to get container status \"e3fdbb1d7e3e8fd284b87cc8f05303a6c1248bbc946b0ae5835297fe049f1b02\": rpc error: code = NotFound desc = could not find container \"e3fdbb1d7e3e8fd284b87cc8f05303a6c1248bbc946b0ae5835297fe049f1b02\": container with ID starting with e3fdbb1d7e3e8fd284b87cc8f05303a6c1248bbc946b0ae5835297fe049f1b02 not found: ID does not exist" Oct 09 16:27:37 crc kubenswrapper[4719]: I1009 16:27:37.175500 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="203fbd21-ed42-449f-94e8-938408210166" path="/var/lib/kubelet/pods/203fbd21-ed42-449f-94e8-938408210166/volumes" Oct 09 16:27:58 crc kubenswrapper[4719]: I1009 16:27:58.486856 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6457k"] Oct 09 16:27:58 crc kubenswrapper[4719]: E1009 16:27:58.487990 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="203fbd21-ed42-449f-94e8-938408210166" containerName="extract-content" Oct 09 16:27:58 crc 
kubenswrapper[4719]: I1009 16:27:58.488007 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="203fbd21-ed42-449f-94e8-938408210166" containerName="extract-content" Oct 09 16:27:58 crc kubenswrapper[4719]: E1009 16:27:58.488026 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="203fbd21-ed42-449f-94e8-938408210166" containerName="extract-utilities" Oct 09 16:27:58 crc kubenswrapper[4719]: I1009 16:27:58.488032 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="203fbd21-ed42-449f-94e8-938408210166" containerName="extract-utilities" Oct 09 16:27:58 crc kubenswrapper[4719]: E1009 16:27:58.488056 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="203fbd21-ed42-449f-94e8-938408210166" containerName="registry-server" Oct 09 16:27:58 crc kubenswrapper[4719]: I1009 16:27:58.488065 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="203fbd21-ed42-449f-94e8-938408210166" containerName="registry-server" Oct 09 16:27:58 crc kubenswrapper[4719]: I1009 16:27:58.488305 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="203fbd21-ed42-449f-94e8-938408210166" containerName="registry-server" Oct 09 16:27:58 crc kubenswrapper[4719]: I1009 16:27:58.490086 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6457k" Oct 09 16:27:58 crc kubenswrapper[4719]: I1009 16:27:58.505272 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6457k"] Oct 09 16:27:58 crc kubenswrapper[4719]: I1009 16:27:58.622055 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29e54f85-6f40-4f97-b774-342a170651df-utilities\") pod \"redhat-marketplace-6457k\" (UID: \"29e54f85-6f40-4f97-b774-342a170651df\") " pod="openshift-marketplace/redhat-marketplace-6457k" Oct 09 16:27:58 crc kubenswrapper[4719]: I1009 16:27:58.622162 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq7g9\" (UniqueName: \"kubernetes.io/projected/29e54f85-6f40-4f97-b774-342a170651df-kube-api-access-cq7g9\") pod \"redhat-marketplace-6457k\" (UID: \"29e54f85-6f40-4f97-b774-342a170651df\") " pod="openshift-marketplace/redhat-marketplace-6457k" Oct 09 16:27:58 crc kubenswrapper[4719]: I1009 16:27:58.622519 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29e54f85-6f40-4f97-b774-342a170651df-catalog-content\") pod \"redhat-marketplace-6457k\" (UID: \"29e54f85-6f40-4f97-b774-342a170651df\") " pod="openshift-marketplace/redhat-marketplace-6457k" Oct 09 16:27:58 crc kubenswrapper[4719]: I1009 16:27:58.724964 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29e54f85-6f40-4f97-b774-342a170651df-utilities\") pod \"redhat-marketplace-6457k\" (UID: \"29e54f85-6f40-4f97-b774-342a170651df\") " pod="openshift-marketplace/redhat-marketplace-6457k" Oct 09 16:27:58 crc kubenswrapper[4719]: I1009 16:27:58.725075 4719 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-cq7g9\" (UniqueName: \"kubernetes.io/projected/29e54f85-6f40-4f97-b774-342a170651df-kube-api-access-cq7g9\") pod \"redhat-marketplace-6457k\" (UID: \"29e54f85-6f40-4f97-b774-342a170651df\") " pod="openshift-marketplace/redhat-marketplace-6457k" Oct 09 16:27:58 crc kubenswrapper[4719]: I1009 16:27:58.725191 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29e54f85-6f40-4f97-b774-342a170651df-catalog-content\") pod \"redhat-marketplace-6457k\" (UID: \"29e54f85-6f40-4f97-b774-342a170651df\") " pod="openshift-marketplace/redhat-marketplace-6457k" Oct 09 16:27:58 crc kubenswrapper[4719]: I1009 16:27:58.725529 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29e54f85-6f40-4f97-b774-342a170651df-utilities\") pod \"redhat-marketplace-6457k\" (UID: \"29e54f85-6f40-4f97-b774-342a170651df\") " pod="openshift-marketplace/redhat-marketplace-6457k" Oct 09 16:27:58 crc kubenswrapper[4719]: I1009 16:27:58.725851 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29e54f85-6f40-4f97-b774-342a170651df-catalog-content\") pod \"redhat-marketplace-6457k\" (UID: \"29e54f85-6f40-4f97-b774-342a170651df\") " pod="openshift-marketplace/redhat-marketplace-6457k" Oct 09 16:27:58 crc kubenswrapper[4719]: I1009 16:27:58.751794 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq7g9\" (UniqueName: \"kubernetes.io/projected/29e54f85-6f40-4f97-b774-342a170651df-kube-api-access-cq7g9\") pod \"redhat-marketplace-6457k\" (UID: \"29e54f85-6f40-4f97-b774-342a170651df\") " pod="openshift-marketplace/redhat-marketplace-6457k" Oct 09 16:27:58 crc kubenswrapper[4719]: I1009 16:27:58.821888 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6457k" Oct 09 16:27:59 crc kubenswrapper[4719]: I1009 16:27:59.324125 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6457k"] Oct 09 16:27:59 crc kubenswrapper[4719]: I1009 16:27:59.690190 4719 generic.go:334] "Generic (PLEG): container finished" podID="29e54f85-6f40-4f97-b774-342a170651df" containerID="1cd5272e723f15b7fbdcf745bce54845074b5d1a5dc019199d480b4228918ff3" exitCode=0 Oct 09 16:27:59 crc kubenswrapper[4719]: I1009 16:27:59.690315 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6457k" event={"ID":"29e54f85-6f40-4f97-b774-342a170651df","Type":"ContainerDied","Data":"1cd5272e723f15b7fbdcf745bce54845074b5d1a5dc019199d480b4228918ff3"} Oct 09 16:27:59 crc kubenswrapper[4719]: I1009 16:27:59.690806 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6457k" event={"ID":"29e54f85-6f40-4f97-b774-342a170651df","Type":"ContainerStarted","Data":"8739a9cd2d192c0c5daec264c6c74b153e39dfa6c6fd21c90109cc85a773b9a8"} Oct 09 16:28:00 crc kubenswrapper[4719]: I1009 16:28:00.704883 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6457k" event={"ID":"29e54f85-6f40-4f97-b774-342a170651df","Type":"ContainerStarted","Data":"78feae67f2b01ac75852b5c7ccbfb56c95295d0d2c792474475c6ab7007dcb92"} Oct 09 16:28:01 crc kubenswrapper[4719]: I1009 16:28:01.716779 4719 generic.go:334] "Generic (PLEG): container finished" podID="29e54f85-6f40-4f97-b774-342a170651df" containerID="78feae67f2b01ac75852b5c7ccbfb56c95295d0d2c792474475c6ab7007dcb92" exitCode=0 Oct 09 16:28:01 crc kubenswrapper[4719]: I1009 16:28:01.716817 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6457k" 
event={"ID":"29e54f85-6f40-4f97-b774-342a170651df","Type":"ContainerDied","Data":"78feae67f2b01ac75852b5c7ccbfb56c95295d0d2c792474475c6ab7007dcb92"} Oct 09 16:28:02 crc kubenswrapper[4719]: I1009 16:28:02.743947 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6457k" event={"ID":"29e54f85-6f40-4f97-b774-342a170651df","Type":"ContainerStarted","Data":"e47e6d9b9664f65a27cce1b5d4b75ba574de7b33754207495fbe6d72680d0f28"} Oct 09 16:28:02 crc kubenswrapper[4719]: I1009 16:28:02.771165 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6457k" podStartSLOduration=2.312535979 podStartE2EDuration="4.771141921s" podCreationTimestamp="2025-10-09 16:27:58 +0000 UTC" firstStartedPulling="2025-10-09 16:27:59.694296589 +0000 UTC m=+4185.204007874" lastFinishedPulling="2025-10-09 16:28:02.152902531 +0000 UTC m=+4187.662613816" observedRunningTime="2025-10-09 16:28:02.763018792 +0000 UTC m=+4188.272730077" watchObservedRunningTime="2025-10-09 16:28:02.771141921 +0000 UTC m=+4188.280853206" Oct 09 16:28:08 crc kubenswrapper[4719]: I1009 16:28:08.822797 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6457k" Oct 09 16:28:08 crc kubenswrapper[4719]: I1009 16:28:08.823817 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6457k" Oct 09 16:28:08 crc kubenswrapper[4719]: I1009 16:28:08.882764 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6457k" Oct 09 16:28:09 crc kubenswrapper[4719]: I1009 16:28:09.871072 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6457k" Oct 09 16:28:09 crc kubenswrapper[4719]: I1009 16:28:09.952081 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-6457k"] Oct 09 16:28:11 crc kubenswrapper[4719]: I1009 16:28:11.844053 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6457k" podUID="29e54f85-6f40-4f97-b774-342a170651df" containerName="registry-server" containerID="cri-o://e47e6d9b9664f65a27cce1b5d4b75ba574de7b33754207495fbe6d72680d0f28" gracePeriod=2 Oct 09 16:28:12 crc kubenswrapper[4719]: I1009 16:28:12.360106 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6457k" Oct 09 16:28:12 crc kubenswrapper[4719]: I1009 16:28:12.448610 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cq7g9\" (UniqueName: \"kubernetes.io/projected/29e54f85-6f40-4f97-b774-342a170651df-kube-api-access-cq7g9\") pod \"29e54f85-6f40-4f97-b774-342a170651df\" (UID: \"29e54f85-6f40-4f97-b774-342a170651df\") " Oct 09 16:28:12 crc kubenswrapper[4719]: I1009 16:28:12.449165 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29e54f85-6f40-4f97-b774-342a170651df-utilities\") pod \"29e54f85-6f40-4f97-b774-342a170651df\" (UID: \"29e54f85-6f40-4f97-b774-342a170651df\") " Oct 09 16:28:12 crc kubenswrapper[4719]: I1009 16:28:12.449824 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29e54f85-6f40-4f97-b774-342a170651df-catalog-content\") pod \"29e54f85-6f40-4f97-b774-342a170651df\" (UID: \"29e54f85-6f40-4f97-b774-342a170651df\") " Oct 09 16:28:12 crc kubenswrapper[4719]: I1009 16:28:12.451619 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29e54f85-6f40-4f97-b774-342a170651df-utilities" (OuterVolumeSpecName: "utilities") pod "29e54f85-6f40-4f97-b774-342a170651df" (UID: 
"29e54f85-6f40-4f97-b774-342a170651df"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:28:12 crc kubenswrapper[4719]: I1009 16:28:12.458909 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29e54f85-6f40-4f97-b774-342a170651df-kube-api-access-cq7g9" (OuterVolumeSpecName: "kube-api-access-cq7g9") pod "29e54f85-6f40-4f97-b774-342a170651df" (UID: "29e54f85-6f40-4f97-b774-342a170651df"). InnerVolumeSpecName "kube-api-access-cq7g9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 16:28:12 crc kubenswrapper[4719]: I1009 16:28:12.464981 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29e54f85-6f40-4f97-b774-342a170651df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "29e54f85-6f40-4f97-b774-342a170651df" (UID: "29e54f85-6f40-4f97-b774-342a170651df"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:28:12 crc kubenswrapper[4719]: I1009 16:28:12.555011 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cq7g9\" (UniqueName: \"kubernetes.io/projected/29e54f85-6f40-4f97-b774-342a170651df-kube-api-access-cq7g9\") on node \"crc\" DevicePath \"\"" Oct 09 16:28:12 crc kubenswrapper[4719]: I1009 16:28:12.555065 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29e54f85-6f40-4f97-b774-342a170651df-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 16:28:12 crc kubenswrapper[4719]: I1009 16:28:12.555079 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29e54f85-6f40-4f97-b774-342a170651df-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 16:28:12 crc kubenswrapper[4719]: I1009 16:28:12.856615 4719 generic.go:334] "Generic (PLEG): container finished" 
podID="29e54f85-6f40-4f97-b774-342a170651df" containerID="e47e6d9b9664f65a27cce1b5d4b75ba574de7b33754207495fbe6d72680d0f28" exitCode=0 Oct 09 16:28:12 crc kubenswrapper[4719]: I1009 16:28:12.856685 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6457k" event={"ID":"29e54f85-6f40-4f97-b774-342a170651df","Type":"ContainerDied","Data":"e47e6d9b9664f65a27cce1b5d4b75ba574de7b33754207495fbe6d72680d0f28"} Oct 09 16:28:12 crc kubenswrapper[4719]: I1009 16:28:12.856709 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6457k" Oct 09 16:28:12 crc kubenswrapper[4719]: I1009 16:28:12.856738 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6457k" event={"ID":"29e54f85-6f40-4f97-b774-342a170651df","Type":"ContainerDied","Data":"8739a9cd2d192c0c5daec264c6c74b153e39dfa6c6fd21c90109cc85a773b9a8"} Oct 09 16:28:12 crc kubenswrapper[4719]: I1009 16:28:12.856765 4719 scope.go:117] "RemoveContainer" containerID="e47e6d9b9664f65a27cce1b5d4b75ba574de7b33754207495fbe6d72680d0f28" Oct 09 16:28:12 crc kubenswrapper[4719]: I1009 16:28:12.879997 4719 scope.go:117] "RemoveContainer" containerID="78feae67f2b01ac75852b5c7ccbfb56c95295d0d2c792474475c6ab7007dcb92" Oct 09 16:28:12 crc kubenswrapper[4719]: I1009 16:28:12.897232 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6457k"] Oct 09 16:28:12 crc kubenswrapper[4719]: I1009 16:28:12.904487 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6457k"] Oct 09 16:28:12 crc kubenswrapper[4719]: I1009 16:28:12.910175 4719 scope.go:117] "RemoveContainer" containerID="1cd5272e723f15b7fbdcf745bce54845074b5d1a5dc019199d480b4228918ff3" Oct 09 16:28:12 crc kubenswrapper[4719]: I1009 16:28:12.966803 4719 scope.go:117] "RemoveContainer" 
containerID="e47e6d9b9664f65a27cce1b5d4b75ba574de7b33754207495fbe6d72680d0f28" Oct 09 16:28:12 crc kubenswrapper[4719]: E1009 16:28:12.967758 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e47e6d9b9664f65a27cce1b5d4b75ba574de7b33754207495fbe6d72680d0f28\": container with ID starting with e47e6d9b9664f65a27cce1b5d4b75ba574de7b33754207495fbe6d72680d0f28 not found: ID does not exist" containerID="e47e6d9b9664f65a27cce1b5d4b75ba574de7b33754207495fbe6d72680d0f28" Oct 09 16:28:12 crc kubenswrapper[4719]: I1009 16:28:12.967869 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e47e6d9b9664f65a27cce1b5d4b75ba574de7b33754207495fbe6d72680d0f28"} err="failed to get container status \"e47e6d9b9664f65a27cce1b5d4b75ba574de7b33754207495fbe6d72680d0f28\": rpc error: code = NotFound desc = could not find container \"e47e6d9b9664f65a27cce1b5d4b75ba574de7b33754207495fbe6d72680d0f28\": container with ID starting with e47e6d9b9664f65a27cce1b5d4b75ba574de7b33754207495fbe6d72680d0f28 not found: ID does not exist" Oct 09 16:28:12 crc kubenswrapper[4719]: I1009 16:28:12.967995 4719 scope.go:117] "RemoveContainer" containerID="78feae67f2b01ac75852b5c7ccbfb56c95295d0d2c792474475c6ab7007dcb92" Oct 09 16:28:12 crc kubenswrapper[4719]: E1009 16:28:12.968884 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78feae67f2b01ac75852b5c7ccbfb56c95295d0d2c792474475c6ab7007dcb92\": container with ID starting with 78feae67f2b01ac75852b5c7ccbfb56c95295d0d2c792474475c6ab7007dcb92 not found: ID does not exist" containerID="78feae67f2b01ac75852b5c7ccbfb56c95295d0d2c792474475c6ab7007dcb92" Oct 09 16:28:12 crc kubenswrapper[4719]: I1009 16:28:12.968948 4719 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"78feae67f2b01ac75852b5c7ccbfb56c95295d0d2c792474475c6ab7007dcb92"} err="failed to get container status \"78feae67f2b01ac75852b5c7ccbfb56c95295d0d2c792474475c6ab7007dcb92\": rpc error: code = NotFound desc = could not find container \"78feae67f2b01ac75852b5c7ccbfb56c95295d0d2c792474475c6ab7007dcb92\": container with ID starting with 78feae67f2b01ac75852b5c7ccbfb56c95295d0d2c792474475c6ab7007dcb92 not found: ID does not exist" Oct 09 16:28:12 crc kubenswrapper[4719]: I1009 16:28:12.968989 4719 scope.go:117] "RemoveContainer" containerID="1cd5272e723f15b7fbdcf745bce54845074b5d1a5dc019199d480b4228918ff3" Oct 09 16:28:12 crc kubenswrapper[4719]: E1009 16:28:12.969535 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cd5272e723f15b7fbdcf745bce54845074b5d1a5dc019199d480b4228918ff3\": container with ID starting with 1cd5272e723f15b7fbdcf745bce54845074b5d1a5dc019199d480b4228918ff3 not found: ID does not exist" containerID="1cd5272e723f15b7fbdcf745bce54845074b5d1a5dc019199d480b4228918ff3" Oct 09 16:28:12 crc kubenswrapper[4719]: I1009 16:28:12.969570 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cd5272e723f15b7fbdcf745bce54845074b5d1a5dc019199d480b4228918ff3"} err="failed to get container status \"1cd5272e723f15b7fbdcf745bce54845074b5d1a5dc019199d480b4228918ff3\": rpc error: code = NotFound desc = could not find container \"1cd5272e723f15b7fbdcf745bce54845074b5d1a5dc019199d480b4228918ff3\": container with ID starting with 1cd5272e723f15b7fbdcf745bce54845074b5d1a5dc019199d480b4228918ff3 not found: ID does not exist" Oct 09 16:28:13 crc kubenswrapper[4719]: I1009 16:28:13.176988 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29e54f85-6f40-4f97-b774-342a170651df" path="/var/lib/kubelet/pods/29e54f85-6f40-4f97-b774-342a170651df/volumes" Oct 09 16:29:36 crc kubenswrapper[4719]: I1009 
16:29:36.976918 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 16:29:36 crc kubenswrapper[4719]: I1009 16:29:36.977520 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 16:30:00 crc kubenswrapper[4719]: I1009 16:30:00.148929 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333790-ggw29"] Oct 09 16:30:00 crc kubenswrapper[4719]: E1009 16:30:00.149926 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29e54f85-6f40-4f97-b774-342a170651df" containerName="registry-server" Oct 09 16:30:00 crc kubenswrapper[4719]: I1009 16:30:00.149941 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="29e54f85-6f40-4f97-b774-342a170651df" containerName="registry-server" Oct 09 16:30:00 crc kubenswrapper[4719]: E1009 16:30:00.149958 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29e54f85-6f40-4f97-b774-342a170651df" containerName="extract-utilities" Oct 09 16:30:00 crc kubenswrapper[4719]: I1009 16:30:00.149965 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="29e54f85-6f40-4f97-b774-342a170651df" containerName="extract-utilities" Oct 09 16:30:00 crc kubenswrapper[4719]: E1009 16:30:00.149989 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29e54f85-6f40-4f97-b774-342a170651df" containerName="extract-content" Oct 09 16:30:00 crc kubenswrapper[4719]: I1009 16:30:00.149995 4719 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="29e54f85-6f40-4f97-b774-342a170651df" containerName="extract-content" Oct 09 16:30:00 crc kubenswrapper[4719]: I1009 16:30:00.150200 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="29e54f85-6f40-4f97-b774-342a170651df" containerName="registry-server" Oct 09 16:30:00 crc kubenswrapper[4719]: I1009 16:30:00.151021 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333790-ggw29" Oct 09 16:30:00 crc kubenswrapper[4719]: I1009 16:30:00.153726 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 09 16:30:00 crc kubenswrapper[4719]: I1009 16:30:00.154317 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 09 16:30:00 crc kubenswrapper[4719]: I1009 16:30:00.161002 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333790-ggw29"] Oct 09 16:30:00 crc kubenswrapper[4719]: I1009 16:30:00.318790 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/267e49d5-666a-4b06-a1f5-fd4e0a01bf5b-secret-volume\") pod \"collect-profiles-29333790-ggw29\" (UID: \"267e49d5-666a-4b06-a1f5-fd4e0a01bf5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333790-ggw29" Oct 09 16:30:00 crc kubenswrapper[4719]: I1009 16:30:00.318831 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/267e49d5-666a-4b06-a1f5-fd4e0a01bf5b-config-volume\") pod \"collect-profiles-29333790-ggw29\" (UID: \"267e49d5-666a-4b06-a1f5-fd4e0a01bf5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333790-ggw29" Oct 09 16:30:00 crc 
kubenswrapper[4719]: I1009 16:30:00.318934 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnpdn\" (UniqueName: \"kubernetes.io/projected/267e49d5-666a-4b06-a1f5-fd4e0a01bf5b-kube-api-access-xnpdn\") pod \"collect-profiles-29333790-ggw29\" (UID: \"267e49d5-666a-4b06-a1f5-fd4e0a01bf5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333790-ggw29" Oct 09 16:30:00 crc kubenswrapper[4719]: I1009 16:30:00.421126 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnpdn\" (UniqueName: \"kubernetes.io/projected/267e49d5-666a-4b06-a1f5-fd4e0a01bf5b-kube-api-access-xnpdn\") pod \"collect-profiles-29333790-ggw29\" (UID: \"267e49d5-666a-4b06-a1f5-fd4e0a01bf5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333790-ggw29" Oct 09 16:30:00 crc kubenswrapper[4719]: I1009 16:30:00.421326 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/267e49d5-666a-4b06-a1f5-fd4e0a01bf5b-secret-volume\") pod \"collect-profiles-29333790-ggw29\" (UID: \"267e49d5-666a-4b06-a1f5-fd4e0a01bf5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333790-ggw29" Oct 09 16:30:00 crc kubenswrapper[4719]: I1009 16:30:00.421364 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/267e49d5-666a-4b06-a1f5-fd4e0a01bf5b-config-volume\") pod \"collect-profiles-29333790-ggw29\" (UID: \"267e49d5-666a-4b06-a1f5-fd4e0a01bf5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333790-ggw29" Oct 09 16:30:00 crc kubenswrapper[4719]: I1009 16:30:00.422200 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/267e49d5-666a-4b06-a1f5-fd4e0a01bf5b-config-volume\") pod \"collect-profiles-29333790-ggw29\" 
(UID: \"267e49d5-666a-4b06-a1f5-fd4e0a01bf5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333790-ggw29" Oct 09 16:30:00 crc kubenswrapper[4719]: I1009 16:30:00.694560 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/267e49d5-666a-4b06-a1f5-fd4e0a01bf5b-secret-volume\") pod \"collect-profiles-29333790-ggw29\" (UID: \"267e49d5-666a-4b06-a1f5-fd4e0a01bf5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333790-ggw29" Oct 09 16:30:00 crc kubenswrapper[4719]: I1009 16:30:00.705184 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnpdn\" (UniqueName: \"kubernetes.io/projected/267e49d5-666a-4b06-a1f5-fd4e0a01bf5b-kube-api-access-xnpdn\") pod \"collect-profiles-29333790-ggw29\" (UID: \"267e49d5-666a-4b06-a1f5-fd4e0a01bf5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333790-ggw29" Oct 09 16:30:00 crc kubenswrapper[4719]: I1009 16:30:00.815101 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333790-ggw29" Oct 09 16:30:01 crc kubenswrapper[4719]: I1009 16:30:01.358019 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333790-ggw29"] Oct 09 16:30:01 crc kubenswrapper[4719]: I1009 16:30:01.901305 4719 generic.go:334] "Generic (PLEG): container finished" podID="267e49d5-666a-4b06-a1f5-fd4e0a01bf5b" containerID="18e5bd3eba3c3ea28cf2516b58f48b827eeb396dd08b07e59f13e89b88334bda" exitCode=0 Oct 09 16:30:01 crc kubenswrapper[4719]: I1009 16:30:01.901379 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333790-ggw29" event={"ID":"267e49d5-666a-4b06-a1f5-fd4e0a01bf5b","Type":"ContainerDied","Data":"18e5bd3eba3c3ea28cf2516b58f48b827eeb396dd08b07e59f13e89b88334bda"} Oct 09 16:30:01 crc kubenswrapper[4719]: I1009 16:30:01.901697 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333790-ggw29" event={"ID":"267e49d5-666a-4b06-a1f5-fd4e0a01bf5b","Type":"ContainerStarted","Data":"d959f9ece7e1a53c5dbdb02cf460a6aa5f195a61c0bd526a9acc0cfa9c9676cc"} Oct 09 16:30:03 crc kubenswrapper[4719]: I1009 16:30:03.301911 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333790-ggw29" Oct 09 16:30:03 crc kubenswrapper[4719]: I1009 16:30:03.402251 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/267e49d5-666a-4b06-a1f5-fd4e0a01bf5b-config-volume\") pod \"267e49d5-666a-4b06-a1f5-fd4e0a01bf5b\" (UID: \"267e49d5-666a-4b06-a1f5-fd4e0a01bf5b\") " Oct 09 16:30:03 crc kubenswrapper[4719]: I1009 16:30:03.402928 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnpdn\" (UniqueName: \"kubernetes.io/projected/267e49d5-666a-4b06-a1f5-fd4e0a01bf5b-kube-api-access-xnpdn\") pod \"267e49d5-666a-4b06-a1f5-fd4e0a01bf5b\" (UID: \"267e49d5-666a-4b06-a1f5-fd4e0a01bf5b\") " Oct 09 16:30:03 crc kubenswrapper[4719]: I1009 16:30:03.403066 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/267e49d5-666a-4b06-a1f5-fd4e0a01bf5b-secret-volume\") pod \"267e49d5-666a-4b06-a1f5-fd4e0a01bf5b\" (UID: \"267e49d5-666a-4b06-a1f5-fd4e0a01bf5b\") " Oct 09 16:30:03 crc kubenswrapper[4719]: I1009 16:30:03.403851 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/267e49d5-666a-4b06-a1f5-fd4e0a01bf5b-config-volume" (OuterVolumeSpecName: "config-volume") pod "267e49d5-666a-4b06-a1f5-fd4e0a01bf5b" (UID: "267e49d5-666a-4b06-a1f5-fd4e0a01bf5b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 16:30:03 crc kubenswrapper[4719]: I1009 16:30:03.408609 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/267e49d5-666a-4b06-a1f5-fd4e0a01bf5b-kube-api-access-xnpdn" (OuterVolumeSpecName: "kube-api-access-xnpdn") pod "267e49d5-666a-4b06-a1f5-fd4e0a01bf5b" (UID: "267e49d5-666a-4b06-a1f5-fd4e0a01bf5b"). 
InnerVolumeSpecName "kube-api-access-xnpdn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 16:30:03 crc kubenswrapper[4719]: I1009 16:30:03.412530 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/267e49d5-666a-4b06-a1f5-fd4e0a01bf5b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "267e49d5-666a-4b06-a1f5-fd4e0a01bf5b" (UID: "267e49d5-666a-4b06-a1f5-fd4e0a01bf5b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 16:30:03 crc kubenswrapper[4719]: I1009 16:30:03.506680 4719 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/267e49d5-666a-4b06-a1f5-fd4e0a01bf5b-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 09 16:30:03 crc kubenswrapper[4719]: I1009 16:30:03.506768 4719 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/267e49d5-666a-4b06-a1f5-fd4e0a01bf5b-config-volume\") on node \"crc\" DevicePath \"\"" Oct 09 16:30:03 crc kubenswrapper[4719]: I1009 16:30:03.506908 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnpdn\" (UniqueName: \"kubernetes.io/projected/267e49d5-666a-4b06-a1f5-fd4e0a01bf5b-kube-api-access-xnpdn\") on node \"crc\" DevicePath \"\"" Oct 09 16:30:03 crc kubenswrapper[4719]: I1009 16:30:03.921001 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333790-ggw29" event={"ID":"267e49d5-666a-4b06-a1f5-fd4e0a01bf5b","Type":"ContainerDied","Data":"d959f9ece7e1a53c5dbdb02cf460a6aa5f195a61c0bd526a9acc0cfa9c9676cc"} Oct 09 16:30:03 crc kubenswrapper[4719]: I1009 16:30:03.921045 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d959f9ece7e1a53c5dbdb02cf460a6aa5f195a61c0bd526a9acc0cfa9c9676cc" Oct 09 16:30:03 crc kubenswrapper[4719]: I1009 16:30:03.921076 4719 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333790-ggw29" Oct 09 16:30:04 crc kubenswrapper[4719]: I1009 16:30:04.379115 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333745-wfzhh"] Oct 09 16:30:04 crc kubenswrapper[4719]: I1009 16:30:04.391190 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333745-wfzhh"] Oct 09 16:30:05 crc kubenswrapper[4719]: I1009 16:30:05.173284 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ff99c39-7c61-46a2-bb84-05cb745323bf" path="/var/lib/kubelet/pods/8ff99c39-7c61-46a2-bb84-05cb745323bf/volumes" Oct 09 16:30:06 crc kubenswrapper[4719]: I1009 16:30:06.976647 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 16:30:06 crc kubenswrapper[4719]: I1009 16:30:06.976708 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 16:30:27 crc kubenswrapper[4719]: I1009 16:30:27.745858 4719 scope.go:117] "RemoveContainer" containerID="8515f28740ec0dc7d4038f5effd2910f167b9b8df00ae6c2177db8c435b574cf" Oct 09 16:30:36 crc kubenswrapper[4719]: I1009 16:30:36.977004 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 16:30:36 crc kubenswrapper[4719]: I1009 16:30:36.977528 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 16:30:36 crc kubenswrapper[4719]: I1009 16:30:36.977581 4719 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" Oct 09 16:30:36 crc kubenswrapper[4719]: I1009 16:30:36.978413 4719 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5c33815915bf14db268853cbe5a0e13208e21886bddc6ff5675bb91c3610b017"} pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 16:30:36 crc kubenswrapper[4719]: I1009 16:30:36.978472 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" containerID="cri-o://5c33815915bf14db268853cbe5a0e13208e21886bddc6ff5675bb91c3610b017" gracePeriod=600 Oct 09 16:30:37 crc kubenswrapper[4719]: I1009 16:30:37.290001 4719 generic.go:334] "Generic (PLEG): container finished" podID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerID="5c33815915bf14db268853cbe5a0e13208e21886bddc6ff5675bb91c3610b017" exitCode=0 Oct 09 16:30:37 crc kubenswrapper[4719]: I1009 16:30:37.290139 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" 
event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerDied","Data":"5c33815915bf14db268853cbe5a0e13208e21886bddc6ff5675bb91c3610b017"} Oct 09 16:30:37 crc kubenswrapper[4719]: I1009 16:30:37.290173 4719 scope.go:117] "RemoveContainer" containerID="31f3269a588ea5effde5d0917de9295e18690433f2f29283415e4ee95b65702d" Oct 09 16:30:38 crc kubenswrapper[4719]: I1009 16:30:38.299965 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerStarted","Data":"ab8e39a1b9738f293da97baf5097a0476e541f0d738db409215643fbcdcb6edb"} Oct 09 16:33:06 crc kubenswrapper[4719]: I1009 16:33:06.976870 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 16:33:06 crc kubenswrapper[4719]: I1009 16:33:06.977635 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 16:33:36 crc kubenswrapper[4719]: I1009 16:33:36.977334 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 16:33:36 crc kubenswrapper[4719]: I1009 16:33:36.978169 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 16:34:06 crc kubenswrapper[4719]: I1009 16:34:06.976172 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 16:34:06 crc kubenswrapper[4719]: I1009 16:34:06.976810 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 16:34:06 crc kubenswrapper[4719]: I1009 16:34:06.976854 4719 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" Oct 09 16:34:06 crc kubenswrapper[4719]: I1009 16:34:06.977561 4719 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ab8e39a1b9738f293da97baf5097a0476e541f0d738db409215643fbcdcb6edb"} pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 16:34:06 crc kubenswrapper[4719]: I1009 16:34:06.977616 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" containerID="cri-o://ab8e39a1b9738f293da97baf5097a0476e541f0d738db409215643fbcdcb6edb" gracePeriod=600 Oct 09 16:34:07 crc kubenswrapper[4719]: E1009 
16:34:07.129892 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:34:07 crc kubenswrapper[4719]: I1009 16:34:07.358814 4719 generic.go:334] "Generic (PLEG): container finished" podID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerID="ab8e39a1b9738f293da97baf5097a0476e541f0d738db409215643fbcdcb6edb" exitCode=0 Oct 09 16:34:07 crc kubenswrapper[4719]: I1009 16:34:07.358856 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerDied","Data":"ab8e39a1b9738f293da97baf5097a0476e541f0d738db409215643fbcdcb6edb"} Oct 09 16:34:07 crc kubenswrapper[4719]: I1009 16:34:07.358889 4719 scope.go:117] "RemoveContainer" containerID="5c33815915bf14db268853cbe5a0e13208e21886bddc6ff5675bb91c3610b017" Oct 09 16:34:07 crc kubenswrapper[4719]: I1009 16:34:07.359697 4719 scope.go:117] "RemoveContainer" containerID="ab8e39a1b9738f293da97baf5097a0476e541f0d738db409215643fbcdcb6edb" Oct 09 16:34:07 crc kubenswrapper[4719]: E1009 16:34:07.360146 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:34:19 crc kubenswrapper[4719]: I1009 16:34:19.161018 4719 scope.go:117] "RemoveContainer" 
containerID="ab8e39a1b9738f293da97baf5097a0476e541f0d738db409215643fbcdcb6edb" Oct 09 16:34:19 crc kubenswrapper[4719]: E1009 16:34:19.162585 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:34:32 crc kubenswrapper[4719]: I1009 16:34:32.545170 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cl2zj"] Oct 09 16:34:32 crc kubenswrapper[4719]: E1009 16:34:32.546301 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="267e49d5-666a-4b06-a1f5-fd4e0a01bf5b" containerName="collect-profiles" Oct 09 16:34:32 crc kubenswrapper[4719]: I1009 16:34:32.546320 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="267e49d5-666a-4b06-a1f5-fd4e0a01bf5b" containerName="collect-profiles" Oct 09 16:34:32 crc kubenswrapper[4719]: I1009 16:34:32.546601 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="267e49d5-666a-4b06-a1f5-fd4e0a01bf5b" containerName="collect-profiles" Oct 09 16:34:32 crc kubenswrapper[4719]: I1009 16:34:32.549563 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cl2zj" Oct 09 16:34:32 crc kubenswrapper[4719]: I1009 16:34:32.559421 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cl2zj"] Oct 09 16:34:32 crc kubenswrapper[4719]: I1009 16:34:32.580010 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0960fc31-442a-4e72-ad99-ab001be39df9-catalog-content\") pod \"redhat-operators-cl2zj\" (UID: \"0960fc31-442a-4e72-ad99-ab001be39df9\") " pod="openshift-marketplace/redhat-operators-cl2zj" Oct 09 16:34:32 crc kubenswrapper[4719]: I1009 16:34:32.580180 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0960fc31-442a-4e72-ad99-ab001be39df9-utilities\") pod \"redhat-operators-cl2zj\" (UID: \"0960fc31-442a-4e72-ad99-ab001be39df9\") " pod="openshift-marketplace/redhat-operators-cl2zj" Oct 09 16:34:32 crc kubenswrapper[4719]: I1009 16:34:32.580269 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjtz5\" (UniqueName: \"kubernetes.io/projected/0960fc31-442a-4e72-ad99-ab001be39df9-kube-api-access-mjtz5\") pod \"redhat-operators-cl2zj\" (UID: \"0960fc31-442a-4e72-ad99-ab001be39df9\") " pod="openshift-marketplace/redhat-operators-cl2zj" Oct 09 16:34:32 crc kubenswrapper[4719]: I1009 16:34:32.681841 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0960fc31-442a-4e72-ad99-ab001be39df9-utilities\") pod \"redhat-operators-cl2zj\" (UID: \"0960fc31-442a-4e72-ad99-ab001be39df9\") " pod="openshift-marketplace/redhat-operators-cl2zj" Oct 09 16:34:32 crc kubenswrapper[4719]: I1009 16:34:32.682229 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-mjtz5\" (UniqueName: \"kubernetes.io/projected/0960fc31-442a-4e72-ad99-ab001be39df9-kube-api-access-mjtz5\") pod \"redhat-operators-cl2zj\" (UID: \"0960fc31-442a-4e72-ad99-ab001be39df9\") " pod="openshift-marketplace/redhat-operators-cl2zj" Oct 09 16:34:32 crc kubenswrapper[4719]: I1009 16:34:32.682292 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0960fc31-442a-4e72-ad99-ab001be39df9-catalog-content\") pod \"redhat-operators-cl2zj\" (UID: \"0960fc31-442a-4e72-ad99-ab001be39df9\") " pod="openshift-marketplace/redhat-operators-cl2zj" Oct 09 16:34:32 crc kubenswrapper[4719]: I1009 16:34:32.682384 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0960fc31-442a-4e72-ad99-ab001be39df9-utilities\") pod \"redhat-operators-cl2zj\" (UID: \"0960fc31-442a-4e72-ad99-ab001be39df9\") " pod="openshift-marketplace/redhat-operators-cl2zj" Oct 09 16:34:32 crc kubenswrapper[4719]: I1009 16:34:32.682721 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0960fc31-442a-4e72-ad99-ab001be39df9-catalog-content\") pod \"redhat-operators-cl2zj\" (UID: \"0960fc31-442a-4e72-ad99-ab001be39df9\") " pod="openshift-marketplace/redhat-operators-cl2zj" Oct 09 16:34:32 crc kubenswrapper[4719]: I1009 16:34:32.708522 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjtz5\" (UniqueName: \"kubernetes.io/projected/0960fc31-442a-4e72-ad99-ab001be39df9-kube-api-access-mjtz5\") pod \"redhat-operators-cl2zj\" (UID: \"0960fc31-442a-4e72-ad99-ab001be39df9\") " pod="openshift-marketplace/redhat-operators-cl2zj" Oct 09 16:34:32 crc kubenswrapper[4719]: I1009 16:34:32.871751 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cl2zj" Oct 09 16:34:33 crc kubenswrapper[4719]: I1009 16:34:33.185250 4719 scope.go:117] "RemoveContainer" containerID="ab8e39a1b9738f293da97baf5097a0476e541f0d738db409215643fbcdcb6edb" Oct 09 16:34:33 crc kubenswrapper[4719]: E1009 16:34:33.185995 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:34:33 crc kubenswrapper[4719]: I1009 16:34:33.366922 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cl2zj"] Oct 09 16:34:33 crc kubenswrapper[4719]: I1009 16:34:33.650878 4719 generic.go:334] "Generic (PLEG): container finished" podID="0960fc31-442a-4e72-ad99-ab001be39df9" containerID="27f306841e7fe023f51afaf557477b8bc8fc8eabb183c7bd37ef0f593030af9f" exitCode=0 Oct 09 16:34:33 crc kubenswrapper[4719]: I1009 16:34:33.650985 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cl2zj" event={"ID":"0960fc31-442a-4e72-ad99-ab001be39df9","Type":"ContainerDied","Data":"27f306841e7fe023f51afaf557477b8bc8fc8eabb183c7bd37ef0f593030af9f"} Oct 09 16:34:33 crc kubenswrapper[4719]: I1009 16:34:33.651243 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cl2zj" event={"ID":"0960fc31-442a-4e72-ad99-ab001be39df9","Type":"ContainerStarted","Data":"b33077377f0dfc3e0319660916ad7c9efc235d471888b910e0e1fe1e954e43dc"} Oct 09 16:34:33 crc kubenswrapper[4719]: I1009 16:34:33.652996 4719 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 16:34:35 crc 
kubenswrapper[4719]: I1009 16:34:35.671820 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cl2zj" event={"ID":"0960fc31-442a-4e72-ad99-ab001be39df9","Type":"ContainerStarted","Data":"e386bb0101f8a2b750e62fc21a610c47ebcdbacf2cc8118744e994fefbe2115e"} Oct 09 16:34:38 crc kubenswrapper[4719]: I1009 16:34:38.708637 4719 generic.go:334] "Generic (PLEG): container finished" podID="0960fc31-442a-4e72-ad99-ab001be39df9" containerID="e386bb0101f8a2b750e62fc21a610c47ebcdbacf2cc8118744e994fefbe2115e" exitCode=0 Oct 09 16:34:38 crc kubenswrapper[4719]: I1009 16:34:38.708722 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cl2zj" event={"ID":"0960fc31-442a-4e72-ad99-ab001be39df9","Type":"ContainerDied","Data":"e386bb0101f8a2b750e62fc21a610c47ebcdbacf2cc8118744e994fefbe2115e"} Oct 09 16:34:39 crc kubenswrapper[4719]: I1009 16:34:39.721320 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cl2zj" event={"ID":"0960fc31-442a-4e72-ad99-ab001be39df9","Type":"ContainerStarted","Data":"df9b9826e619b5ca89ca4249df2aeda6b22fbc11f2c0b49beafca229d6ecdca4"} Oct 09 16:34:39 crc kubenswrapper[4719]: I1009 16:34:39.746164 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cl2zj" podStartSLOduration=1.9248631980000002 podStartE2EDuration="7.746147197s" podCreationTimestamp="2025-10-09 16:34:32 +0000 UTC" firstStartedPulling="2025-10-09 16:34:33.652594413 +0000 UTC m=+4579.162305698" lastFinishedPulling="2025-10-09 16:34:39.473878412 +0000 UTC m=+4584.983589697" observedRunningTime="2025-10-09 16:34:39.737129979 +0000 UTC m=+4585.246841274" watchObservedRunningTime="2025-10-09 16:34:39.746147197 +0000 UTC m=+4585.255858482" Oct 09 16:34:42 crc kubenswrapper[4719]: I1009 16:34:42.872427 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-cl2zj" Oct 09 16:34:42 crc kubenswrapper[4719]: I1009 16:34:42.873013 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cl2zj" Oct 09 16:34:43 crc kubenswrapper[4719]: I1009 16:34:43.925000 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cl2zj" podUID="0960fc31-442a-4e72-ad99-ab001be39df9" containerName="registry-server" probeResult="failure" output=< Oct 09 16:34:43 crc kubenswrapper[4719]: timeout: failed to connect service ":50051" within 1s Oct 09 16:34:43 crc kubenswrapper[4719]: > Oct 09 16:34:45 crc kubenswrapper[4719]: I1009 16:34:45.171437 4719 scope.go:117] "RemoveContainer" containerID="ab8e39a1b9738f293da97baf5097a0476e541f0d738db409215643fbcdcb6edb" Oct 09 16:34:45 crc kubenswrapper[4719]: E1009 16:34:45.171668 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:34:54 crc kubenswrapper[4719]: I1009 16:34:54.433613 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cl2zj" podUID="0960fc31-442a-4e72-ad99-ab001be39df9" containerName="registry-server" probeResult="failure" output=< Oct 09 16:34:54 crc kubenswrapper[4719]: timeout: failed to connect service ":50051" within 1s Oct 09 16:34:54 crc kubenswrapper[4719]: > Oct 09 16:34:56 crc kubenswrapper[4719]: I1009 16:34:56.160803 4719 scope.go:117] "RemoveContainer" containerID="ab8e39a1b9738f293da97baf5097a0476e541f0d738db409215643fbcdcb6edb" Oct 09 16:34:56 crc kubenswrapper[4719]: E1009 16:34:56.161453 4719 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:35:02 crc kubenswrapper[4719]: I1009 16:35:02.938734 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cl2zj" Oct 09 16:35:03 crc kubenswrapper[4719]: I1009 16:35:03.000240 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cl2zj" Oct 09 16:35:03 crc kubenswrapper[4719]: I1009 16:35:03.748076 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cl2zj"] Oct 09 16:35:03 crc kubenswrapper[4719]: I1009 16:35:03.964692 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cl2zj" podUID="0960fc31-442a-4e72-ad99-ab001be39df9" containerName="registry-server" containerID="cri-o://df9b9826e619b5ca89ca4249df2aeda6b22fbc11f2c0b49beafca229d6ecdca4" gracePeriod=2 Oct 09 16:35:04 crc kubenswrapper[4719]: I1009 16:35:04.980114 4719 generic.go:334] "Generic (PLEG): container finished" podID="0960fc31-442a-4e72-ad99-ab001be39df9" containerID="df9b9826e619b5ca89ca4249df2aeda6b22fbc11f2c0b49beafca229d6ecdca4" exitCode=0 Oct 09 16:35:04 crc kubenswrapper[4719]: I1009 16:35:04.980208 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cl2zj" event={"ID":"0960fc31-442a-4e72-ad99-ab001be39df9","Type":"ContainerDied","Data":"df9b9826e619b5ca89ca4249df2aeda6b22fbc11f2c0b49beafca229d6ecdca4"} Oct 09 16:35:05 crc kubenswrapper[4719]: I1009 16:35:05.237722 4719 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cl2zj" Oct 09 16:35:05 crc kubenswrapper[4719]: I1009 16:35:05.391376 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0960fc31-442a-4e72-ad99-ab001be39df9-utilities\") pod \"0960fc31-442a-4e72-ad99-ab001be39df9\" (UID: \"0960fc31-442a-4e72-ad99-ab001be39df9\") " Oct 09 16:35:05 crc kubenswrapper[4719]: I1009 16:35:05.391459 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjtz5\" (UniqueName: \"kubernetes.io/projected/0960fc31-442a-4e72-ad99-ab001be39df9-kube-api-access-mjtz5\") pod \"0960fc31-442a-4e72-ad99-ab001be39df9\" (UID: \"0960fc31-442a-4e72-ad99-ab001be39df9\") " Oct 09 16:35:05 crc kubenswrapper[4719]: I1009 16:35:05.391524 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0960fc31-442a-4e72-ad99-ab001be39df9-catalog-content\") pod \"0960fc31-442a-4e72-ad99-ab001be39df9\" (UID: \"0960fc31-442a-4e72-ad99-ab001be39df9\") " Oct 09 16:35:05 crc kubenswrapper[4719]: I1009 16:35:05.392495 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0960fc31-442a-4e72-ad99-ab001be39df9-utilities" (OuterVolumeSpecName: "utilities") pod "0960fc31-442a-4e72-ad99-ab001be39df9" (UID: "0960fc31-442a-4e72-ad99-ab001be39df9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:35:05 crc kubenswrapper[4719]: I1009 16:35:05.423587 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0960fc31-442a-4e72-ad99-ab001be39df9-kube-api-access-mjtz5" (OuterVolumeSpecName: "kube-api-access-mjtz5") pod "0960fc31-442a-4e72-ad99-ab001be39df9" (UID: "0960fc31-442a-4e72-ad99-ab001be39df9"). 
InnerVolumeSpecName "kube-api-access-mjtz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 16:35:05 crc kubenswrapper[4719]: I1009 16:35:05.494930 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0960fc31-442a-4e72-ad99-ab001be39df9-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 16:35:05 crc kubenswrapper[4719]: I1009 16:35:05.494990 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjtz5\" (UniqueName: \"kubernetes.io/projected/0960fc31-442a-4e72-ad99-ab001be39df9-kube-api-access-mjtz5\") on node \"crc\" DevicePath \"\"" Oct 09 16:35:05 crc kubenswrapper[4719]: I1009 16:35:05.523062 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0960fc31-442a-4e72-ad99-ab001be39df9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0960fc31-442a-4e72-ad99-ab001be39df9" (UID: "0960fc31-442a-4e72-ad99-ab001be39df9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:35:05 crc kubenswrapper[4719]: I1009 16:35:05.597761 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0960fc31-442a-4e72-ad99-ab001be39df9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 16:35:06 crc kubenswrapper[4719]: I1009 16:35:05.999720 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cl2zj" event={"ID":"0960fc31-442a-4e72-ad99-ab001be39df9","Type":"ContainerDied","Data":"b33077377f0dfc3e0319660916ad7c9efc235d471888b910e0e1fe1e954e43dc"} Oct 09 16:35:06 crc kubenswrapper[4719]: I1009 16:35:05.999855 4719 scope.go:117] "RemoveContainer" containerID="df9b9826e619b5ca89ca4249df2aeda6b22fbc11f2c0b49beafca229d6ecdca4" Oct 09 16:35:06 crc kubenswrapper[4719]: I1009 16:35:05.999861 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cl2zj" Oct 09 16:35:06 crc kubenswrapper[4719]: I1009 16:35:06.023635 4719 scope.go:117] "RemoveContainer" containerID="e386bb0101f8a2b750e62fc21a610c47ebcdbacf2cc8118744e994fefbe2115e" Oct 09 16:35:06 crc kubenswrapper[4719]: I1009 16:35:06.055759 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cl2zj"] Oct 09 16:35:06 crc kubenswrapper[4719]: I1009 16:35:06.061822 4719 scope.go:117] "RemoveContainer" containerID="27f306841e7fe023f51afaf557477b8bc8fc8eabb183c7bd37ef0f593030af9f" Oct 09 16:35:06 crc kubenswrapper[4719]: I1009 16:35:06.065723 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cl2zj"] Oct 09 16:35:07 crc kubenswrapper[4719]: I1009 16:35:07.173893 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0960fc31-442a-4e72-ad99-ab001be39df9" path="/var/lib/kubelet/pods/0960fc31-442a-4e72-ad99-ab001be39df9/volumes" Oct 09 16:35:11 crc kubenswrapper[4719]: I1009 16:35:11.161480 4719 scope.go:117] "RemoveContainer" containerID="ab8e39a1b9738f293da97baf5097a0476e541f0d738db409215643fbcdcb6edb" Oct 09 16:35:11 crc kubenswrapper[4719]: E1009 16:35:11.162203 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:35:26 crc kubenswrapper[4719]: I1009 16:35:26.161657 4719 scope.go:117] "RemoveContainer" containerID="ab8e39a1b9738f293da97baf5097a0476e541f0d738db409215643fbcdcb6edb" Oct 09 16:35:26 crc kubenswrapper[4719]: E1009 16:35:26.162505 4719 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:35:38 crc kubenswrapper[4719]: I1009 16:35:38.161832 4719 scope.go:117] "RemoveContainer" containerID="ab8e39a1b9738f293da97baf5097a0476e541f0d738db409215643fbcdcb6edb" Oct 09 16:35:38 crc kubenswrapper[4719]: E1009 16:35:38.162627 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:35:51 crc kubenswrapper[4719]: I1009 16:35:51.161168 4719 scope.go:117] "RemoveContainer" containerID="ab8e39a1b9738f293da97baf5097a0476e541f0d738db409215643fbcdcb6edb" Oct 09 16:35:51 crc kubenswrapper[4719]: E1009 16:35:51.161913 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:36:02 crc kubenswrapper[4719]: I1009 16:36:02.161041 4719 scope.go:117] "RemoveContainer" containerID="ab8e39a1b9738f293da97baf5097a0476e541f0d738db409215643fbcdcb6edb" Oct 09 16:36:02 crc kubenswrapper[4719]: E1009 16:36:02.161970 4719 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:36:12 crc kubenswrapper[4719]: E1009 16:36:12.194944 4719 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.59:48530->38.102.83.59:45053: write tcp 38.102.83.59:48530->38.102.83.59:45053: write: broken pipe Oct 09 16:36:13 crc kubenswrapper[4719]: I1009 16:36:13.162294 4719 scope.go:117] "RemoveContainer" containerID="ab8e39a1b9738f293da97baf5097a0476e541f0d738db409215643fbcdcb6edb" Oct 09 16:36:13 crc kubenswrapper[4719]: E1009 16:36:13.163103 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:36:25 crc kubenswrapper[4719]: I1009 16:36:25.207366 4719 scope.go:117] "RemoveContainer" containerID="ab8e39a1b9738f293da97baf5097a0476e541f0d738db409215643fbcdcb6edb" Oct 09 16:36:25 crc kubenswrapper[4719]: E1009 16:36:25.208880 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 
09 16:36:37 crc kubenswrapper[4719]: I1009 16:36:37.161775 4719 scope.go:117] "RemoveContainer" containerID="ab8e39a1b9738f293da97baf5097a0476e541f0d738db409215643fbcdcb6edb" Oct 09 16:36:37 crc kubenswrapper[4719]: E1009 16:36:37.162702 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:36:48 crc kubenswrapper[4719]: I1009 16:36:48.161131 4719 scope.go:117] "RemoveContainer" containerID="ab8e39a1b9738f293da97baf5097a0476e541f0d738db409215643fbcdcb6edb" Oct 09 16:36:48 crc kubenswrapper[4719]: E1009 16:36:48.162067 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:37:01 crc kubenswrapper[4719]: I1009 16:37:01.160973 4719 scope.go:117] "RemoveContainer" containerID="ab8e39a1b9738f293da97baf5097a0476e541f0d738db409215643fbcdcb6edb" Oct 09 16:37:01 crc kubenswrapper[4719]: E1009 16:37:01.161902 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" 
podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:37:15 crc kubenswrapper[4719]: I1009 16:37:15.169767 4719 scope.go:117] "RemoveContainer" containerID="ab8e39a1b9738f293da97baf5097a0476e541f0d738db409215643fbcdcb6edb" Oct 09 16:37:15 crc kubenswrapper[4719]: E1009 16:37:15.170944 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:37:27 crc kubenswrapper[4719]: I1009 16:37:27.162962 4719 scope.go:117] "RemoveContainer" containerID="ab8e39a1b9738f293da97baf5097a0476e541f0d738db409215643fbcdcb6edb" Oct 09 16:37:27 crc kubenswrapper[4719]: E1009 16:37:27.163867 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:37:31 crc kubenswrapper[4719]: I1009 16:37:31.299932 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4sdsq"] Oct 09 16:37:31 crc kubenswrapper[4719]: E1009 16:37:31.301016 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0960fc31-442a-4e72-ad99-ab001be39df9" containerName="registry-server" Oct 09 16:37:31 crc kubenswrapper[4719]: I1009 16:37:31.301033 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="0960fc31-442a-4e72-ad99-ab001be39df9" containerName="registry-server" Oct 09 16:37:31 crc kubenswrapper[4719]: 
E1009 16:37:31.301049 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0960fc31-442a-4e72-ad99-ab001be39df9" containerName="extract-utilities" Oct 09 16:37:31 crc kubenswrapper[4719]: I1009 16:37:31.301057 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="0960fc31-442a-4e72-ad99-ab001be39df9" containerName="extract-utilities" Oct 09 16:37:31 crc kubenswrapper[4719]: E1009 16:37:31.301079 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0960fc31-442a-4e72-ad99-ab001be39df9" containerName="extract-content" Oct 09 16:37:31 crc kubenswrapper[4719]: I1009 16:37:31.301088 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="0960fc31-442a-4e72-ad99-ab001be39df9" containerName="extract-content" Oct 09 16:37:31 crc kubenswrapper[4719]: I1009 16:37:31.301368 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="0960fc31-442a-4e72-ad99-ab001be39df9" containerName="registry-server" Oct 09 16:37:31 crc kubenswrapper[4719]: I1009 16:37:31.307293 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4sdsq" Oct 09 16:37:31 crc kubenswrapper[4719]: I1009 16:37:31.324127 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4sdsq"] Oct 09 16:37:31 crc kubenswrapper[4719]: I1009 16:37:31.387321 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/672ad4ad-d4b3-44b4-bfdb-7fe0465acd14-catalog-content\") pod \"certified-operators-4sdsq\" (UID: \"672ad4ad-d4b3-44b4-bfdb-7fe0465acd14\") " pod="openshift-marketplace/certified-operators-4sdsq" Oct 09 16:37:31 crc kubenswrapper[4719]: I1009 16:37:31.387694 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/672ad4ad-d4b3-44b4-bfdb-7fe0465acd14-utilities\") pod \"certified-operators-4sdsq\" (UID: \"672ad4ad-d4b3-44b4-bfdb-7fe0465acd14\") " pod="openshift-marketplace/certified-operators-4sdsq" Oct 09 16:37:31 crc kubenswrapper[4719]: I1009 16:37:31.387885 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r9vg\" (UniqueName: \"kubernetes.io/projected/672ad4ad-d4b3-44b4-bfdb-7fe0465acd14-kube-api-access-8r9vg\") pod \"certified-operators-4sdsq\" (UID: \"672ad4ad-d4b3-44b4-bfdb-7fe0465acd14\") " pod="openshift-marketplace/certified-operators-4sdsq" Oct 09 16:37:31 crc kubenswrapper[4719]: I1009 16:37:31.490392 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/672ad4ad-d4b3-44b4-bfdb-7fe0465acd14-utilities\") pod \"certified-operators-4sdsq\" (UID: \"672ad4ad-d4b3-44b4-bfdb-7fe0465acd14\") " pod="openshift-marketplace/certified-operators-4sdsq" Oct 09 16:37:31 crc kubenswrapper[4719]: I1009 16:37:31.490725 4719 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8r9vg\" (UniqueName: \"kubernetes.io/projected/672ad4ad-d4b3-44b4-bfdb-7fe0465acd14-kube-api-access-8r9vg\") pod \"certified-operators-4sdsq\" (UID: \"672ad4ad-d4b3-44b4-bfdb-7fe0465acd14\") " pod="openshift-marketplace/certified-operators-4sdsq" Oct 09 16:37:31 crc kubenswrapper[4719]: I1009 16:37:31.490927 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/672ad4ad-d4b3-44b4-bfdb-7fe0465acd14-utilities\") pod \"certified-operators-4sdsq\" (UID: \"672ad4ad-d4b3-44b4-bfdb-7fe0465acd14\") " pod="openshift-marketplace/certified-operators-4sdsq" Oct 09 16:37:31 crc kubenswrapper[4719]: I1009 16:37:31.490937 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/672ad4ad-d4b3-44b4-bfdb-7fe0465acd14-catalog-content\") pod \"certified-operators-4sdsq\" (UID: \"672ad4ad-d4b3-44b4-bfdb-7fe0465acd14\") " pod="openshift-marketplace/certified-operators-4sdsq" Oct 09 16:37:31 crc kubenswrapper[4719]: I1009 16:37:31.491489 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/672ad4ad-d4b3-44b4-bfdb-7fe0465acd14-catalog-content\") pod \"certified-operators-4sdsq\" (UID: \"672ad4ad-d4b3-44b4-bfdb-7fe0465acd14\") " pod="openshift-marketplace/certified-operators-4sdsq" Oct 09 16:37:32 crc kubenswrapper[4719]: I1009 16:37:32.109496 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r9vg\" (UniqueName: \"kubernetes.io/projected/672ad4ad-d4b3-44b4-bfdb-7fe0465acd14-kube-api-access-8r9vg\") pod \"certified-operators-4sdsq\" (UID: \"672ad4ad-d4b3-44b4-bfdb-7fe0465acd14\") " pod="openshift-marketplace/certified-operators-4sdsq" Oct 09 16:37:32 crc kubenswrapper[4719]: I1009 16:37:32.262370 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4sdsq" Oct 09 16:37:32 crc kubenswrapper[4719]: I1009 16:37:32.735144 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4sdsq"] Oct 09 16:37:33 crc kubenswrapper[4719]: I1009 16:37:33.428508 4719 generic.go:334] "Generic (PLEG): container finished" podID="672ad4ad-d4b3-44b4-bfdb-7fe0465acd14" containerID="350e8cbf22d2edde5b5c4a1d4e8fe19cdff2ed4c526b854091a26f9ff4c6df68" exitCode=0 Oct 09 16:37:33 crc kubenswrapper[4719]: I1009 16:37:33.428636 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4sdsq" event={"ID":"672ad4ad-d4b3-44b4-bfdb-7fe0465acd14","Type":"ContainerDied","Data":"350e8cbf22d2edde5b5c4a1d4e8fe19cdff2ed4c526b854091a26f9ff4c6df68"} Oct 09 16:37:33 crc kubenswrapper[4719]: I1009 16:37:33.429066 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4sdsq" event={"ID":"672ad4ad-d4b3-44b4-bfdb-7fe0465acd14","Type":"ContainerStarted","Data":"6a22112cb5e14c0fe2d889f0c61a000ef05be29f783f3c399192183ca4fb1a4d"} Oct 09 16:37:33 crc kubenswrapper[4719]: I1009 16:37:33.695832 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ltd58"] Oct 09 16:37:33 crc kubenswrapper[4719]: I1009 16:37:33.698689 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ltd58" Oct 09 16:37:33 crc kubenswrapper[4719]: I1009 16:37:33.711667 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ltd58"] Oct 09 16:37:33 crc kubenswrapper[4719]: I1009 16:37:33.843935 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jrnf\" (UniqueName: \"kubernetes.io/projected/fcee18df-40d9-46d1-9208-5bb3feae69b7-kube-api-access-9jrnf\") pod \"community-operators-ltd58\" (UID: \"fcee18df-40d9-46d1-9208-5bb3feae69b7\") " pod="openshift-marketplace/community-operators-ltd58" Oct 09 16:37:33 crc kubenswrapper[4719]: I1009 16:37:33.844081 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcee18df-40d9-46d1-9208-5bb3feae69b7-utilities\") pod \"community-operators-ltd58\" (UID: \"fcee18df-40d9-46d1-9208-5bb3feae69b7\") " pod="openshift-marketplace/community-operators-ltd58" Oct 09 16:37:33 crc kubenswrapper[4719]: I1009 16:37:33.844142 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcee18df-40d9-46d1-9208-5bb3feae69b7-catalog-content\") pod \"community-operators-ltd58\" (UID: \"fcee18df-40d9-46d1-9208-5bb3feae69b7\") " pod="openshift-marketplace/community-operators-ltd58" Oct 09 16:37:33 crc kubenswrapper[4719]: I1009 16:37:33.946389 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jrnf\" (UniqueName: \"kubernetes.io/projected/fcee18df-40d9-46d1-9208-5bb3feae69b7-kube-api-access-9jrnf\") pod \"community-operators-ltd58\" (UID: \"fcee18df-40d9-46d1-9208-5bb3feae69b7\") " pod="openshift-marketplace/community-operators-ltd58" Oct 09 16:37:33 crc kubenswrapper[4719]: I1009 16:37:33.946538 4719 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcee18df-40d9-46d1-9208-5bb3feae69b7-utilities\") pod \"community-operators-ltd58\" (UID: \"fcee18df-40d9-46d1-9208-5bb3feae69b7\") " pod="openshift-marketplace/community-operators-ltd58" Oct 09 16:37:33 crc kubenswrapper[4719]: I1009 16:37:33.946609 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcee18df-40d9-46d1-9208-5bb3feae69b7-catalog-content\") pod \"community-operators-ltd58\" (UID: \"fcee18df-40d9-46d1-9208-5bb3feae69b7\") " pod="openshift-marketplace/community-operators-ltd58" Oct 09 16:37:33 crc kubenswrapper[4719]: I1009 16:37:33.947080 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcee18df-40d9-46d1-9208-5bb3feae69b7-utilities\") pod \"community-operators-ltd58\" (UID: \"fcee18df-40d9-46d1-9208-5bb3feae69b7\") " pod="openshift-marketplace/community-operators-ltd58" Oct 09 16:37:33 crc kubenswrapper[4719]: I1009 16:37:33.947120 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcee18df-40d9-46d1-9208-5bb3feae69b7-catalog-content\") pod \"community-operators-ltd58\" (UID: \"fcee18df-40d9-46d1-9208-5bb3feae69b7\") " pod="openshift-marketplace/community-operators-ltd58" Oct 09 16:37:34 crc kubenswrapper[4719]: I1009 16:37:34.395560 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jrnf\" (UniqueName: \"kubernetes.io/projected/fcee18df-40d9-46d1-9208-5bb3feae69b7-kube-api-access-9jrnf\") pod \"community-operators-ltd58\" (UID: \"fcee18df-40d9-46d1-9208-5bb3feae69b7\") " pod="openshift-marketplace/community-operators-ltd58" Oct 09 16:37:34 crc kubenswrapper[4719]: I1009 16:37:34.618373 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ltd58" Oct 09 16:37:35 crc kubenswrapper[4719]: I1009 16:37:35.086627 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ltd58"] Oct 09 16:37:35 crc kubenswrapper[4719]: I1009 16:37:35.447454 4719 generic.go:334] "Generic (PLEG): container finished" podID="fcee18df-40d9-46d1-9208-5bb3feae69b7" containerID="9fc6efbb8ab65e0240f343ce77adcc399a6352a305e608f3cc650dbbb43fbb3f" exitCode=0 Oct 09 16:37:35 crc kubenswrapper[4719]: I1009 16:37:35.447507 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ltd58" event={"ID":"fcee18df-40d9-46d1-9208-5bb3feae69b7","Type":"ContainerDied","Data":"9fc6efbb8ab65e0240f343ce77adcc399a6352a305e608f3cc650dbbb43fbb3f"} Oct 09 16:37:35 crc kubenswrapper[4719]: I1009 16:37:35.447829 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ltd58" event={"ID":"fcee18df-40d9-46d1-9208-5bb3feae69b7","Type":"ContainerStarted","Data":"50ae92d096266b2766fc3583effe80c6ac697716be413e70e2c1cb2ab80ce96b"} Oct 09 16:37:36 crc kubenswrapper[4719]: I1009 16:37:36.458523 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ltd58" event={"ID":"fcee18df-40d9-46d1-9208-5bb3feae69b7","Type":"ContainerStarted","Data":"4a5b0141eaa649f2f5d7dad1466c92668481e646f6521782b089d0a93dbc205d"} Oct 09 16:37:37 crc kubenswrapper[4719]: I1009 16:37:37.468821 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4sdsq" event={"ID":"672ad4ad-d4b3-44b4-bfdb-7fe0465acd14","Type":"ContainerStarted","Data":"d5beb56404c738ee2b18959b9d21208b98005b14d158b9223aa1ede30b9dd1cb"} Oct 09 16:37:39 crc kubenswrapper[4719]: I1009 16:37:39.493580 4719 generic.go:334] "Generic (PLEG): container finished" podID="672ad4ad-d4b3-44b4-bfdb-7fe0465acd14" 
containerID="d5beb56404c738ee2b18959b9d21208b98005b14d158b9223aa1ede30b9dd1cb" exitCode=0 Oct 09 16:37:39 crc kubenswrapper[4719]: I1009 16:37:39.493691 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4sdsq" event={"ID":"672ad4ad-d4b3-44b4-bfdb-7fe0465acd14","Type":"ContainerDied","Data":"d5beb56404c738ee2b18959b9d21208b98005b14d158b9223aa1ede30b9dd1cb"} Oct 09 16:37:39 crc kubenswrapper[4719]: I1009 16:37:39.498304 4719 generic.go:334] "Generic (PLEG): container finished" podID="fcee18df-40d9-46d1-9208-5bb3feae69b7" containerID="4a5b0141eaa649f2f5d7dad1466c92668481e646f6521782b089d0a93dbc205d" exitCode=0 Oct 09 16:37:39 crc kubenswrapper[4719]: I1009 16:37:39.498339 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ltd58" event={"ID":"fcee18df-40d9-46d1-9208-5bb3feae69b7","Type":"ContainerDied","Data":"4a5b0141eaa649f2f5d7dad1466c92668481e646f6521782b089d0a93dbc205d"} Oct 09 16:37:40 crc kubenswrapper[4719]: I1009 16:37:40.527188 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4sdsq" event={"ID":"672ad4ad-d4b3-44b4-bfdb-7fe0465acd14","Type":"ContainerStarted","Data":"1ad6e553b3b6a8520da7ad6f35635410f158153b2bb6979eb0b3cdadd3d52f73"} Oct 09 16:37:40 crc kubenswrapper[4719]: I1009 16:37:40.538757 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ltd58" event={"ID":"fcee18df-40d9-46d1-9208-5bb3feae69b7","Type":"ContainerStarted","Data":"3676041bdb796a8bd8faae5ad99f4d70cc03a1d52f8302d873437fd3ff52a048"} Oct 09 16:37:40 crc kubenswrapper[4719]: I1009 16:37:40.553843 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4sdsq" podStartSLOduration=3.069760416 podStartE2EDuration="9.553811056s" podCreationTimestamp="2025-10-09 16:37:31 +0000 UTC" firstStartedPulling="2025-10-09 16:37:33.430977499 
+0000 UTC m=+4758.940688784" lastFinishedPulling="2025-10-09 16:37:39.915028139 +0000 UTC m=+4765.424739424" observedRunningTime="2025-10-09 16:37:40.550094927 +0000 UTC m=+4766.059806212" watchObservedRunningTime="2025-10-09 16:37:40.553811056 +0000 UTC m=+4766.063522371" Oct 09 16:37:40 crc kubenswrapper[4719]: I1009 16:37:40.580853 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ltd58" podStartSLOduration=3.099871265 podStartE2EDuration="7.580824028s" podCreationTimestamp="2025-10-09 16:37:33 +0000 UTC" firstStartedPulling="2025-10-09 16:37:35.449194639 +0000 UTC m=+4760.958905924" lastFinishedPulling="2025-10-09 16:37:39.930147402 +0000 UTC m=+4765.439858687" observedRunningTime="2025-10-09 16:37:40.575638412 +0000 UTC m=+4766.085349697" watchObservedRunningTime="2025-10-09 16:37:40.580824028 +0000 UTC m=+4766.090535313" Oct 09 16:37:42 crc kubenswrapper[4719]: I1009 16:37:42.161894 4719 scope.go:117] "RemoveContainer" containerID="ab8e39a1b9738f293da97baf5097a0476e541f0d738db409215643fbcdcb6edb" Oct 09 16:37:42 crc kubenswrapper[4719]: E1009 16:37:42.162577 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:37:42 crc kubenswrapper[4719]: I1009 16:37:42.263310 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4sdsq" Oct 09 16:37:42 crc kubenswrapper[4719]: I1009 16:37:42.263790 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4sdsq" Oct 09 16:37:43 crc 
kubenswrapper[4719]: I1009 16:37:43.318317 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-4sdsq" podUID="672ad4ad-d4b3-44b4-bfdb-7fe0465acd14" containerName="registry-server" probeResult="failure" output=< Oct 09 16:37:43 crc kubenswrapper[4719]: timeout: failed to connect service ":50051" within 1s Oct 09 16:37:43 crc kubenswrapper[4719]: > Oct 09 16:37:44 crc kubenswrapper[4719]: I1009 16:37:44.619431 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ltd58" Oct 09 16:37:44 crc kubenswrapper[4719]: I1009 16:37:44.621123 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ltd58" Oct 09 16:37:45 crc kubenswrapper[4719]: I1009 16:37:45.668271 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-ltd58" podUID="fcee18df-40d9-46d1-9208-5bb3feae69b7" containerName="registry-server" probeResult="failure" output=< Oct 09 16:37:45 crc kubenswrapper[4719]: timeout: failed to connect service ":50051" within 1s Oct 09 16:37:45 crc kubenswrapper[4719]: > Oct 09 16:37:52 crc kubenswrapper[4719]: I1009 16:37:52.309791 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4sdsq" Oct 09 16:37:52 crc kubenswrapper[4719]: I1009 16:37:52.360705 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4sdsq" Oct 09 16:37:52 crc kubenswrapper[4719]: I1009 16:37:52.541161 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4sdsq"] Oct 09 16:37:53 crc kubenswrapper[4719]: I1009 16:37:53.642915 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4sdsq" podUID="672ad4ad-d4b3-44b4-bfdb-7fe0465acd14" 
containerName="registry-server" containerID="cri-o://1ad6e553b3b6a8520da7ad6f35635410f158153b2bb6979eb0b3cdadd3d52f73" gracePeriod=2 Oct 09 16:37:54 crc kubenswrapper[4719]: I1009 16:37:54.134621 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4sdsq" Oct 09 16:37:54 crc kubenswrapper[4719]: I1009 16:37:54.191865 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/672ad4ad-d4b3-44b4-bfdb-7fe0465acd14-catalog-content\") pod \"672ad4ad-d4b3-44b4-bfdb-7fe0465acd14\" (UID: \"672ad4ad-d4b3-44b4-bfdb-7fe0465acd14\") " Oct 09 16:37:54 crc kubenswrapper[4719]: I1009 16:37:54.192432 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8r9vg\" (UniqueName: \"kubernetes.io/projected/672ad4ad-d4b3-44b4-bfdb-7fe0465acd14-kube-api-access-8r9vg\") pod \"672ad4ad-d4b3-44b4-bfdb-7fe0465acd14\" (UID: \"672ad4ad-d4b3-44b4-bfdb-7fe0465acd14\") " Oct 09 16:37:54 crc kubenswrapper[4719]: I1009 16:37:54.192576 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/672ad4ad-d4b3-44b4-bfdb-7fe0465acd14-utilities\") pod \"672ad4ad-d4b3-44b4-bfdb-7fe0465acd14\" (UID: \"672ad4ad-d4b3-44b4-bfdb-7fe0465acd14\") " Oct 09 16:37:54 crc kubenswrapper[4719]: I1009 16:37:54.193302 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/672ad4ad-d4b3-44b4-bfdb-7fe0465acd14-utilities" (OuterVolumeSpecName: "utilities") pod "672ad4ad-d4b3-44b4-bfdb-7fe0465acd14" (UID: "672ad4ad-d4b3-44b4-bfdb-7fe0465acd14"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:37:54 crc kubenswrapper[4719]: I1009 16:37:54.199298 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/672ad4ad-d4b3-44b4-bfdb-7fe0465acd14-kube-api-access-8r9vg" (OuterVolumeSpecName: "kube-api-access-8r9vg") pod "672ad4ad-d4b3-44b4-bfdb-7fe0465acd14" (UID: "672ad4ad-d4b3-44b4-bfdb-7fe0465acd14"). InnerVolumeSpecName "kube-api-access-8r9vg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 16:37:54 crc kubenswrapper[4719]: I1009 16:37:54.247127 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/672ad4ad-d4b3-44b4-bfdb-7fe0465acd14-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "672ad4ad-d4b3-44b4-bfdb-7fe0465acd14" (UID: "672ad4ad-d4b3-44b4-bfdb-7fe0465acd14"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:37:54 crc kubenswrapper[4719]: I1009 16:37:54.294897 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/672ad4ad-d4b3-44b4-bfdb-7fe0465acd14-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 16:37:54 crc kubenswrapper[4719]: I1009 16:37:54.294945 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8r9vg\" (UniqueName: \"kubernetes.io/projected/672ad4ad-d4b3-44b4-bfdb-7fe0465acd14-kube-api-access-8r9vg\") on node \"crc\" DevicePath \"\"" Oct 09 16:37:54 crc kubenswrapper[4719]: I1009 16:37:54.294963 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/672ad4ad-d4b3-44b4-bfdb-7fe0465acd14-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 16:37:54 crc kubenswrapper[4719]: I1009 16:37:54.659547 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4sdsq" Oct 09 16:37:54 crc kubenswrapper[4719]: I1009 16:37:54.659586 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4sdsq" event={"ID":"672ad4ad-d4b3-44b4-bfdb-7fe0465acd14","Type":"ContainerDied","Data":"1ad6e553b3b6a8520da7ad6f35635410f158153b2bb6979eb0b3cdadd3d52f73"} Oct 09 16:37:54 crc kubenswrapper[4719]: I1009 16:37:54.659660 4719 scope.go:117] "RemoveContainer" containerID="1ad6e553b3b6a8520da7ad6f35635410f158153b2bb6979eb0b3cdadd3d52f73" Oct 09 16:37:54 crc kubenswrapper[4719]: I1009 16:37:54.659381 4719 generic.go:334] "Generic (PLEG): container finished" podID="672ad4ad-d4b3-44b4-bfdb-7fe0465acd14" containerID="1ad6e553b3b6a8520da7ad6f35635410f158153b2bb6979eb0b3cdadd3d52f73" exitCode=0 Oct 09 16:37:54 crc kubenswrapper[4719]: I1009 16:37:54.661465 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4sdsq" event={"ID":"672ad4ad-d4b3-44b4-bfdb-7fe0465acd14","Type":"ContainerDied","Data":"6a22112cb5e14c0fe2d889f0c61a000ef05be29f783f3c399192183ca4fb1a4d"} Oct 09 16:37:54 crc kubenswrapper[4719]: I1009 16:37:54.669289 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ltd58" Oct 09 16:37:54 crc kubenswrapper[4719]: I1009 16:37:54.680901 4719 scope.go:117] "RemoveContainer" containerID="d5beb56404c738ee2b18959b9d21208b98005b14d158b9223aa1ede30b9dd1cb" Oct 09 16:37:54 crc kubenswrapper[4719]: I1009 16:37:54.724506 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4sdsq"] Oct 09 16:37:54 crc kubenswrapper[4719]: I1009 16:37:54.741920 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ltd58" Oct 09 16:37:54 crc kubenswrapper[4719]: I1009 16:37:54.742181 4719 scope.go:117] "RemoveContainer" 
containerID="350e8cbf22d2edde5b5c4a1d4e8fe19cdff2ed4c526b854091a26f9ff4c6df68" Oct 09 16:37:54 crc kubenswrapper[4719]: I1009 16:37:54.747634 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4sdsq"] Oct 09 16:37:54 crc kubenswrapper[4719]: I1009 16:37:54.784315 4719 scope.go:117] "RemoveContainer" containerID="1ad6e553b3b6a8520da7ad6f35635410f158153b2bb6979eb0b3cdadd3d52f73" Oct 09 16:37:54 crc kubenswrapper[4719]: E1009 16:37:54.785198 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ad6e553b3b6a8520da7ad6f35635410f158153b2bb6979eb0b3cdadd3d52f73\": container with ID starting with 1ad6e553b3b6a8520da7ad6f35635410f158153b2bb6979eb0b3cdadd3d52f73 not found: ID does not exist" containerID="1ad6e553b3b6a8520da7ad6f35635410f158153b2bb6979eb0b3cdadd3d52f73" Oct 09 16:37:54 crc kubenswrapper[4719]: I1009 16:37:54.785266 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ad6e553b3b6a8520da7ad6f35635410f158153b2bb6979eb0b3cdadd3d52f73"} err="failed to get container status \"1ad6e553b3b6a8520da7ad6f35635410f158153b2bb6979eb0b3cdadd3d52f73\": rpc error: code = NotFound desc = could not find container \"1ad6e553b3b6a8520da7ad6f35635410f158153b2bb6979eb0b3cdadd3d52f73\": container with ID starting with 1ad6e553b3b6a8520da7ad6f35635410f158153b2bb6979eb0b3cdadd3d52f73 not found: ID does not exist" Oct 09 16:37:54 crc kubenswrapper[4719]: I1009 16:37:54.785294 4719 scope.go:117] "RemoveContainer" containerID="d5beb56404c738ee2b18959b9d21208b98005b14d158b9223aa1ede30b9dd1cb" Oct 09 16:37:54 crc kubenswrapper[4719]: E1009 16:37:54.786654 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5beb56404c738ee2b18959b9d21208b98005b14d158b9223aa1ede30b9dd1cb\": container with ID starting with 
d5beb56404c738ee2b18959b9d21208b98005b14d158b9223aa1ede30b9dd1cb not found: ID does not exist" containerID="d5beb56404c738ee2b18959b9d21208b98005b14d158b9223aa1ede30b9dd1cb" Oct 09 16:37:54 crc kubenswrapper[4719]: I1009 16:37:54.786712 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5beb56404c738ee2b18959b9d21208b98005b14d158b9223aa1ede30b9dd1cb"} err="failed to get container status \"d5beb56404c738ee2b18959b9d21208b98005b14d158b9223aa1ede30b9dd1cb\": rpc error: code = NotFound desc = could not find container \"d5beb56404c738ee2b18959b9d21208b98005b14d158b9223aa1ede30b9dd1cb\": container with ID starting with d5beb56404c738ee2b18959b9d21208b98005b14d158b9223aa1ede30b9dd1cb not found: ID does not exist" Oct 09 16:37:54 crc kubenswrapper[4719]: I1009 16:37:54.786756 4719 scope.go:117] "RemoveContainer" containerID="350e8cbf22d2edde5b5c4a1d4e8fe19cdff2ed4c526b854091a26f9ff4c6df68" Oct 09 16:37:54 crc kubenswrapper[4719]: E1009 16:37:54.787223 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"350e8cbf22d2edde5b5c4a1d4e8fe19cdff2ed4c526b854091a26f9ff4c6df68\": container with ID starting with 350e8cbf22d2edde5b5c4a1d4e8fe19cdff2ed4c526b854091a26f9ff4c6df68 not found: ID does not exist" containerID="350e8cbf22d2edde5b5c4a1d4e8fe19cdff2ed4c526b854091a26f9ff4c6df68" Oct 09 16:37:54 crc kubenswrapper[4719]: I1009 16:37:54.787275 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"350e8cbf22d2edde5b5c4a1d4e8fe19cdff2ed4c526b854091a26f9ff4c6df68"} err="failed to get container status \"350e8cbf22d2edde5b5c4a1d4e8fe19cdff2ed4c526b854091a26f9ff4c6df68\": rpc error: code = NotFound desc = could not find container \"350e8cbf22d2edde5b5c4a1d4e8fe19cdff2ed4c526b854091a26f9ff4c6df68\": container with ID starting with 350e8cbf22d2edde5b5c4a1d4e8fe19cdff2ed4c526b854091a26f9ff4c6df68 not found: ID does not 
exist" Oct 09 16:37:55 crc kubenswrapper[4719]: I1009 16:37:55.169113 4719 scope.go:117] "RemoveContainer" containerID="ab8e39a1b9738f293da97baf5097a0476e541f0d738db409215643fbcdcb6edb" Oct 09 16:37:55 crc kubenswrapper[4719]: E1009 16:37:55.169740 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:37:55 crc kubenswrapper[4719]: I1009 16:37:55.172182 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="672ad4ad-d4b3-44b4-bfdb-7fe0465acd14" path="/var/lib/kubelet/pods/672ad4ad-d4b3-44b4-bfdb-7fe0465acd14/volumes" Oct 09 16:37:56 crc kubenswrapper[4719]: I1009 16:37:56.947456 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ltd58"] Oct 09 16:37:56 crc kubenswrapper[4719]: I1009 16:37:56.947977 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ltd58" podUID="fcee18df-40d9-46d1-9208-5bb3feae69b7" containerName="registry-server" containerID="cri-o://3676041bdb796a8bd8faae5ad99f4d70cc03a1d52f8302d873437fd3ff52a048" gracePeriod=2 Oct 09 16:37:57 crc kubenswrapper[4719]: I1009 16:37:57.437908 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ltd58" Oct 09 16:37:57 crc kubenswrapper[4719]: I1009 16:37:57.474149 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jrnf\" (UniqueName: \"kubernetes.io/projected/fcee18df-40d9-46d1-9208-5bb3feae69b7-kube-api-access-9jrnf\") pod \"fcee18df-40d9-46d1-9208-5bb3feae69b7\" (UID: \"fcee18df-40d9-46d1-9208-5bb3feae69b7\") " Oct 09 16:37:57 crc kubenswrapper[4719]: I1009 16:37:57.474219 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcee18df-40d9-46d1-9208-5bb3feae69b7-catalog-content\") pod \"fcee18df-40d9-46d1-9208-5bb3feae69b7\" (UID: \"fcee18df-40d9-46d1-9208-5bb3feae69b7\") " Oct 09 16:37:57 crc kubenswrapper[4719]: I1009 16:37:57.474274 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcee18df-40d9-46d1-9208-5bb3feae69b7-utilities\") pod \"fcee18df-40d9-46d1-9208-5bb3feae69b7\" (UID: \"fcee18df-40d9-46d1-9208-5bb3feae69b7\") " Oct 09 16:37:57 crc kubenswrapper[4719]: I1009 16:37:57.476590 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcee18df-40d9-46d1-9208-5bb3feae69b7-utilities" (OuterVolumeSpecName: "utilities") pod "fcee18df-40d9-46d1-9208-5bb3feae69b7" (UID: "fcee18df-40d9-46d1-9208-5bb3feae69b7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:37:57 crc kubenswrapper[4719]: I1009 16:37:57.488405 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcee18df-40d9-46d1-9208-5bb3feae69b7-kube-api-access-9jrnf" (OuterVolumeSpecName: "kube-api-access-9jrnf") pod "fcee18df-40d9-46d1-9208-5bb3feae69b7" (UID: "fcee18df-40d9-46d1-9208-5bb3feae69b7"). InnerVolumeSpecName "kube-api-access-9jrnf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 16:37:57 crc kubenswrapper[4719]: I1009 16:37:57.525065 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcee18df-40d9-46d1-9208-5bb3feae69b7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fcee18df-40d9-46d1-9208-5bb3feae69b7" (UID: "fcee18df-40d9-46d1-9208-5bb3feae69b7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:37:57 crc kubenswrapper[4719]: I1009 16:37:57.577528 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jrnf\" (UniqueName: \"kubernetes.io/projected/fcee18df-40d9-46d1-9208-5bb3feae69b7-kube-api-access-9jrnf\") on node \"crc\" DevicePath \"\"" Oct 09 16:37:57 crc kubenswrapper[4719]: I1009 16:37:57.577567 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcee18df-40d9-46d1-9208-5bb3feae69b7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 16:37:57 crc kubenswrapper[4719]: I1009 16:37:57.577577 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcee18df-40d9-46d1-9208-5bb3feae69b7-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 16:37:57 crc kubenswrapper[4719]: I1009 16:37:57.693909 4719 generic.go:334] "Generic (PLEG): container finished" podID="fcee18df-40d9-46d1-9208-5bb3feae69b7" containerID="3676041bdb796a8bd8faae5ad99f4d70cc03a1d52f8302d873437fd3ff52a048" exitCode=0 Oct 09 16:37:57 crc kubenswrapper[4719]: I1009 16:37:57.693955 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ltd58" event={"ID":"fcee18df-40d9-46d1-9208-5bb3feae69b7","Type":"ContainerDied","Data":"3676041bdb796a8bd8faae5ad99f4d70cc03a1d52f8302d873437fd3ff52a048"} Oct 09 16:37:57 crc kubenswrapper[4719]: I1009 16:37:57.693982 4719 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-ltd58" event={"ID":"fcee18df-40d9-46d1-9208-5bb3feae69b7","Type":"ContainerDied","Data":"50ae92d096266b2766fc3583effe80c6ac697716be413e70e2c1cb2ab80ce96b"} Oct 09 16:37:57 crc kubenswrapper[4719]: I1009 16:37:57.693979 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ltd58" Oct 09 16:37:57 crc kubenswrapper[4719]: I1009 16:37:57.694060 4719 scope.go:117] "RemoveContainer" containerID="3676041bdb796a8bd8faae5ad99f4d70cc03a1d52f8302d873437fd3ff52a048" Oct 09 16:37:57 crc kubenswrapper[4719]: I1009 16:37:57.738085 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ltd58"] Oct 09 16:37:57 crc kubenswrapper[4719]: I1009 16:37:57.751140 4719 scope.go:117] "RemoveContainer" containerID="4a5b0141eaa649f2f5d7dad1466c92668481e646f6521782b089d0a93dbc205d" Oct 09 16:37:57 crc kubenswrapper[4719]: I1009 16:37:57.751619 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ltd58"] Oct 09 16:37:57 crc kubenswrapper[4719]: I1009 16:37:57.775134 4719 scope.go:117] "RemoveContainer" containerID="9fc6efbb8ab65e0240f343ce77adcc399a6352a305e608f3cc650dbbb43fbb3f" Oct 09 16:37:57 crc kubenswrapper[4719]: I1009 16:37:57.820179 4719 scope.go:117] "RemoveContainer" containerID="3676041bdb796a8bd8faae5ad99f4d70cc03a1d52f8302d873437fd3ff52a048" Oct 09 16:37:57 crc kubenswrapper[4719]: E1009 16:37:57.820618 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3676041bdb796a8bd8faae5ad99f4d70cc03a1d52f8302d873437fd3ff52a048\": container with ID starting with 3676041bdb796a8bd8faae5ad99f4d70cc03a1d52f8302d873437fd3ff52a048 not found: ID does not exist" containerID="3676041bdb796a8bd8faae5ad99f4d70cc03a1d52f8302d873437fd3ff52a048" Oct 09 16:37:57 crc kubenswrapper[4719]: I1009 
16:37:57.820660 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3676041bdb796a8bd8faae5ad99f4d70cc03a1d52f8302d873437fd3ff52a048"} err="failed to get container status \"3676041bdb796a8bd8faae5ad99f4d70cc03a1d52f8302d873437fd3ff52a048\": rpc error: code = NotFound desc = could not find container \"3676041bdb796a8bd8faae5ad99f4d70cc03a1d52f8302d873437fd3ff52a048\": container with ID starting with 3676041bdb796a8bd8faae5ad99f4d70cc03a1d52f8302d873437fd3ff52a048 not found: ID does not exist" Oct 09 16:37:57 crc kubenswrapper[4719]: I1009 16:37:57.820713 4719 scope.go:117] "RemoveContainer" containerID="4a5b0141eaa649f2f5d7dad1466c92668481e646f6521782b089d0a93dbc205d" Oct 09 16:37:57 crc kubenswrapper[4719]: E1009 16:37:57.821063 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a5b0141eaa649f2f5d7dad1466c92668481e646f6521782b089d0a93dbc205d\": container with ID starting with 4a5b0141eaa649f2f5d7dad1466c92668481e646f6521782b089d0a93dbc205d not found: ID does not exist" containerID="4a5b0141eaa649f2f5d7dad1466c92668481e646f6521782b089d0a93dbc205d" Oct 09 16:37:57 crc kubenswrapper[4719]: I1009 16:37:57.821113 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a5b0141eaa649f2f5d7dad1466c92668481e646f6521782b089d0a93dbc205d"} err="failed to get container status \"4a5b0141eaa649f2f5d7dad1466c92668481e646f6521782b089d0a93dbc205d\": rpc error: code = NotFound desc = could not find container \"4a5b0141eaa649f2f5d7dad1466c92668481e646f6521782b089d0a93dbc205d\": container with ID starting with 4a5b0141eaa649f2f5d7dad1466c92668481e646f6521782b089d0a93dbc205d not found: ID does not exist" Oct 09 16:37:57 crc kubenswrapper[4719]: I1009 16:37:57.821133 4719 scope.go:117] "RemoveContainer" containerID="9fc6efbb8ab65e0240f343ce77adcc399a6352a305e608f3cc650dbbb43fbb3f" Oct 09 16:37:57 crc 
kubenswrapper[4719]: E1009 16:37:57.821412 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fc6efbb8ab65e0240f343ce77adcc399a6352a305e608f3cc650dbbb43fbb3f\": container with ID starting with 9fc6efbb8ab65e0240f343ce77adcc399a6352a305e608f3cc650dbbb43fbb3f not found: ID does not exist" containerID="9fc6efbb8ab65e0240f343ce77adcc399a6352a305e608f3cc650dbbb43fbb3f" Oct 09 16:37:57 crc kubenswrapper[4719]: I1009 16:37:57.821532 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fc6efbb8ab65e0240f343ce77adcc399a6352a305e608f3cc650dbbb43fbb3f"} err="failed to get container status \"9fc6efbb8ab65e0240f343ce77adcc399a6352a305e608f3cc650dbbb43fbb3f\": rpc error: code = NotFound desc = could not find container \"9fc6efbb8ab65e0240f343ce77adcc399a6352a305e608f3cc650dbbb43fbb3f\": container with ID starting with 9fc6efbb8ab65e0240f343ce77adcc399a6352a305e608f3cc650dbbb43fbb3f not found: ID does not exist" Oct 09 16:37:59 crc kubenswrapper[4719]: I1009 16:37:59.172802 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcee18df-40d9-46d1-9208-5bb3feae69b7" path="/var/lib/kubelet/pods/fcee18df-40d9-46d1-9208-5bb3feae69b7/volumes" Oct 09 16:38:08 crc kubenswrapper[4719]: I1009 16:38:08.161589 4719 scope.go:117] "RemoveContainer" containerID="ab8e39a1b9738f293da97baf5097a0476e541f0d738db409215643fbcdcb6edb" Oct 09 16:38:08 crc kubenswrapper[4719]: E1009 16:38:08.163636 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:38:23 crc 
kubenswrapper[4719]: I1009 16:38:23.161574 4719 scope.go:117] "RemoveContainer" containerID="ab8e39a1b9738f293da97baf5097a0476e541f0d738db409215643fbcdcb6edb" Oct 09 16:38:23 crc kubenswrapper[4719]: E1009 16:38:23.164204 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:38:34 crc kubenswrapper[4719]: I1009 16:38:34.160949 4719 scope.go:117] "RemoveContainer" containerID="ab8e39a1b9738f293da97baf5097a0476e541f0d738db409215643fbcdcb6edb" Oct 09 16:38:34 crc kubenswrapper[4719]: E1009 16:38:34.161831 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:38:37 crc kubenswrapper[4719]: I1009 16:38:37.505806 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-msslq"] Oct 09 16:38:37 crc kubenswrapper[4719]: E1009 16:38:37.511560 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcee18df-40d9-46d1-9208-5bb3feae69b7" containerName="registry-server" Oct 09 16:38:37 crc kubenswrapper[4719]: I1009 16:38:37.511691 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcee18df-40d9-46d1-9208-5bb3feae69b7" containerName="registry-server" Oct 09 16:38:37 crc kubenswrapper[4719]: E1009 16:38:37.511793 4719 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="672ad4ad-d4b3-44b4-bfdb-7fe0465acd14" containerName="extract-utilities" Oct 09 16:38:37 crc kubenswrapper[4719]: I1009 16:38:37.511871 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="672ad4ad-d4b3-44b4-bfdb-7fe0465acd14" containerName="extract-utilities" Oct 09 16:38:37 crc kubenswrapper[4719]: E1009 16:38:37.511948 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcee18df-40d9-46d1-9208-5bb3feae69b7" containerName="extract-utilities" Oct 09 16:38:37 crc kubenswrapper[4719]: I1009 16:38:37.512014 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcee18df-40d9-46d1-9208-5bb3feae69b7" containerName="extract-utilities" Oct 09 16:38:37 crc kubenswrapper[4719]: E1009 16:38:37.512121 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcee18df-40d9-46d1-9208-5bb3feae69b7" containerName="extract-content" Oct 09 16:38:37 crc kubenswrapper[4719]: I1009 16:38:37.512228 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcee18df-40d9-46d1-9208-5bb3feae69b7" containerName="extract-content" Oct 09 16:38:37 crc kubenswrapper[4719]: E1009 16:38:37.512328 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="672ad4ad-d4b3-44b4-bfdb-7fe0465acd14" containerName="registry-server" Oct 09 16:38:37 crc kubenswrapper[4719]: I1009 16:38:37.512433 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="672ad4ad-d4b3-44b4-bfdb-7fe0465acd14" containerName="registry-server" Oct 09 16:38:37 crc kubenswrapper[4719]: E1009 16:38:37.512525 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="672ad4ad-d4b3-44b4-bfdb-7fe0465acd14" containerName="extract-content" Oct 09 16:38:37 crc kubenswrapper[4719]: I1009 16:38:37.512598 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="672ad4ad-d4b3-44b4-bfdb-7fe0465acd14" containerName="extract-content" Oct 09 16:38:37 crc kubenswrapper[4719]: I1009 16:38:37.512909 4719 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="fcee18df-40d9-46d1-9208-5bb3feae69b7" containerName="registry-server" Oct 09 16:38:37 crc kubenswrapper[4719]: I1009 16:38:37.513005 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="672ad4ad-d4b3-44b4-bfdb-7fe0465acd14" containerName="registry-server" Oct 09 16:38:37 crc kubenswrapper[4719]: I1009 16:38:37.515023 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-msslq" Oct 09 16:38:37 crc kubenswrapper[4719]: I1009 16:38:37.519612 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-msslq"] Oct 09 16:38:37 crc kubenswrapper[4719]: I1009 16:38:37.532684 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61bd92ed-b8ee-4f85-a38e-55302ec93cbc-utilities\") pod \"redhat-marketplace-msslq\" (UID: \"61bd92ed-b8ee-4f85-a38e-55302ec93cbc\") " pod="openshift-marketplace/redhat-marketplace-msslq" Oct 09 16:38:37 crc kubenswrapper[4719]: I1009 16:38:37.532733 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61bd92ed-b8ee-4f85-a38e-55302ec93cbc-catalog-content\") pod \"redhat-marketplace-msslq\" (UID: \"61bd92ed-b8ee-4f85-a38e-55302ec93cbc\") " pod="openshift-marketplace/redhat-marketplace-msslq" Oct 09 16:38:37 crc kubenswrapper[4719]: I1009 16:38:37.532757 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnsv5\" (UniqueName: \"kubernetes.io/projected/61bd92ed-b8ee-4f85-a38e-55302ec93cbc-kube-api-access-bnsv5\") pod \"redhat-marketplace-msslq\" (UID: \"61bd92ed-b8ee-4f85-a38e-55302ec93cbc\") " pod="openshift-marketplace/redhat-marketplace-msslq" Oct 09 16:38:37 crc kubenswrapper[4719]: I1009 16:38:37.635619 4719 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61bd92ed-b8ee-4f85-a38e-55302ec93cbc-utilities\") pod \"redhat-marketplace-msslq\" (UID: \"61bd92ed-b8ee-4f85-a38e-55302ec93cbc\") " pod="openshift-marketplace/redhat-marketplace-msslq" Oct 09 16:38:37 crc kubenswrapper[4719]: I1009 16:38:37.635685 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61bd92ed-b8ee-4f85-a38e-55302ec93cbc-catalog-content\") pod \"redhat-marketplace-msslq\" (UID: \"61bd92ed-b8ee-4f85-a38e-55302ec93cbc\") " pod="openshift-marketplace/redhat-marketplace-msslq" Oct 09 16:38:37 crc kubenswrapper[4719]: I1009 16:38:37.635725 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnsv5\" (UniqueName: \"kubernetes.io/projected/61bd92ed-b8ee-4f85-a38e-55302ec93cbc-kube-api-access-bnsv5\") pod \"redhat-marketplace-msslq\" (UID: \"61bd92ed-b8ee-4f85-a38e-55302ec93cbc\") " pod="openshift-marketplace/redhat-marketplace-msslq" Oct 09 16:38:37 crc kubenswrapper[4719]: I1009 16:38:37.636291 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61bd92ed-b8ee-4f85-a38e-55302ec93cbc-catalog-content\") pod \"redhat-marketplace-msslq\" (UID: \"61bd92ed-b8ee-4f85-a38e-55302ec93cbc\") " pod="openshift-marketplace/redhat-marketplace-msslq" Oct 09 16:38:37 crc kubenswrapper[4719]: I1009 16:38:37.636314 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61bd92ed-b8ee-4f85-a38e-55302ec93cbc-utilities\") pod \"redhat-marketplace-msslq\" (UID: \"61bd92ed-b8ee-4f85-a38e-55302ec93cbc\") " pod="openshift-marketplace/redhat-marketplace-msslq" Oct 09 16:38:37 crc kubenswrapper[4719]: I1009 16:38:37.663385 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-bnsv5\" (UniqueName: \"kubernetes.io/projected/61bd92ed-b8ee-4f85-a38e-55302ec93cbc-kube-api-access-bnsv5\") pod \"redhat-marketplace-msslq\" (UID: \"61bd92ed-b8ee-4f85-a38e-55302ec93cbc\") " pod="openshift-marketplace/redhat-marketplace-msslq" Oct 09 16:38:37 crc kubenswrapper[4719]: I1009 16:38:37.846627 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-msslq" Oct 09 16:38:38 crc kubenswrapper[4719]: I1009 16:38:38.339060 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-msslq"] Oct 09 16:38:39 crc kubenswrapper[4719]: I1009 16:38:39.072982 4719 generic.go:334] "Generic (PLEG): container finished" podID="61bd92ed-b8ee-4f85-a38e-55302ec93cbc" containerID="29830bdaffeff3f64c50dabb7ebbdddafcdbb61af7dc8cd150e49207d77507a6" exitCode=0 Oct 09 16:38:39 crc kubenswrapper[4719]: I1009 16:38:39.073065 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-msslq" event={"ID":"61bd92ed-b8ee-4f85-a38e-55302ec93cbc","Type":"ContainerDied","Data":"29830bdaffeff3f64c50dabb7ebbdddafcdbb61af7dc8cd150e49207d77507a6"} Oct 09 16:38:39 crc kubenswrapper[4719]: I1009 16:38:39.073330 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-msslq" event={"ID":"61bd92ed-b8ee-4f85-a38e-55302ec93cbc","Type":"ContainerStarted","Data":"507a0f9123a9f86190d34082ff8094df9e336052c5094e3e7976d274ea6a35fd"} Oct 09 16:38:41 crc kubenswrapper[4719]: I1009 16:38:41.094798 4719 generic.go:334] "Generic (PLEG): container finished" podID="61bd92ed-b8ee-4f85-a38e-55302ec93cbc" containerID="470cc5cd17733a6db4b216186c9f3d08c2789b8c0ace7a7a3b972d5978f17a06" exitCode=0 Oct 09 16:38:41 crc kubenswrapper[4719]: I1009 16:38:41.094885 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-msslq" 
event={"ID":"61bd92ed-b8ee-4f85-a38e-55302ec93cbc","Type":"ContainerDied","Data":"470cc5cd17733a6db4b216186c9f3d08c2789b8c0ace7a7a3b972d5978f17a06"} Oct 09 16:38:42 crc kubenswrapper[4719]: I1009 16:38:42.110020 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-msslq" event={"ID":"61bd92ed-b8ee-4f85-a38e-55302ec93cbc","Type":"ContainerStarted","Data":"b870a853bf7e3608801651ea72bfb8b61c441c3d721185bbde4a57a1c668d8f9"} Oct 09 16:38:42 crc kubenswrapper[4719]: I1009 16:38:42.150855 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-msslq" podStartSLOduration=2.755875675 podStartE2EDuration="5.150829982s" podCreationTimestamp="2025-10-09 16:38:37 +0000 UTC" firstStartedPulling="2025-10-09 16:38:39.074597495 +0000 UTC m=+4824.584308780" lastFinishedPulling="2025-10-09 16:38:41.469551812 +0000 UTC m=+4826.979263087" observedRunningTime="2025-10-09 16:38:42.140092898 +0000 UTC m=+4827.649804193" watchObservedRunningTime="2025-10-09 16:38:42.150829982 +0000 UTC m=+4827.660541267" Oct 09 16:38:45 crc kubenswrapper[4719]: I1009 16:38:45.167930 4719 scope.go:117] "RemoveContainer" containerID="ab8e39a1b9738f293da97baf5097a0476e541f0d738db409215643fbcdcb6edb" Oct 09 16:38:45 crc kubenswrapper[4719]: E1009 16:38:45.168421 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:38:46 crc kubenswrapper[4719]: E1009 16:38:46.230244 4719 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.59:57068->38.102.83.59:45053: read tcp 
38.102.83.59:57068->38.102.83.59:45053: read: connection reset by peer Oct 09 16:38:47 crc kubenswrapper[4719]: I1009 16:38:47.846817 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-msslq" Oct 09 16:38:47 crc kubenswrapper[4719]: I1009 16:38:47.847185 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-msslq" Oct 09 16:38:47 crc kubenswrapper[4719]: I1009 16:38:47.910490 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-msslq" Oct 09 16:38:48 crc kubenswrapper[4719]: I1009 16:38:48.225546 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-msslq" Oct 09 16:38:48 crc kubenswrapper[4719]: I1009 16:38:48.272208 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-msslq"] Oct 09 16:38:50 crc kubenswrapper[4719]: I1009 16:38:50.185055 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-msslq" podUID="61bd92ed-b8ee-4f85-a38e-55302ec93cbc" containerName="registry-server" containerID="cri-o://b870a853bf7e3608801651ea72bfb8b61c441c3d721185bbde4a57a1c668d8f9" gracePeriod=2 Oct 09 16:38:50 crc kubenswrapper[4719]: I1009 16:38:50.729721 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-msslq" Oct 09 16:38:50 crc kubenswrapper[4719]: I1009 16:38:50.842881 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnsv5\" (UniqueName: \"kubernetes.io/projected/61bd92ed-b8ee-4f85-a38e-55302ec93cbc-kube-api-access-bnsv5\") pod \"61bd92ed-b8ee-4f85-a38e-55302ec93cbc\" (UID: \"61bd92ed-b8ee-4f85-a38e-55302ec93cbc\") " Oct 09 16:38:50 crc kubenswrapper[4719]: I1009 16:38:50.842967 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61bd92ed-b8ee-4f85-a38e-55302ec93cbc-catalog-content\") pod \"61bd92ed-b8ee-4f85-a38e-55302ec93cbc\" (UID: \"61bd92ed-b8ee-4f85-a38e-55302ec93cbc\") " Oct 09 16:38:50 crc kubenswrapper[4719]: I1009 16:38:50.843220 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61bd92ed-b8ee-4f85-a38e-55302ec93cbc-utilities\") pod \"61bd92ed-b8ee-4f85-a38e-55302ec93cbc\" (UID: \"61bd92ed-b8ee-4f85-a38e-55302ec93cbc\") " Oct 09 16:38:50 crc kubenswrapper[4719]: I1009 16:38:50.844114 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61bd92ed-b8ee-4f85-a38e-55302ec93cbc-utilities" (OuterVolumeSpecName: "utilities") pod "61bd92ed-b8ee-4f85-a38e-55302ec93cbc" (UID: "61bd92ed-b8ee-4f85-a38e-55302ec93cbc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:38:50 crc kubenswrapper[4719]: I1009 16:38:50.851794 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61bd92ed-b8ee-4f85-a38e-55302ec93cbc-kube-api-access-bnsv5" (OuterVolumeSpecName: "kube-api-access-bnsv5") pod "61bd92ed-b8ee-4f85-a38e-55302ec93cbc" (UID: "61bd92ed-b8ee-4f85-a38e-55302ec93cbc"). InnerVolumeSpecName "kube-api-access-bnsv5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 16:38:50 crc kubenswrapper[4719]: I1009 16:38:50.858024 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61bd92ed-b8ee-4f85-a38e-55302ec93cbc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "61bd92ed-b8ee-4f85-a38e-55302ec93cbc" (UID: "61bd92ed-b8ee-4f85-a38e-55302ec93cbc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:38:50 crc kubenswrapper[4719]: I1009 16:38:50.945169 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61bd92ed-b8ee-4f85-a38e-55302ec93cbc-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 16:38:50 crc kubenswrapper[4719]: I1009 16:38:50.945469 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnsv5\" (UniqueName: \"kubernetes.io/projected/61bd92ed-b8ee-4f85-a38e-55302ec93cbc-kube-api-access-bnsv5\") on node \"crc\" DevicePath \"\"" Oct 09 16:38:50 crc kubenswrapper[4719]: I1009 16:38:50.945484 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61bd92ed-b8ee-4f85-a38e-55302ec93cbc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 16:38:51 crc kubenswrapper[4719]: I1009 16:38:51.200612 4719 generic.go:334] "Generic (PLEG): container finished" podID="61bd92ed-b8ee-4f85-a38e-55302ec93cbc" containerID="b870a853bf7e3608801651ea72bfb8b61c441c3d721185bbde4a57a1c668d8f9" exitCode=0 Oct 09 16:38:51 crc kubenswrapper[4719]: I1009 16:38:51.200726 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-msslq" Oct 09 16:38:51 crc kubenswrapper[4719]: I1009 16:38:51.201445 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-msslq" event={"ID":"61bd92ed-b8ee-4f85-a38e-55302ec93cbc","Type":"ContainerDied","Data":"b870a853bf7e3608801651ea72bfb8b61c441c3d721185bbde4a57a1c668d8f9"} Oct 09 16:38:51 crc kubenswrapper[4719]: I1009 16:38:51.201535 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-msslq" event={"ID":"61bd92ed-b8ee-4f85-a38e-55302ec93cbc","Type":"ContainerDied","Data":"507a0f9123a9f86190d34082ff8094df9e336052c5094e3e7976d274ea6a35fd"} Oct 09 16:38:51 crc kubenswrapper[4719]: I1009 16:38:51.201556 4719 scope.go:117] "RemoveContainer" containerID="b870a853bf7e3608801651ea72bfb8b61c441c3d721185bbde4a57a1c668d8f9" Oct 09 16:38:51 crc kubenswrapper[4719]: I1009 16:38:51.236594 4719 scope.go:117] "RemoveContainer" containerID="470cc5cd17733a6db4b216186c9f3d08c2789b8c0ace7a7a3b972d5978f17a06" Oct 09 16:38:51 crc kubenswrapper[4719]: I1009 16:38:51.252518 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-msslq"] Oct 09 16:38:51 crc kubenswrapper[4719]: I1009 16:38:51.265414 4719 scope.go:117] "RemoveContainer" containerID="29830bdaffeff3f64c50dabb7ebbdddafcdbb61af7dc8cd150e49207d77507a6" Oct 09 16:38:51 crc kubenswrapper[4719]: I1009 16:38:51.272394 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-msslq"] Oct 09 16:38:51 crc kubenswrapper[4719]: I1009 16:38:51.320406 4719 scope.go:117] "RemoveContainer" containerID="b870a853bf7e3608801651ea72bfb8b61c441c3d721185bbde4a57a1c668d8f9" Oct 09 16:38:51 crc kubenswrapper[4719]: E1009 16:38:51.320893 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b870a853bf7e3608801651ea72bfb8b61c441c3d721185bbde4a57a1c668d8f9\": container with ID starting with b870a853bf7e3608801651ea72bfb8b61c441c3d721185bbde4a57a1c668d8f9 not found: ID does not exist" containerID="b870a853bf7e3608801651ea72bfb8b61c441c3d721185bbde4a57a1c668d8f9" Oct 09 16:38:51 crc kubenswrapper[4719]: I1009 16:38:51.320938 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b870a853bf7e3608801651ea72bfb8b61c441c3d721185bbde4a57a1c668d8f9"} err="failed to get container status \"b870a853bf7e3608801651ea72bfb8b61c441c3d721185bbde4a57a1c668d8f9\": rpc error: code = NotFound desc = could not find container \"b870a853bf7e3608801651ea72bfb8b61c441c3d721185bbde4a57a1c668d8f9\": container with ID starting with b870a853bf7e3608801651ea72bfb8b61c441c3d721185bbde4a57a1c668d8f9 not found: ID does not exist" Oct 09 16:38:51 crc kubenswrapper[4719]: I1009 16:38:51.320971 4719 scope.go:117] "RemoveContainer" containerID="470cc5cd17733a6db4b216186c9f3d08c2789b8c0ace7a7a3b972d5978f17a06" Oct 09 16:38:51 crc kubenswrapper[4719]: E1009 16:38:51.321244 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"470cc5cd17733a6db4b216186c9f3d08c2789b8c0ace7a7a3b972d5978f17a06\": container with ID starting with 470cc5cd17733a6db4b216186c9f3d08c2789b8c0ace7a7a3b972d5978f17a06 not found: ID does not exist" containerID="470cc5cd17733a6db4b216186c9f3d08c2789b8c0ace7a7a3b972d5978f17a06" Oct 09 16:38:51 crc kubenswrapper[4719]: I1009 16:38:51.321312 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"470cc5cd17733a6db4b216186c9f3d08c2789b8c0ace7a7a3b972d5978f17a06"} err="failed to get container status \"470cc5cd17733a6db4b216186c9f3d08c2789b8c0ace7a7a3b972d5978f17a06\": rpc error: code = NotFound desc = could not find container \"470cc5cd17733a6db4b216186c9f3d08c2789b8c0ace7a7a3b972d5978f17a06\": container with ID 
starting with 470cc5cd17733a6db4b216186c9f3d08c2789b8c0ace7a7a3b972d5978f17a06 not found: ID does not exist" Oct 09 16:38:51 crc kubenswrapper[4719]: I1009 16:38:51.321338 4719 scope.go:117] "RemoveContainer" containerID="29830bdaffeff3f64c50dabb7ebbdddafcdbb61af7dc8cd150e49207d77507a6" Oct 09 16:38:51 crc kubenswrapper[4719]: E1009 16:38:51.321565 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29830bdaffeff3f64c50dabb7ebbdddafcdbb61af7dc8cd150e49207d77507a6\": container with ID starting with 29830bdaffeff3f64c50dabb7ebbdddafcdbb61af7dc8cd150e49207d77507a6 not found: ID does not exist" containerID="29830bdaffeff3f64c50dabb7ebbdddafcdbb61af7dc8cd150e49207d77507a6" Oct 09 16:38:51 crc kubenswrapper[4719]: I1009 16:38:51.321591 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29830bdaffeff3f64c50dabb7ebbdddafcdbb61af7dc8cd150e49207d77507a6"} err="failed to get container status \"29830bdaffeff3f64c50dabb7ebbdddafcdbb61af7dc8cd150e49207d77507a6\": rpc error: code = NotFound desc = could not find container \"29830bdaffeff3f64c50dabb7ebbdddafcdbb61af7dc8cd150e49207d77507a6\": container with ID starting with 29830bdaffeff3f64c50dabb7ebbdddafcdbb61af7dc8cd150e49207d77507a6 not found: ID does not exist" Oct 09 16:38:53 crc kubenswrapper[4719]: I1009 16:38:53.171800 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61bd92ed-b8ee-4f85-a38e-55302ec93cbc" path="/var/lib/kubelet/pods/61bd92ed-b8ee-4f85-a38e-55302ec93cbc/volumes" Oct 09 16:39:00 crc kubenswrapper[4719]: I1009 16:39:00.162199 4719 scope.go:117] "RemoveContainer" containerID="ab8e39a1b9738f293da97baf5097a0476e541f0d738db409215643fbcdcb6edb" Oct 09 16:39:00 crc kubenswrapper[4719]: E1009 16:39:00.163919 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:39:14 crc kubenswrapper[4719]: I1009 16:39:14.162155 4719 scope.go:117] "RemoveContainer" containerID="ab8e39a1b9738f293da97baf5097a0476e541f0d738db409215643fbcdcb6edb" Oct 09 16:39:15 crc kubenswrapper[4719]: I1009 16:39:15.451108 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerStarted","Data":"b4867c20f750888c215fbc5443e23212b58bf4b8c7545fec7bdcbda25a01fb16"} Oct 09 16:41:36 crc kubenswrapper[4719]: I1009 16:41:36.976419 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 16:41:36 crc kubenswrapper[4719]: I1009 16:41:36.976986 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 16:42:06 crc kubenswrapper[4719]: I1009 16:42:06.976948 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 16:42:06 crc kubenswrapper[4719]: I1009 16:42:06.977570 4719 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 16:42:36 crc kubenswrapper[4719]: I1009 16:42:36.976521 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 16:42:36 crc kubenswrapper[4719]: I1009 16:42:36.977142 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 16:42:36 crc kubenswrapper[4719]: I1009 16:42:36.977198 4719 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" Oct 09 16:42:36 crc kubenswrapper[4719]: I1009 16:42:36.978054 4719 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b4867c20f750888c215fbc5443e23212b58bf4b8c7545fec7bdcbda25a01fb16"} pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 16:42:36 crc kubenswrapper[4719]: I1009 16:42:36.978105 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" 
containerID="cri-o://b4867c20f750888c215fbc5443e23212b58bf4b8c7545fec7bdcbda25a01fb16" gracePeriod=600 Oct 09 16:42:37 crc kubenswrapper[4719]: I1009 16:42:37.419776 4719 generic.go:334] "Generic (PLEG): container finished" podID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerID="b4867c20f750888c215fbc5443e23212b58bf4b8c7545fec7bdcbda25a01fb16" exitCode=0 Oct 09 16:42:37 crc kubenswrapper[4719]: I1009 16:42:37.419813 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerDied","Data":"b4867c20f750888c215fbc5443e23212b58bf4b8c7545fec7bdcbda25a01fb16"} Oct 09 16:42:37 crc kubenswrapper[4719]: I1009 16:42:37.420442 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerStarted","Data":"d7d38bf7ba8f9d934644b1093a0a8aa65e0c062854c0b93f0aea2fed354e8199"} Oct 09 16:42:37 crc kubenswrapper[4719]: I1009 16:42:37.420467 4719 scope.go:117] "RemoveContainer" containerID="ab8e39a1b9738f293da97baf5097a0476e541f0d738db409215643fbcdcb6edb" Oct 09 16:45:00 crc kubenswrapper[4719]: I1009 16:45:00.159170 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333805-ltqfv"] Oct 09 16:45:00 crc kubenswrapper[4719]: E1009 16:45:00.160407 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61bd92ed-b8ee-4f85-a38e-55302ec93cbc" containerName="extract-content" Oct 09 16:45:00 crc kubenswrapper[4719]: I1009 16:45:00.160514 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="61bd92ed-b8ee-4f85-a38e-55302ec93cbc" containerName="extract-content" Oct 09 16:45:00 crc kubenswrapper[4719]: E1009 16:45:00.160535 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61bd92ed-b8ee-4f85-a38e-55302ec93cbc" 
containerName="extract-utilities" Oct 09 16:45:00 crc kubenswrapper[4719]: I1009 16:45:00.160541 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="61bd92ed-b8ee-4f85-a38e-55302ec93cbc" containerName="extract-utilities" Oct 09 16:45:00 crc kubenswrapper[4719]: E1009 16:45:00.160563 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61bd92ed-b8ee-4f85-a38e-55302ec93cbc" containerName="registry-server" Oct 09 16:45:00 crc kubenswrapper[4719]: I1009 16:45:00.160569 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="61bd92ed-b8ee-4f85-a38e-55302ec93cbc" containerName="registry-server" Oct 09 16:45:00 crc kubenswrapper[4719]: I1009 16:45:00.160779 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="61bd92ed-b8ee-4f85-a38e-55302ec93cbc" containerName="registry-server" Oct 09 16:45:00 crc kubenswrapper[4719]: I1009 16:45:00.161981 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333805-ltqfv" Oct 09 16:45:00 crc kubenswrapper[4719]: I1009 16:45:00.164609 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 09 16:45:00 crc kubenswrapper[4719]: I1009 16:45:00.165162 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 09 16:45:00 crc kubenswrapper[4719]: I1009 16:45:00.168606 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333805-ltqfv"] Oct 09 16:45:00 crc kubenswrapper[4719]: I1009 16:45:00.236457 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/efa60592-0beb-4820-8d50-53245546af8e-secret-volume\") pod \"collect-profiles-29333805-ltqfv\" (UID: \"efa60592-0beb-4820-8d50-53245546af8e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29333805-ltqfv" Oct 09 16:45:00 crc kubenswrapper[4719]: I1009 16:45:00.236527 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/efa60592-0beb-4820-8d50-53245546af8e-config-volume\") pod \"collect-profiles-29333805-ltqfv\" (UID: \"efa60592-0beb-4820-8d50-53245546af8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333805-ltqfv" Oct 09 16:45:00 crc kubenswrapper[4719]: I1009 16:45:00.236645 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjgls\" (UniqueName: \"kubernetes.io/projected/efa60592-0beb-4820-8d50-53245546af8e-kube-api-access-vjgls\") pod \"collect-profiles-29333805-ltqfv\" (UID: \"efa60592-0beb-4820-8d50-53245546af8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333805-ltqfv" Oct 09 16:45:00 crc kubenswrapper[4719]: I1009 16:45:00.337946 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/efa60592-0beb-4820-8d50-53245546af8e-secret-volume\") pod \"collect-profiles-29333805-ltqfv\" (UID: \"efa60592-0beb-4820-8d50-53245546af8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333805-ltqfv" Oct 09 16:45:00 crc kubenswrapper[4719]: I1009 16:45:00.337998 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/efa60592-0beb-4820-8d50-53245546af8e-config-volume\") pod \"collect-profiles-29333805-ltqfv\" (UID: \"efa60592-0beb-4820-8d50-53245546af8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333805-ltqfv" Oct 09 16:45:00 crc kubenswrapper[4719]: I1009 16:45:00.338065 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjgls\" (UniqueName: 
\"kubernetes.io/projected/efa60592-0beb-4820-8d50-53245546af8e-kube-api-access-vjgls\") pod \"collect-profiles-29333805-ltqfv\" (UID: \"efa60592-0beb-4820-8d50-53245546af8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333805-ltqfv" Oct 09 16:45:00 crc kubenswrapper[4719]: I1009 16:45:00.339082 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/efa60592-0beb-4820-8d50-53245546af8e-config-volume\") pod \"collect-profiles-29333805-ltqfv\" (UID: \"efa60592-0beb-4820-8d50-53245546af8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333805-ltqfv" Oct 09 16:45:00 crc kubenswrapper[4719]: I1009 16:45:00.345120 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/efa60592-0beb-4820-8d50-53245546af8e-secret-volume\") pod \"collect-profiles-29333805-ltqfv\" (UID: \"efa60592-0beb-4820-8d50-53245546af8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333805-ltqfv" Oct 09 16:45:00 crc kubenswrapper[4719]: I1009 16:45:00.359911 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjgls\" (UniqueName: \"kubernetes.io/projected/efa60592-0beb-4820-8d50-53245546af8e-kube-api-access-vjgls\") pod \"collect-profiles-29333805-ltqfv\" (UID: \"efa60592-0beb-4820-8d50-53245546af8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333805-ltqfv" Oct 09 16:45:00 crc kubenswrapper[4719]: I1009 16:45:00.499712 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333805-ltqfv" Oct 09 16:45:00 crc kubenswrapper[4719]: I1009 16:45:00.947267 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333805-ltqfv"] Oct 09 16:45:01 crc kubenswrapper[4719]: I1009 16:45:01.797022 4719 generic.go:334] "Generic (PLEG): container finished" podID="efa60592-0beb-4820-8d50-53245546af8e" containerID="ec5eddfd07b82aec27b275f5acf36c1f1ac5bdfc77c5a66885a2b085202a497e" exitCode=0 Oct 09 16:45:01 crc kubenswrapper[4719]: I1009 16:45:01.797084 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333805-ltqfv" event={"ID":"efa60592-0beb-4820-8d50-53245546af8e","Type":"ContainerDied","Data":"ec5eddfd07b82aec27b275f5acf36c1f1ac5bdfc77c5a66885a2b085202a497e"} Oct 09 16:45:01 crc kubenswrapper[4719]: I1009 16:45:01.797413 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333805-ltqfv" event={"ID":"efa60592-0beb-4820-8d50-53245546af8e","Type":"ContainerStarted","Data":"90a2aff9f499df8e07e0c852d2a5cd2cc0f4b29af49be823f6b691d2dbd3a60b"} Oct 09 16:45:03 crc kubenswrapper[4719]: I1009 16:45:03.185998 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333805-ltqfv" Oct 09 16:45:03 crc kubenswrapper[4719]: I1009 16:45:03.307848 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjgls\" (UniqueName: \"kubernetes.io/projected/efa60592-0beb-4820-8d50-53245546af8e-kube-api-access-vjgls\") pod \"efa60592-0beb-4820-8d50-53245546af8e\" (UID: \"efa60592-0beb-4820-8d50-53245546af8e\") " Oct 09 16:45:03 crc kubenswrapper[4719]: I1009 16:45:03.308156 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/efa60592-0beb-4820-8d50-53245546af8e-config-volume\") pod \"efa60592-0beb-4820-8d50-53245546af8e\" (UID: \"efa60592-0beb-4820-8d50-53245546af8e\") " Oct 09 16:45:03 crc kubenswrapper[4719]: I1009 16:45:03.308345 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/efa60592-0beb-4820-8d50-53245546af8e-secret-volume\") pod \"efa60592-0beb-4820-8d50-53245546af8e\" (UID: \"efa60592-0beb-4820-8d50-53245546af8e\") " Oct 09 16:45:03 crc kubenswrapper[4719]: I1009 16:45:03.309025 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efa60592-0beb-4820-8d50-53245546af8e-config-volume" (OuterVolumeSpecName: "config-volume") pod "efa60592-0beb-4820-8d50-53245546af8e" (UID: "efa60592-0beb-4820-8d50-53245546af8e"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 16:45:03 crc kubenswrapper[4719]: I1009 16:45:03.311810 4719 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/efa60592-0beb-4820-8d50-53245546af8e-config-volume\") on node \"crc\" DevicePath \"\"" Oct 09 16:45:03 crc kubenswrapper[4719]: I1009 16:45:03.319561 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efa60592-0beb-4820-8d50-53245546af8e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "efa60592-0beb-4820-8d50-53245546af8e" (UID: "efa60592-0beb-4820-8d50-53245546af8e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 16:45:03 crc kubenswrapper[4719]: I1009 16:45:03.319853 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efa60592-0beb-4820-8d50-53245546af8e-kube-api-access-vjgls" (OuterVolumeSpecName: "kube-api-access-vjgls") pod "efa60592-0beb-4820-8d50-53245546af8e" (UID: "efa60592-0beb-4820-8d50-53245546af8e"). InnerVolumeSpecName "kube-api-access-vjgls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 16:45:03 crc kubenswrapper[4719]: I1009 16:45:03.413768 4719 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/efa60592-0beb-4820-8d50-53245546af8e-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 09 16:45:03 crc kubenswrapper[4719]: I1009 16:45:03.413812 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjgls\" (UniqueName: \"kubernetes.io/projected/efa60592-0beb-4820-8d50-53245546af8e-kube-api-access-vjgls\") on node \"crc\" DevicePath \"\"" Oct 09 16:45:03 crc kubenswrapper[4719]: I1009 16:45:03.818184 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333805-ltqfv" event={"ID":"efa60592-0beb-4820-8d50-53245546af8e","Type":"ContainerDied","Data":"90a2aff9f499df8e07e0c852d2a5cd2cc0f4b29af49be823f6b691d2dbd3a60b"} Oct 09 16:45:03 crc kubenswrapper[4719]: I1009 16:45:03.818224 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333805-ltqfv" Oct 09 16:45:03 crc kubenswrapper[4719]: I1009 16:45:03.818227 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90a2aff9f499df8e07e0c852d2a5cd2cc0f4b29af49be823f6b691d2dbd3a60b" Oct 09 16:45:04 crc kubenswrapper[4719]: I1009 16:45:04.254990 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333760-zgd5q"] Oct 09 16:45:04 crc kubenswrapper[4719]: I1009 16:45:04.262801 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333760-zgd5q"] Oct 09 16:45:05 crc kubenswrapper[4719]: I1009 16:45:05.171990 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="923d9a5f-f038-4d7f-877c-cf0f4c970a59" path="/var/lib/kubelet/pods/923d9a5f-f038-4d7f-877c-cf0f4c970a59/volumes" Oct 09 16:45:06 crc kubenswrapper[4719]: I1009 16:45:06.977139 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 16:45:06 crc kubenswrapper[4719]: I1009 16:45:06.977776 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 16:45:28 crc kubenswrapper[4719]: I1009 16:45:28.147223 4719 scope.go:117] "RemoveContainer" containerID="6b80fab1e5a80d5afb83105a8c55745a7d0fdd547f3964c823b163620b3c5c18" Oct 09 16:45:36 crc kubenswrapper[4719]: I1009 16:45:36.976402 4719 patch_prober.go:28] interesting 
pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 16:45:36 crc kubenswrapper[4719]: I1009 16:45:36.977279 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 16:46:06 crc kubenswrapper[4719]: I1009 16:46:06.976908 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 16:46:06 crc kubenswrapper[4719]: I1009 16:46:06.977834 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 16:46:06 crc kubenswrapper[4719]: I1009 16:46:06.977893 4719 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" Oct 09 16:46:06 crc kubenswrapper[4719]: I1009 16:46:06.978771 4719 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d7d38bf7ba8f9d934644b1093a0a8aa65e0c062854c0b93f0aea2fed354e8199"} pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Oct 09 16:46:06 crc kubenswrapper[4719]: I1009 16:46:06.978817 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" containerID="cri-o://d7d38bf7ba8f9d934644b1093a0a8aa65e0c062854c0b93f0aea2fed354e8199" gracePeriod=600 Oct 09 16:46:07 crc kubenswrapper[4719]: E1009 16:46:07.104943 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:46:07 crc kubenswrapper[4719]: I1009 16:46:07.430385 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerDied","Data":"d7d38bf7ba8f9d934644b1093a0a8aa65e0c062854c0b93f0aea2fed354e8199"} Oct 09 16:46:07 crc kubenswrapper[4719]: I1009 16:46:07.430391 4719 generic.go:334] "Generic (PLEG): container finished" podID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerID="d7d38bf7ba8f9d934644b1093a0a8aa65e0c062854c0b93f0aea2fed354e8199" exitCode=0 Oct 09 16:46:07 crc kubenswrapper[4719]: I1009 16:46:07.430457 4719 scope.go:117] "RemoveContainer" containerID="b4867c20f750888c215fbc5443e23212b58bf4b8c7545fec7bdcbda25a01fb16" Oct 09 16:46:07 crc kubenswrapper[4719]: I1009 16:46:07.431837 4719 scope.go:117] "RemoveContainer" containerID="d7d38bf7ba8f9d934644b1093a0a8aa65e0c062854c0b93f0aea2fed354e8199" Oct 09 16:46:07 crc kubenswrapper[4719]: E1009 16:46:07.432504 4719 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:46:17 crc kubenswrapper[4719]: I1009 16:46:17.825057 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pxmfd"] Oct 09 16:46:17 crc kubenswrapper[4719]: E1009 16:46:17.826558 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efa60592-0beb-4820-8d50-53245546af8e" containerName="collect-profiles" Oct 09 16:46:17 crc kubenswrapper[4719]: I1009 16:46:17.826575 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="efa60592-0beb-4820-8d50-53245546af8e" containerName="collect-profiles" Oct 09 16:46:17 crc kubenswrapper[4719]: I1009 16:46:17.826891 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="efa60592-0beb-4820-8d50-53245546af8e" containerName="collect-profiles" Oct 09 16:46:17 crc kubenswrapper[4719]: I1009 16:46:17.828411 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pxmfd" Oct 09 16:46:17 crc kubenswrapper[4719]: I1009 16:46:17.845059 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pxmfd"] Oct 09 16:46:17 crc kubenswrapper[4719]: I1009 16:46:17.993257 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3979caa6-b9fc-4c04-80ed-5f684700e060-utilities\") pod \"redhat-operators-pxmfd\" (UID: \"3979caa6-b9fc-4c04-80ed-5f684700e060\") " pod="openshift-marketplace/redhat-operators-pxmfd" Oct 09 16:46:17 crc kubenswrapper[4719]: I1009 16:46:17.993332 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3979caa6-b9fc-4c04-80ed-5f684700e060-catalog-content\") pod \"redhat-operators-pxmfd\" (UID: \"3979caa6-b9fc-4c04-80ed-5f684700e060\") " pod="openshift-marketplace/redhat-operators-pxmfd" Oct 09 16:46:17 crc kubenswrapper[4719]: I1009 16:46:17.993598 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97776\" (UniqueName: \"kubernetes.io/projected/3979caa6-b9fc-4c04-80ed-5f684700e060-kube-api-access-97776\") pod \"redhat-operators-pxmfd\" (UID: \"3979caa6-b9fc-4c04-80ed-5f684700e060\") " pod="openshift-marketplace/redhat-operators-pxmfd" Oct 09 16:46:18 crc kubenswrapper[4719]: I1009 16:46:18.097336 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3979caa6-b9fc-4c04-80ed-5f684700e060-utilities\") pod \"redhat-operators-pxmfd\" (UID: \"3979caa6-b9fc-4c04-80ed-5f684700e060\") " pod="openshift-marketplace/redhat-operators-pxmfd" Oct 09 16:46:18 crc kubenswrapper[4719]: I1009 16:46:18.097464 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3979caa6-b9fc-4c04-80ed-5f684700e060-catalog-content\") pod \"redhat-operators-pxmfd\" (UID: \"3979caa6-b9fc-4c04-80ed-5f684700e060\") " pod="openshift-marketplace/redhat-operators-pxmfd" Oct 09 16:46:18 crc kubenswrapper[4719]: I1009 16:46:18.097634 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97776\" (UniqueName: \"kubernetes.io/projected/3979caa6-b9fc-4c04-80ed-5f684700e060-kube-api-access-97776\") pod \"redhat-operators-pxmfd\" (UID: \"3979caa6-b9fc-4c04-80ed-5f684700e060\") " pod="openshift-marketplace/redhat-operators-pxmfd" Oct 09 16:46:18 crc kubenswrapper[4719]: I1009 16:46:18.097900 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3979caa6-b9fc-4c04-80ed-5f684700e060-utilities\") pod \"redhat-operators-pxmfd\" (UID: \"3979caa6-b9fc-4c04-80ed-5f684700e060\") " pod="openshift-marketplace/redhat-operators-pxmfd" Oct 09 16:46:18 crc kubenswrapper[4719]: I1009 16:46:18.097939 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3979caa6-b9fc-4c04-80ed-5f684700e060-catalog-content\") pod \"redhat-operators-pxmfd\" (UID: \"3979caa6-b9fc-4c04-80ed-5f684700e060\") " pod="openshift-marketplace/redhat-operators-pxmfd" Oct 09 16:46:18 crc kubenswrapper[4719]: I1009 16:46:18.128335 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97776\" (UniqueName: \"kubernetes.io/projected/3979caa6-b9fc-4c04-80ed-5f684700e060-kube-api-access-97776\") pod \"redhat-operators-pxmfd\" (UID: \"3979caa6-b9fc-4c04-80ed-5f684700e060\") " pod="openshift-marketplace/redhat-operators-pxmfd" Oct 09 16:46:18 crc kubenswrapper[4719]: I1009 16:46:18.163780 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pxmfd" Oct 09 16:46:18 crc kubenswrapper[4719]: I1009 16:46:18.669240 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pxmfd"] Oct 09 16:46:19 crc kubenswrapper[4719]: I1009 16:46:19.161954 4719 scope.go:117] "RemoveContainer" containerID="d7d38bf7ba8f9d934644b1093a0a8aa65e0c062854c0b93f0aea2fed354e8199" Oct 09 16:46:19 crc kubenswrapper[4719]: E1009 16:46:19.162664 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:46:19 crc kubenswrapper[4719]: I1009 16:46:19.561541 4719 generic.go:334] "Generic (PLEG): container finished" podID="3979caa6-b9fc-4c04-80ed-5f684700e060" containerID="54b2bcecc7dde0dd538c02f43088856d55e098ec5e9e1b56731f2afa627890f7" exitCode=0 Oct 09 16:46:19 crc kubenswrapper[4719]: I1009 16:46:19.561587 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pxmfd" event={"ID":"3979caa6-b9fc-4c04-80ed-5f684700e060","Type":"ContainerDied","Data":"54b2bcecc7dde0dd538c02f43088856d55e098ec5e9e1b56731f2afa627890f7"} Oct 09 16:46:19 crc kubenswrapper[4719]: I1009 16:46:19.561613 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pxmfd" event={"ID":"3979caa6-b9fc-4c04-80ed-5f684700e060","Type":"ContainerStarted","Data":"89eaa19c2d5a075118948e6225996e307f27becd8eb0aaea12950efc88b27069"} Oct 09 16:46:19 crc kubenswrapper[4719]: I1009 16:46:19.563805 4719 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 16:46:21 crc 
kubenswrapper[4719]: I1009 16:46:21.580688 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pxmfd" event={"ID":"3979caa6-b9fc-4c04-80ed-5f684700e060","Type":"ContainerStarted","Data":"f9e3ef16aaee0e8a935c603eb328a2a00257dbb60f7203c928a26594c812ae40"} Oct 09 16:46:24 crc kubenswrapper[4719]: I1009 16:46:24.611299 4719 generic.go:334] "Generic (PLEG): container finished" podID="3979caa6-b9fc-4c04-80ed-5f684700e060" containerID="f9e3ef16aaee0e8a935c603eb328a2a00257dbb60f7203c928a26594c812ae40" exitCode=0 Oct 09 16:46:24 crc kubenswrapper[4719]: I1009 16:46:24.611479 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pxmfd" event={"ID":"3979caa6-b9fc-4c04-80ed-5f684700e060","Type":"ContainerDied","Data":"f9e3ef16aaee0e8a935c603eb328a2a00257dbb60f7203c928a26594c812ae40"} Oct 09 16:46:25 crc kubenswrapper[4719]: I1009 16:46:25.624689 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pxmfd" event={"ID":"3979caa6-b9fc-4c04-80ed-5f684700e060","Type":"ContainerStarted","Data":"8d6bad8fa357b7b2182b320bd5eb3f4c75530bfa50b575a095201af76658d4a1"} Oct 09 16:46:25 crc kubenswrapper[4719]: I1009 16:46:25.646101 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pxmfd" podStartSLOduration=2.873110037 podStartE2EDuration="8.646085705s" podCreationTimestamp="2025-10-09 16:46:17 +0000 UTC" firstStartedPulling="2025-10-09 16:46:19.563605297 +0000 UTC m=+5285.073316572" lastFinishedPulling="2025-10-09 16:46:25.336580955 +0000 UTC m=+5290.846292240" observedRunningTime="2025-10-09 16:46:25.643500783 +0000 UTC m=+5291.153212088" watchObservedRunningTime="2025-10-09 16:46:25.646085705 +0000 UTC m=+5291.155796990" Oct 09 16:46:27 crc kubenswrapper[4719]: I1009 16:46:27.642042 4719 generic.go:334] "Generic (PLEG): container finished" podID="8905824a-8f15-4df7-b938-b63f2a5aebb1" 
containerID="53273a4a10d0010e94e9c43e9a4573892691b9349c2b9e5ed482a717c208480c" exitCode=0 Oct 09 16:46:27 crc kubenswrapper[4719]: I1009 16:46:27.642132 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"8905824a-8f15-4df7-b938-b63f2a5aebb1","Type":"ContainerDied","Data":"53273a4a10d0010e94e9c43e9a4573892691b9349c2b9e5ed482a717c208480c"} Oct 09 16:46:28 crc kubenswrapper[4719]: I1009 16:46:28.164864 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pxmfd" Oct 09 16:46:28 crc kubenswrapper[4719]: I1009 16:46:28.164903 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pxmfd" Oct 09 16:46:29 crc kubenswrapper[4719]: I1009 16:46:29.211605 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pxmfd" podUID="3979caa6-b9fc-4c04-80ed-5f684700e060" containerName="registry-server" probeResult="failure" output=< Oct 09 16:46:29 crc kubenswrapper[4719]: timeout: failed to connect service ":50051" within 1s Oct 09 16:46:29 crc kubenswrapper[4719]: > Oct 09 16:46:29 crc kubenswrapper[4719]: I1009 16:46:29.662279 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"8905824a-8f15-4df7-b938-b63f2a5aebb1","Type":"ContainerDied","Data":"2f68ed9af2b0741aa503f9e940fa5a4c11805bd4e49ec880fb00e2ec7ed1b8d3"} Oct 09 16:46:29 crc kubenswrapper[4719]: I1009 16:46:29.662317 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f68ed9af2b0741aa503f9e940fa5a4c11805bd4e49ec880fb00e2ec7ed1b8d3" Oct 09 16:46:29 crc kubenswrapper[4719]: I1009 16:46:29.713802 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 09 16:46:29 crc kubenswrapper[4719]: I1009 16:46:29.839944 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/8905824a-8f15-4df7-b938-b63f2a5aebb1-ca-certs\") pod \"8905824a-8f15-4df7-b938-b63f2a5aebb1\" (UID: \"8905824a-8f15-4df7-b938-b63f2a5aebb1\") " Oct 09 16:46:29 crc kubenswrapper[4719]: I1009 16:46:29.839983 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/8905824a-8f15-4df7-b938-b63f2a5aebb1-test-operator-ephemeral-temporary\") pod \"8905824a-8f15-4df7-b938-b63f2a5aebb1\" (UID: \"8905824a-8f15-4df7-b938-b63f2a5aebb1\") " Oct 09 16:46:29 crc kubenswrapper[4719]: I1009 16:46:29.840122 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"8905824a-8f15-4df7-b938-b63f2a5aebb1\" (UID: \"8905824a-8f15-4df7-b938-b63f2a5aebb1\") " Oct 09 16:46:29 crc kubenswrapper[4719]: I1009 16:46:29.840141 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8905824a-8f15-4df7-b938-b63f2a5aebb1-ssh-key\") pod \"8905824a-8f15-4df7-b938-b63f2a5aebb1\" (UID: \"8905824a-8f15-4df7-b938-b63f2a5aebb1\") " Oct 09 16:46:29 crc kubenswrapper[4719]: I1009 16:46:29.840186 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8905824a-8f15-4df7-b938-b63f2a5aebb1-openstack-config\") pod \"8905824a-8f15-4df7-b938-b63f2a5aebb1\" (UID: \"8905824a-8f15-4df7-b938-b63f2a5aebb1\") " Oct 09 16:46:29 crc kubenswrapper[4719]: I1009 16:46:29.840205 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ch8x\" 
(UniqueName: \"kubernetes.io/projected/8905824a-8f15-4df7-b938-b63f2a5aebb1-kube-api-access-7ch8x\") pod \"8905824a-8f15-4df7-b938-b63f2a5aebb1\" (UID: \"8905824a-8f15-4df7-b938-b63f2a5aebb1\") " Oct 09 16:46:29 crc kubenswrapper[4719]: I1009 16:46:29.840283 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8905824a-8f15-4df7-b938-b63f2a5aebb1-openstack-config-secret\") pod \"8905824a-8f15-4df7-b938-b63f2a5aebb1\" (UID: \"8905824a-8f15-4df7-b938-b63f2a5aebb1\") " Oct 09 16:46:29 crc kubenswrapper[4719]: I1009 16:46:29.840308 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/8905824a-8f15-4df7-b938-b63f2a5aebb1-test-operator-ephemeral-workdir\") pod \"8905824a-8f15-4df7-b938-b63f2a5aebb1\" (UID: \"8905824a-8f15-4df7-b938-b63f2a5aebb1\") " Oct 09 16:46:29 crc kubenswrapper[4719]: I1009 16:46:29.840376 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8905824a-8f15-4df7-b938-b63f2a5aebb1-config-data\") pod \"8905824a-8f15-4df7-b938-b63f2a5aebb1\" (UID: \"8905824a-8f15-4df7-b938-b63f2a5aebb1\") " Oct 09 16:46:29 crc kubenswrapper[4719]: I1009 16:46:29.840822 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8905824a-8f15-4df7-b938-b63f2a5aebb1-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "8905824a-8f15-4df7-b938-b63f2a5aebb1" (UID: "8905824a-8f15-4df7-b938-b63f2a5aebb1"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:46:29 crc kubenswrapper[4719]: I1009 16:46:29.841174 4719 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/8905824a-8f15-4df7-b938-b63f2a5aebb1-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 09 16:46:29 crc kubenswrapper[4719]: I1009 16:46:29.841552 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8905824a-8f15-4df7-b938-b63f2a5aebb1-config-data" (OuterVolumeSpecName: "config-data") pod "8905824a-8f15-4df7-b938-b63f2a5aebb1" (UID: "8905824a-8f15-4df7-b938-b63f2a5aebb1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 16:46:29 crc kubenswrapper[4719]: I1009 16:46:29.846237 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8905824a-8f15-4df7-b938-b63f2a5aebb1-kube-api-access-7ch8x" (OuterVolumeSpecName: "kube-api-access-7ch8x") pod "8905824a-8f15-4df7-b938-b63f2a5aebb1" (UID: "8905824a-8f15-4df7-b938-b63f2a5aebb1"). InnerVolumeSpecName "kube-api-access-7ch8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 16:46:29 crc kubenswrapper[4719]: I1009 16:46:29.848537 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8905824a-8f15-4df7-b938-b63f2a5aebb1-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "8905824a-8f15-4df7-b938-b63f2a5aebb1" (UID: "8905824a-8f15-4df7-b938-b63f2a5aebb1"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:46:29 crc kubenswrapper[4719]: I1009 16:46:29.859582 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "test-operator-logs") pod "8905824a-8f15-4df7-b938-b63f2a5aebb1" (UID: "8905824a-8f15-4df7-b938-b63f2a5aebb1"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 09 16:46:29 crc kubenswrapper[4719]: I1009 16:46:29.871400 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8905824a-8f15-4df7-b938-b63f2a5aebb1-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "8905824a-8f15-4df7-b938-b63f2a5aebb1" (UID: "8905824a-8f15-4df7-b938-b63f2a5aebb1"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 16:46:29 crc kubenswrapper[4719]: I1009 16:46:29.872875 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8905824a-8f15-4df7-b938-b63f2a5aebb1-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "8905824a-8f15-4df7-b938-b63f2a5aebb1" (UID: "8905824a-8f15-4df7-b938-b63f2a5aebb1"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 16:46:29 crc kubenswrapper[4719]: I1009 16:46:29.879792 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8905824a-8f15-4df7-b938-b63f2a5aebb1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8905824a-8f15-4df7-b938-b63f2a5aebb1" (UID: "8905824a-8f15-4df7-b938-b63f2a5aebb1"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 16:46:29 crc kubenswrapper[4719]: I1009 16:46:29.909476 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8905824a-8f15-4df7-b938-b63f2a5aebb1-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "8905824a-8f15-4df7-b938-b63f2a5aebb1" (UID: "8905824a-8f15-4df7-b938-b63f2a5aebb1"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 16:46:29 crc kubenswrapper[4719]: I1009 16:46:29.943539 4719 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8905824a-8f15-4df7-b938-b63f2a5aebb1-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 09 16:46:29 crc kubenswrapper[4719]: I1009 16:46:29.943769 4719 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/8905824a-8f15-4df7-b938-b63f2a5aebb1-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 09 16:46:29 crc kubenswrapper[4719]: I1009 16:46:29.943834 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8905824a-8f15-4df7-b938-b63f2a5aebb1-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 16:46:29 crc kubenswrapper[4719]: I1009 16:46:29.943898 4719 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/8905824a-8f15-4df7-b938-b63f2a5aebb1-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 09 16:46:29 crc kubenswrapper[4719]: I1009 16:46:29.944023 4719 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 09 16:46:29 crc kubenswrapper[4719]: I1009 16:46:29.944102 4719 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" 
(UniqueName: \"kubernetes.io/secret/8905824a-8f15-4df7-b938-b63f2a5aebb1-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 16:46:29 crc kubenswrapper[4719]: I1009 16:46:29.944181 4719 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8905824a-8f15-4df7-b938-b63f2a5aebb1-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 09 16:46:29 crc kubenswrapper[4719]: I1009 16:46:29.944239 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ch8x\" (UniqueName: \"kubernetes.io/projected/8905824a-8f15-4df7-b938-b63f2a5aebb1-kube-api-access-7ch8x\") on node \"crc\" DevicePath \"\"" Oct 09 16:46:29 crc kubenswrapper[4719]: I1009 16:46:29.985387 4719 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 09 16:46:30 crc kubenswrapper[4719]: I1009 16:46:30.046658 4719 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 09 16:46:30 crc kubenswrapper[4719]: I1009 16:46:30.160715 4719 scope.go:117] "RemoveContainer" containerID="d7d38bf7ba8f9d934644b1093a0a8aa65e0c062854c0b93f0aea2fed354e8199" Oct 09 16:46:30 crc kubenswrapper[4719]: E1009 16:46:30.161061 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:46:30 crc kubenswrapper[4719]: I1009 16:46:30.669667 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 09 16:46:30 crc kubenswrapper[4719]: E1009 16:46:30.832715 4719 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8905824a_8f15_4df7_b938_b63f2a5aebb1.slice/crio-2f68ed9af2b0741aa503f9e940fa5a4c11805bd4e49ec880fb00e2ec7ed1b8d3\": RecentStats: unable to find data in memory cache]" Oct 09 16:46:38 crc kubenswrapper[4719]: I1009 16:46:38.052556 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 09 16:46:38 crc kubenswrapper[4719]: E1009 16:46:38.054803 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8905824a-8f15-4df7-b938-b63f2a5aebb1" containerName="tempest-tests-tempest-tests-runner" Oct 09 16:46:38 crc kubenswrapper[4719]: I1009 16:46:38.054897 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="8905824a-8f15-4df7-b938-b63f2a5aebb1" containerName="tempest-tests-tempest-tests-runner" Oct 09 16:46:38 crc kubenswrapper[4719]: I1009 16:46:38.055211 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="8905824a-8f15-4df7-b938-b63f2a5aebb1" containerName="tempest-tests-tempest-tests-runner" Oct 09 16:46:38 crc kubenswrapper[4719]: I1009 16:46:38.056085 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 16:46:38 crc kubenswrapper[4719]: I1009 16:46:38.058193 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-vfn4v" Oct 09 16:46:38 crc kubenswrapper[4719]: I1009 16:46:38.067774 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 09 16:46:38 crc kubenswrapper[4719]: I1009 16:46:38.236050 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0677443b-23bd-4727-a28c-34f602835052\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 16:46:38 crc kubenswrapper[4719]: I1009 16:46:38.236552 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj5jn\" (UniqueName: \"kubernetes.io/projected/0677443b-23bd-4727-a28c-34f602835052-kube-api-access-bj5jn\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0677443b-23bd-4727-a28c-34f602835052\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 16:46:38 crc kubenswrapper[4719]: I1009 16:46:38.338841 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0677443b-23bd-4727-a28c-34f602835052\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 16:46:38 crc kubenswrapper[4719]: I1009 16:46:38.338907 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj5jn\" (UniqueName: 
\"kubernetes.io/projected/0677443b-23bd-4727-a28c-34f602835052-kube-api-access-bj5jn\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0677443b-23bd-4727-a28c-34f602835052\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 16:46:38 crc kubenswrapper[4719]: I1009 16:46:38.339910 4719 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0677443b-23bd-4727-a28c-34f602835052\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 16:46:38 crc kubenswrapper[4719]: I1009 16:46:38.368827 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj5jn\" (UniqueName: \"kubernetes.io/projected/0677443b-23bd-4727-a28c-34f602835052-kube-api-access-bj5jn\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0677443b-23bd-4727-a28c-34f602835052\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 16:46:38 crc kubenswrapper[4719]: I1009 16:46:38.385150 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0677443b-23bd-4727-a28c-34f602835052\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 16:46:38 crc kubenswrapper[4719]: I1009 16:46:38.680977 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 16:46:39 crc kubenswrapper[4719]: I1009 16:46:39.149044 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 09 16:46:39 crc kubenswrapper[4719]: I1009 16:46:39.213180 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pxmfd" podUID="3979caa6-b9fc-4c04-80ed-5f684700e060" containerName="registry-server" probeResult="failure" output=< Oct 09 16:46:39 crc kubenswrapper[4719]: timeout: failed to connect service ":50051" within 1s Oct 09 16:46:39 crc kubenswrapper[4719]: > Oct 09 16:46:39 crc kubenswrapper[4719]: I1009 16:46:39.753620 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"0677443b-23bd-4727-a28c-34f602835052","Type":"ContainerStarted","Data":"eea37ce7eeac952219fe280cf792018a90762c232f671819ea02c57a00636674"} Oct 09 16:46:40 crc kubenswrapper[4719]: I1009 16:46:40.768622 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"0677443b-23bd-4727-a28c-34f602835052","Type":"ContainerStarted","Data":"0dbe7d3a08af95710461b090c77b895ecdb14e4e2cdc9460944a684ea950464a"} Oct 09 16:46:40 crc kubenswrapper[4719]: I1009 16:46:40.788370 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.682909922 podStartE2EDuration="2.788344662s" podCreationTimestamp="2025-10-09 16:46:38 +0000 UTC" firstStartedPulling="2025-10-09 16:46:39.160394351 +0000 UTC m=+5304.670105626" lastFinishedPulling="2025-10-09 16:46:40.265829081 +0000 UTC m=+5305.775540366" observedRunningTime="2025-10-09 16:46:40.784713276 +0000 UTC m=+5306.294424571" watchObservedRunningTime="2025-10-09 16:46:40.788344662 
+0000 UTC m=+5306.298055947" Oct 09 16:46:41 crc kubenswrapper[4719]: I1009 16:46:41.161280 4719 scope.go:117] "RemoveContainer" containerID="d7d38bf7ba8f9d934644b1093a0a8aa65e0c062854c0b93f0aea2fed354e8199" Oct 09 16:46:41 crc kubenswrapper[4719]: E1009 16:46:41.161976 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:46:48 crc kubenswrapper[4719]: I1009 16:46:48.216456 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pxmfd" Oct 09 16:46:48 crc kubenswrapper[4719]: I1009 16:46:48.281387 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pxmfd" Oct 09 16:46:49 crc kubenswrapper[4719]: I1009 16:46:49.026856 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pxmfd"] Oct 09 16:46:49 crc kubenswrapper[4719]: I1009 16:46:49.853305 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pxmfd" podUID="3979caa6-b9fc-4c04-80ed-5f684700e060" containerName="registry-server" containerID="cri-o://8d6bad8fa357b7b2182b320bd5eb3f4c75530bfa50b575a095201af76658d4a1" gracePeriod=2 Oct 09 16:46:50 crc kubenswrapper[4719]: I1009 16:46:50.358183 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pxmfd" Oct 09 16:46:50 crc kubenswrapper[4719]: I1009 16:46:50.435048 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3979caa6-b9fc-4c04-80ed-5f684700e060-utilities\") pod \"3979caa6-b9fc-4c04-80ed-5f684700e060\" (UID: \"3979caa6-b9fc-4c04-80ed-5f684700e060\") " Oct 09 16:46:50 crc kubenswrapper[4719]: I1009 16:46:50.435229 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3979caa6-b9fc-4c04-80ed-5f684700e060-catalog-content\") pod \"3979caa6-b9fc-4c04-80ed-5f684700e060\" (UID: \"3979caa6-b9fc-4c04-80ed-5f684700e060\") " Oct 09 16:46:50 crc kubenswrapper[4719]: I1009 16:46:50.435375 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97776\" (UniqueName: \"kubernetes.io/projected/3979caa6-b9fc-4c04-80ed-5f684700e060-kube-api-access-97776\") pod \"3979caa6-b9fc-4c04-80ed-5f684700e060\" (UID: \"3979caa6-b9fc-4c04-80ed-5f684700e060\") " Oct 09 16:46:50 crc kubenswrapper[4719]: I1009 16:46:50.436992 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3979caa6-b9fc-4c04-80ed-5f684700e060-utilities" (OuterVolumeSpecName: "utilities") pod "3979caa6-b9fc-4c04-80ed-5f684700e060" (UID: "3979caa6-b9fc-4c04-80ed-5f684700e060"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:46:50 crc kubenswrapper[4719]: I1009 16:46:50.442913 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3979caa6-b9fc-4c04-80ed-5f684700e060-kube-api-access-97776" (OuterVolumeSpecName: "kube-api-access-97776") pod "3979caa6-b9fc-4c04-80ed-5f684700e060" (UID: "3979caa6-b9fc-4c04-80ed-5f684700e060"). InnerVolumeSpecName "kube-api-access-97776". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 16:46:50 crc kubenswrapper[4719]: I1009 16:46:50.514939 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3979caa6-b9fc-4c04-80ed-5f684700e060-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3979caa6-b9fc-4c04-80ed-5f684700e060" (UID: "3979caa6-b9fc-4c04-80ed-5f684700e060"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:46:50 crc kubenswrapper[4719]: I1009 16:46:50.538429 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97776\" (UniqueName: \"kubernetes.io/projected/3979caa6-b9fc-4c04-80ed-5f684700e060-kube-api-access-97776\") on node \"crc\" DevicePath \"\"" Oct 09 16:46:50 crc kubenswrapper[4719]: I1009 16:46:50.538472 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3979caa6-b9fc-4c04-80ed-5f684700e060-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 16:46:50 crc kubenswrapper[4719]: I1009 16:46:50.538488 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3979caa6-b9fc-4c04-80ed-5f684700e060-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 16:46:50 crc kubenswrapper[4719]: I1009 16:46:50.867144 4719 generic.go:334] "Generic (PLEG): container finished" podID="3979caa6-b9fc-4c04-80ed-5f684700e060" containerID="8d6bad8fa357b7b2182b320bd5eb3f4c75530bfa50b575a095201af76658d4a1" exitCode=0 Oct 09 16:46:50 crc kubenswrapper[4719]: I1009 16:46:50.867521 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pxmfd" event={"ID":"3979caa6-b9fc-4c04-80ed-5f684700e060","Type":"ContainerDied","Data":"8d6bad8fa357b7b2182b320bd5eb3f4c75530bfa50b575a095201af76658d4a1"} Oct 09 16:46:50 crc kubenswrapper[4719]: I1009 16:46:50.867625 4719 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-pxmfd" event={"ID":"3979caa6-b9fc-4c04-80ed-5f684700e060","Type":"ContainerDied","Data":"89eaa19c2d5a075118948e6225996e307f27becd8eb0aaea12950efc88b27069"} Oct 09 16:46:50 crc kubenswrapper[4719]: I1009 16:46:50.867667 4719 scope.go:117] "RemoveContainer" containerID="8d6bad8fa357b7b2182b320bd5eb3f4c75530bfa50b575a095201af76658d4a1" Oct 09 16:46:50 crc kubenswrapper[4719]: I1009 16:46:50.867565 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pxmfd" Oct 09 16:46:50 crc kubenswrapper[4719]: I1009 16:46:50.908599 4719 scope.go:117] "RemoveContainer" containerID="f9e3ef16aaee0e8a935c603eb328a2a00257dbb60f7203c928a26594c812ae40" Oct 09 16:46:50 crc kubenswrapper[4719]: I1009 16:46:50.915483 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pxmfd"] Oct 09 16:46:50 crc kubenswrapper[4719]: I1009 16:46:50.926696 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pxmfd"] Oct 09 16:46:50 crc kubenswrapper[4719]: I1009 16:46:50.946468 4719 scope.go:117] "RemoveContainer" containerID="54b2bcecc7dde0dd538c02f43088856d55e098ec5e9e1b56731f2afa627890f7" Oct 09 16:46:50 crc kubenswrapper[4719]: I1009 16:46:50.991121 4719 scope.go:117] "RemoveContainer" containerID="8d6bad8fa357b7b2182b320bd5eb3f4c75530bfa50b575a095201af76658d4a1" Oct 09 16:46:50 crc kubenswrapper[4719]: E1009 16:46:50.991677 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d6bad8fa357b7b2182b320bd5eb3f4c75530bfa50b575a095201af76658d4a1\": container with ID starting with 8d6bad8fa357b7b2182b320bd5eb3f4c75530bfa50b575a095201af76658d4a1 not found: ID does not exist" containerID="8d6bad8fa357b7b2182b320bd5eb3f4c75530bfa50b575a095201af76658d4a1" Oct 09 16:46:50 crc kubenswrapper[4719]: I1009 16:46:50.991720 4719 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d6bad8fa357b7b2182b320bd5eb3f4c75530bfa50b575a095201af76658d4a1"} err="failed to get container status \"8d6bad8fa357b7b2182b320bd5eb3f4c75530bfa50b575a095201af76658d4a1\": rpc error: code = NotFound desc = could not find container \"8d6bad8fa357b7b2182b320bd5eb3f4c75530bfa50b575a095201af76658d4a1\": container with ID starting with 8d6bad8fa357b7b2182b320bd5eb3f4c75530bfa50b575a095201af76658d4a1 not found: ID does not exist" Oct 09 16:46:50 crc kubenswrapper[4719]: I1009 16:46:50.991746 4719 scope.go:117] "RemoveContainer" containerID="f9e3ef16aaee0e8a935c603eb328a2a00257dbb60f7203c928a26594c812ae40" Oct 09 16:46:50 crc kubenswrapper[4719]: E1009 16:46:50.992155 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9e3ef16aaee0e8a935c603eb328a2a00257dbb60f7203c928a26594c812ae40\": container with ID starting with f9e3ef16aaee0e8a935c603eb328a2a00257dbb60f7203c928a26594c812ae40 not found: ID does not exist" containerID="f9e3ef16aaee0e8a935c603eb328a2a00257dbb60f7203c928a26594c812ae40" Oct 09 16:46:50 crc kubenswrapper[4719]: I1009 16:46:50.992198 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9e3ef16aaee0e8a935c603eb328a2a00257dbb60f7203c928a26594c812ae40"} err="failed to get container status \"f9e3ef16aaee0e8a935c603eb328a2a00257dbb60f7203c928a26594c812ae40\": rpc error: code = NotFound desc = could not find container \"f9e3ef16aaee0e8a935c603eb328a2a00257dbb60f7203c928a26594c812ae40\": container with ID starting with f9e3ef16aaee0e8a935c603eb328a2a00257dbb60f7203c928a26594c812ae40 not found: ID does not exist" Oct 09 16:46:50 crc kubenswrapper[4719]: I1009 16:46:50.992225 4719 scope.go:117] "RemoveContainer" containerID="54b2bcecc7dde0dd538c02f43088856d55e098ec5e9e1b56731f2afa627890f7" Oct 09 16:46:50 crc kubenswrapper[4719]: E1009 
16:46:50.992580 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54b2bcecc7dde0dd538c02f43088856d55e098ec5e9e1b56731f2afa627890f7\": container with ID starting with 54b2bcecc7dde0dd538c02f43088856d55e098ec5e9e1b56731f2afa627890f7 not found: ID does not exist" containerID="54b2bcecc7dde0dd538c02f43088856d55e098ec5e9e1b56731f2afa627890f7" Oct 09 16:46:50 crc kubenswrapper[4719]: I1009 16:46:50.992605 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54b2bcecc7dde0dd538c02f43088856d55e098ec5e9e1b56731f2afa627890f7"} err="failed to get container status \"54b2bcecc7dde0dd538c02f43088856d55e098ec5e9e1b56731f2afa627890f7\": rpc error: code = NotFound desc = could not find container \"54b2bcecc7dde0dd538c02f43088856d55e098ec5e9e1b56731f2afa627890f7\": container with ID starting with 54b2bcecc7dde0dd538c02f43088856d55e098ec5e9e1b56731f2afa627890f7 not found: ID does not exist" Oct 09 16:46:51 crc kubenswrapper[4719]: I1009 16:46:51.173808 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3979caa6-b9fc-4c04-80ed-5f684700e060" path="/var/lib/kubelet/pods/3979caa6-b9fc-4c04-80ed-5f684700e060/volumes" Oct 09 16:46:54 crc kubenswrapper[4719]: I1009 16:46:54.160919 4719 scope.go:117] "RemoveContainer" containerID="d7d38bf7ba8f9d934644b1093a0a8aa65e0c062854c0b93f0aea2fed354e8199" Oct 09 16:46:54 crc kubenswrapper[4719]: E1009 16:46:54.161560 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:46:58 crc kubenswrapper[4719]: I1009 16:46:58.671482 
4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d2mzl/must-gather-pnnpc"] Oct 09 16:46:58 crc kubenswrapper[4719]: E1009 16:46:58.672761 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3979caa6-b9fc-4c04-80ed-5f684700e060" containerName="registry-server" Oct 09 16:46:58 crc kubenswrapper[4719]: I1009 16:46:58.672782 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="3979caa6-b9fc-4c04-80ed-5f684700e060" containerName="registry-server" Oct 09 16:46:58 crc kubenswrapper[4719]: E1009 16:46:58.672798 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3979caa6-b9fc-4c04-80ed-5f684700e060" containerName="extract-utilities" Oct 09 16:46:58 crc kubenswrapper[4719]: I1009 16:46:58.672804 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="3979caa6-b9fc-4c04-80ed-5f684700e060" containerName="extract-utilities" Oct 09 16:46:58 crc kubenswrapper[4719]: E1009 16:46:58.672825 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3979caa6-b9fc-4c04-80ed-5f684700e060" containerName="extract-content" Oct 09 16:46:58 crc kubenswrapper[4719]: I1009 16:46:58.672831 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="3979caa6-b9fc-4c04-80ed-5f684700e060" containerName="extract-content" Oct 09 16:46:58 crc kubenswrapper[4719]: I1009 16:46:58.673022 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="3979caa6-b9fc-4c04-80ed-5f684700e060" containerName="registry-server" Oct 09 16:46:58 crc kubenswrapper[4719]: I1009 16:46:58.674108 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d2mzl/must-gather-pnnpc" Oct 09 16:46:58 crc kubenswrapper[4719]: I1009 16:46:58.678705 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-d2mzl"/"kube-root-ca.crt" Oct 09 16:46:58 crc kubenswrapper[4719]: I1009 16:46:58.679097 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-d2mzl"/"openshift-service-ca.crt" Oct 09 16:46:58 crc kubenswrapper[4719]: I1009 16:46:58.680878 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-d2mzl"/"default-dockercfg-665v5" Oct 09 16:46:58 crc kubenswrapper[4719]: I1009 16:46:58.682681 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d2mzl/must-gather-pnnpc"] Oct 09 16:46:58 crc kubenswrapper[4719]: I1009 16:46:58.821960 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3c04f13f-99d3-45a7-afdc-c8c75afe6737-must-gather-output\") pod \"must-gather-pnnpc\" (UID: \"3c04f13f-99d3-45a7-afdc-c8c75afe6737\") " pod="openshift-must-gather-d2mzl/must-gather-pnnpc" Oct 09 16:46:58 crc kubenswrapper[4719]: I1009 16:46:58.822121 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xsx2\" (UniqueName: \"kubernetes.io/projected/3c04f13f-99d3-45a7-afdc-c8c75afe6737-kube-api-access-8xsx2\") pod \"must-gather-pnnpc\" (UID: \"3c04f13f-99d3-45a7-afdc-c8c75afe6737\") " pod="openshift-must-gather-d2mzl/must-gather-pnnpc" Oct 09 16:46:58 crc kubenswrapper[4719]: I1009 16:46:58.923583 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xsx2\" (UniqueName: \"kubernetes.io/projected/3c04f13f-99d3-45a7-afdc-c8c75afe6737-kube-api-access-8xsx2\") pod \"must-gather-pnnpc\" (UID: \"3c04f13f-99d3-45a7-afdc-c8c75afe6737\") " 
pod="openshift-must-gather-d2mzl/must-gather-pnnpc" Oct 09 16:46:58 crc kubenswrapper[4719]: I1009 16:46:58.924085 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3c04f13f-99d3-45a7-afdc-c8c75afe6737-must-gather-output\") pod \"must-gather-pnnpc\" (UID: \"3c04f13f-99d3-45a7-afdc-c8c75afe6737\") " pod="openshift-must-gather-d2mzl/must-gather-pnnpc" Oct 09 16:46:58 crc kubenswrapper[4719]: I1009 16:46:58.924644 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3c04f13f-99d3-45a7-afdc-c8c75afe6737-must-gather-output\") pod \"must-gather-pnnpc\" (UID: \"3c04f13f-99d3-45a7-afdc-c8c75afe6737\") " pod="openshift-must-gather-d2mzl/must-gather-pnnpc" Oct 09 16:46:58 crc kubenswrapper[4719]: I1009 16:46:58.958306 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xsx2\" (UniqueName: \"kubernetes.io/projected/3c04f13f-99d3-45a7-afdc-c8c75afe6737-kube-api-access-8xsx2\") pod \"must-gather-pnnpc\" (UID: \"3c04f13f-99d3-45a7-afdc-c8c75afe6737\") " pod="openshift-must-gather-d2mzl/must-gather-pnnpc" Oct 09 16:46:58 crc kubenswrapper[4719]: I1009 16:46:58.998635 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d2mzl/must-gather-pnnpc" Oct 09 16:46:59 crc kubenswrapper[4719]: I1009 16:46:59.594943 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d2mzl/must-gather-pnnpc"] Oct 09 16:46:59 crc kubenswrapper[4719]: I1009 16:46:59.957554 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2mzl/must-gather-pnnpc" event={"ID":"3c04f13f-99d3-45a7-afdc-c8c75afe6737","Type":"ContainerStarted","Data":"033fd55f7ef950e5ba21023a397521de3a3f2bc3f21a8e8d0b0c0ac485039608"} Oct 09 16:47:07 crc kubenswrapper[4719]: I1009 16:47:07.046479 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2mzl/must-gather-pnnpc" event={"ID":"3c04f13f-99d3-45a7-afdc-c8c75afe6737","Type":"ContainerStarted","Data":"008d4782160db9493449fe21d6a79efca2293ac2da9381ded8ac84c29fd0fb09"} Oct 09 16:47:08 crc kubenswrapper[4719]: I1009 16:47:08.056339 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2mzl/must-gather-pnnpc" event={"ID":"3c04f13f-99d3-45a7-afdc-c8c75afe6737","Type":"ContainerStarted","Data":"c7f39fecf4987a694e7120a018f1a5978607ead4f98d5eb4b8140ed893279338"} Oct 09 16:47:08 crc kubenswrapper[4719]: I1009 16:47:08.089645 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-d2mzl/must-gather-pnnpc" podStartSLOduration=3.537369769 podStartE2EDuration="10.089621045s" podCreationTimestamp="2025-10-09 16:46:58 +0000 UTC" firstStartedPulling="2025-10-09 16:46:59.605884247 +0000 UTC m=+5325.115595532" lastFinishedPulling="2025-10-09 16:47:06.158135513 +0000 UTC m=+5331.667846808" observedRunningTime="2025-10-09 16:47:08.070186994 +0000 UTC m=+5333.579898289" watchObservedRunningTime="2025-10-09 16:47:08.089621045 +0000 UTC m=+5333.599332340" Oct 09 16:47:09 crc kubenswrapper[4719]: I1009 16:47:09.162012 4719 scope.go:117] "RemoveContainer" 
containerID="d7d38bf7ba8f9d934644b1093a0a8aa65e0c062854c0b93f0aea2fed354e8199" Oct 09 16:47:09 crc kubenswrapper[4719]: E1009 16:47:09.162711 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:47:11 crc kubenswrapper[4719]: I1009 16:47:11.336283 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d2mzl/crc-debug-x7566"] Oct 09 16:47:11 crc kubenswrapper[4719]: I1009 16:47:11.338313 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d2mzl/crc-debug-x7566" Oct 09 16:47:11 crc kubenswrapper[4719]: I1009 16:47:11.515658 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qnpl\" (UniqueName: \"kubernetes.io/projected/686b6761-868b-4fd2-a0e3-5959f37637ad-kube-api-access-7qnpl\") pod \"crc-debug-x7566\" (UID: \"686b6761-868b-4fd2-a0e3-5959f37637ad\") " pod="openshift-must-gather-d2mzl/crc-debug-x7566" Oct 09 16:47:11 crc kubenswrapper[4719]: I1009 16:47:11.515929 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/686b6761-868b-4fd2-a0e3-5959f37637ad-host\") pod \"crc-debug-x7566\" (UID: \"686b6761-868b-4fd2-a0e3-5959f37637ad\") " pod="openshift-must-gather-d2mzl/crc-debug-x7566" Oct 09 16:47:11 crc kubenswrapper[4719]: I1009 16:47:11.617994 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/686b6761-868b-4fd2-a0e3-5959f37637ad-host\") pod \"crc-debug-x7566\" (UID: 
\"686b6761-868b-4fd2-a0e3-5959f37637ad\") " pod="openshift-must-gather-d2mzl/crc-debug-x7566" Oct 09 16:47:11 crc kubenswrapper[4719]: I1009 16:47:11.618108 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qnpl\" (UniqueName: \"kubernetes.io/projected/686b6761-868b-4fd2-a0e3-5959f37637ad-kube-api-access-7qnpl\") pod \"crc-debug-x7566\" (UID: \"686b6761-868b-4fd2-a0e3-5959f37637ad\") " pod="openshift-must-gather-d2mzl/crc-debug-x7566" Oct 09 16:47:11 crc kubenswrapper[4719]: I1009 16:47:11.618648 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/686b6761-868b-4fd2-a0e3-5959f37637ad-host\") pod \"crc-debug-x7566\" (UID: \"686b6761-868b-4fd2-a0e3-5959f37637ad\") " pod="openshift-must-gather-d2mzl/crc-debug-x7566" Oct 09 16:47:11 crc kubenswrapper[4719]: I1009 16:47:11.646425 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qnpl\" (UniqueName: \"kubernetes.io/projected/686b6761-868b-4fd2-a0e3-5959f37637ad-kube-api-access-7qnpl\") pod \"crc-debug-x7566\" (UID: \"686b6761-868b-4fd2-a0e3-5959f37637ad\") " pod="openshift-must-gather-d2mzl/crc-debug-x7566" Oct 09 16:47:11 crc kubenswrapper[4719]: I1009 16:47:11.658626 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d2mzl/crc-debug-x7566" Oct 09 16:47:12 crc kubenswrapper[4719]: I1009 16:47:12.091228 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2mzl/crc-debug-x7566" event={"ID":"686b6761-868b-4fd2-a0e3-5959f37637ad","Type":"ContainerStarted","Data":"856d3729394ddebcbd079302c8ecddbd28cdd54b39a2b835872a7a29c82a6531"} Oct 09 16:47:22 crc kubenswrapper[4719]: I1009 16:47:22.197125 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2mzl/crc-debug-x7566" event={"ID":"686b6761-868b-4fd2-a0e3-5959f37637ad","Type":"ContainerStarted","Data":"23c115c5b8b865c5806d9934b3eb59b4140b95599ec19e55b09e8127e2c7d639"} Oct 09 16:47:22 crc kubenswrapper[4719]: I1009 16:47:22.221382 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-d2mzl/crc-debug-x7566" podStartSLOduration=1.628184295 podStartE2EDuration="11.221365312s" podCreationTimestamp="2025-10-09 16:47:11 +0000 UTC" firstStartedPulling="2025-10-09 16:47:11.698747253 +0000 UTC m=+5337.208458538" lastFinishedPulling="2025-10-09 16:47:21.29192827 +0000 UTC m=+5346.801639555" observedRunningTime="2025-10-09 16:47:22.210305158 +0000 UTC m=+5347.720016433" watchObservedRunningTime="2025-10-09 16:47:22.221365312 +0000 UTC m=+5347.731076597" Oct 09 16:47:24 crc kubenswrapper[4719]: I1009 16:47:24.161474 4719 scope.go:117] "RemoveContainer" containerID="d7d38bf7ba8f9d934644b1093a0a8aa65e0c062854c0b93f0aea2fed354e8199" Oct 09 16:47:24 crc kubenswrapper[4719]: E1009 16:47:24.162905 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" 
podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:47:38 crc kubenswrapper[4719]: I1009 16:47:38.161242 4719 scope.go:117] "RemoveContainer" containerID="d7d38bf7ba8f9d934644b1093a0a8aa65e0c062854c0b93f0aea2fed354e8199" Oct 09 16:47:38 crc kubenswrapper[4719]: E1009 16:47:38.162138 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:47:38 crc kubenswrapper[4719]: I1009 16:47:38.956781 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ggt2r"] Oct 09 16:47:38 crc kubenswrapper[4719]: I1009 16:47:38.959976 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ggt2r" Oct 09 16:47:38 crc kubenswrapper[4719]: I1009 16:47:38.974402 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ggt2r"] Oct 09 16:47:39 crc kubenswrapper[4719]: I1009 16:47:39.097990 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fd4f54d-b75c-4ef1-bf58-4f551158db6c-utilities\") pod \"community-operators-ggt2r\" (UID: \"2fd4f54d-b75c-4ef1-bf58-4f551158db6c\") " pod="openshift-marketplace/community-operators-ggt2r" Oct 09 16:47:39 crc kubenswrapper[4719]: I1009 16:47:39.098167 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lmjv\" (UniqueName: \"kubernetes.io/projected/2fd4f54d-b75c-4ef1-bf58-4f551158db6c-kube-api-access-6lmjv\") pod \"community-operators-ggt2r\" (UID: \"2fd4f54d-b75c-4ef1-bf58-4f551158db6c\") " pod="openshift-marketplace/community-operators-ggt2r" Oct 09 16:47:39 crc kubenswrapper[4719]: I1009 16:47:39.098217 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fd4f54d-b75c-4ef1-bf58-4f551158db6c-catalog-content\") pod \"community-operators-ggt2r\" (UID: \"2fd4f54d-b75c-4ef1-bf58-4f551158db6c\") " pod="openshift-marketplace/community-operators-ggt2r" Oct 09 16:47:39 crc kubenswrapper[4719]: I1009 16:47:39.205201 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fd4f54d-b75c-4ef1-bf58-4f551158db6c-utilities\") pod \"community-operators-ggt2r\" (UID: \"2fd4f54d-b75c-4ef1-bf58-4f551158db6c\") " pod="openshift-marketplace/community-operators-ggt2r" Oct 09 16:47:39 crc kubenswrapper[4719]: I1009 16:47:39.205602 4719 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6lmjv\" (UniqueName: \"kubernetes.io/projected/2fd4f54d-b75c-4ef1-bf58-4f551158db6c-kube-api-access-6lmjv\") pod \"community-operators-ggt2r\" (UID: \"2fd4f54d-b75c-4ef1-bf58-4f551158db6c\") " pod="openshift-marketplace/community-operators-ggt2r" Oct 09 16:47:39 crc kubenswrapper[4719]: I1009 16:47:39.205689 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fd4f54d-b75c-4ef1-bf58-4f551158db6c-catalog-content\") pod \"community-operators-ggt2r\" (UID: \"2fd4f54d-b75c-4ef1-bf58-4f551158db6c\") " pod="openshift-marketplace/community-operators-ggt2r" Oct 09 16:47:39 crc kubenswrapper[4719]: I1009 16:47:39.205958 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fd4f54d-b75c-4ef1-bf58-4f551158db6c-utilities\") pod \"community-operators-ggt2r\" (UID: \"2fd4f54d-b75c-4ef1-bf58-4f551158db6c\") " pod="openshift-marketplace/community-operators-ggt2r" Oct 09 16:47:39 crc kubenswrapper[4719]: I1009 16:47:39.206124 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fd4f54d-b75c-4ef1-bf58-4f551158db6c-catalog-content\") pod \"community-operators-ggt2r\" (UID: \"2fd4f54d-b75c-4ef1-bf58-4f551158db6c\") " pod="openshift-marketplace/community-operators-ggt2r" Oct 09 16:47:39 crc kubenswrapper[4719]: I1009 16:47:39.233151 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lmjv\" (UniqueName: \"kubernetes.io/projected/2fd4f54d-b75c-4ef1-bf58-4f551158db6c-kube-api-access-6lmjv\") pod \"community-operators-ggt2r\" (UID: \"2fd4f54d-b75c-4ef1-bf58-4f551158db6c\") " pod="openshift-marketplace/community-operators-ggt2r" Oct 09 16:47:39 crc kubenswrapper[4719]: I1009 16:47:39.280945 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ggt2r" Oct 09 16:47:39 crc kubenswrapper[4719]: I1009 16:47:39.905736 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ggt2r"] Oct 09 16:47:40 crc kubenswrapper[4719]: I1009 16:47:40.396848 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ggt2r" event={"ID":"2fd4f54d-b75c-4ef1-bf58-4f551158db6c","Type":"ContainerStarted","Data":"6b386eb16b99d92d8daf8ffc65958ef0773fa6a4f10eb71e12b76262dd5c2938"} Oct 09 16:47:41 crc kubenswrapper[4719]: I1009 16:47:41.409140 4719 generic.go:334] "Generic (PLEG): container finished" podID="2fd4f54d-b75c-4ef1-bf58-4f551158db6c" containerID="367adff7e0df3172115ff9029de44e70e959df4f1a81c267517d750fca703c3e" exitCode=0 Oct 09 16:47:41 crc kubenswrapper[4719]: I1009 16:47:41.409222 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ggt2r" event={"ID":"2fd4f54d-b75c-4ef1-bf58-4f551158db6c","Type":"ContainerDied","Data":"367adff7e0df3172115ff9029de44e70e959df4f1a81c267517d750fca703c3e"} Oct 09 16:47:42 crc kubenswrapper[4719]: I1009 16:47:42.420890 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ggt2r" event={"ID":"2fd4f54d-b75c-4ef1-bf58-4f551158db6c","Type":"ContainerStarted","Data":"ca43b4c8c40e9511c0c8580ddd2dfc5999c93240d113db43ecfca1986e4be625"} Oct 09 16:47:44 crc kubenswrapper[4719]: I1009 16:47:44.442752 4719 generic.go:334] "Generic (PLEG): container finished" podID="2fd4f54d-b75c-4ef1-bf58-4f551158db6c" containerID="ca43b4c8c40e9511c0c8580ddd2dfc5999c93240d113db43ecfca1986e4be625" exitCode=0 Oct 09 16:47:44 crc kubenswrapper[4719]: I1009 16:47:44.442829 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ggt2r" 
event={"ID":"2fd4f54d-b75c-4ef1-bf58-4f551158db6c","Type":"ContainerDied","Data":"ca43b4c8c40e9511c0c8580ddd2dfc5999c93240d113db43ecfca1986e4be625"} Oct 09 16:47:45 crc kubenswrapper[4719]: I1009 16:47:45.456343 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ggt2r" event={"ID":"2fd4f54d-b75c-4ef1-bf58-4f551158db6c","Type":"ContainerStarted","Data":"3fda90bac2fd824136ed382b940686e06d1ab4415a5cee6de5e027083f08aa46"} Oct 09 16:47:45 crc kubenswrapper[4719]: I1009 16:47:45.478438 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ggt2r" podStartSLOduration=3.933143696 podStartE2EDuration="7.478417777s" podCreationTimestamp="2025-10-09 16:47:38 +0000 UTC" firstStartedPulling="2025-10-09 16:47:41.411675109 +0000 UTC m=+5366.921386394" lastFinishedPulling="2025-10-09 16:47:44.95694919 +0000 UTC m=+5370.466660475" observedRunningTime="2025-10-09 16:47:45.477618802 +0000 UTC m=+5370.987330107" watchObservedRunningTime="2025-10-09 16:47:45.478417777 +0000 UTC m=+5370.988129082" Oct 09 16:47:49 crc kubenswrapper[4719]: I1009 16:47:49.281960 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ggt2r" Oct 09 16:47:49 crc kubenswrapper[4719]: I1009 16:47:49.282527 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ggt2r" Oct 09 16:47:49 crc kubenswrapper[4719]: I1009 16:47:49.342033 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ggt2r" Oct 09 16:47:50 crc kubenswrapper[4719]: I1009 16:47:50.161105 4719 scope.go:117] "RemoveContainer" containerID="d7d38bf7ba8f9d934644b1093a0a8aa65e0c062854c0b93f0aea2fed354e8199" Oct 09 16:47:50 crc kubenswrapper[4719]: E1009 16:47:50.161532 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:47:59 crc kubenswrapper[4719]: I1009 16:47:59.329631 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ggt2r" Oct 09 16:47:59 crc kubenswrapper[4719]: I1009 16:47:59.380731 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ggt2r"] Oct 09 16:47:59 crc kubenswrapper[4719]: I1009 16:47:59.574004 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ggt2r" podUID="2fd4f54d-b75c-4ef1-bf58-4f551158db6c" containerName="registry-server" containerID="cri-o://3fda90bac2fd824136ed382b940686e06d1ab4415a5cee6de5e027083f08aa46" gracePeriod=2 Oct 09 16:48:00 crc kubenswrapper[4719]: I1009 16:48:00.053825 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ggt2r" Oct 09 16:48:00 crc kubenswrapper[4719]: I1009 16:48:00.134085 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lmjv\" (UniqueName: \"kubernetes.io/projected/2fd4f54d-b75c-4ef1-bf58-4f551158db6c-kube-api-access-6lmjv\") pod \"2fd4f54d-b75c-4ef1-bf58-4f551158db6c\" (UID: \"2fd4f54d-b75c-4ef1-bf58-4f551158db6c\") " Oct 09 16:48:00 crc kubenswrapper[4719]: I1009 16:48:00.134176 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fd4f54d-b75c-4ef1-bf58-4f551158db6c-catalog-content\") pod \"2fd4f54d-b75c-4ef1-bf58-4f551158db6c\" (UID: \"2fd4f54d-b75c-4ef1-bf58-4f551158db6c\") " Oct 09 16:48:00 crc kubenswrapper[4719]: I1009 16:48:00.134370 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fd4f54d-b75c-4ef1-bf58-4f551158db6c-utilities\") pod \"2fd4f54d-b75c-4ef1-bf58-4f551158db6c\" (UID: \"2fd4f54d-b75c-4ef1-bf58-4f551158db6c\") " Oct 09 16:48:00 crc kubenswrapper[4719]: I1009 16:48:00.135364 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fd4f54d-b75c-4ef1-bf58-4f551158db6c-utilities" (OuterVolumeSpecName: "utilities") pod "2fd4f54d-b75c-4ef1-bf58-4f551158db6c" (UID: "2fd4f54d-b75c-4ef1-bf58-4f551158db6c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:48:00 crc kubenswrapper[4719]: I1009 16:48:00.192991 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fd4f54d-b75c-4ef1-bf58-4f551158db6c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2fd4f54d-b75c-4ef1-bf58-4f551158db6c" (UID: "2fd4f54d-b75c-4ef1-bf58-4f551158db6c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:48:00 crc kubenswrapper[4719]: I1009 16:48:00.237970 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fd4f54d-b75c-4ef1-bf58-4f551158db6c-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 16:48:00 crc kubenswrapper[4719]: I1009 16:48:00.237997 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fd4f54d-b75c-4ef1-bf58-4f551158db6c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 16:48:00 crc kubenswrapper[4719]: I1009 16:48:00.587603 4719 generic.go:334] "Generic (PLEG): container finished" podID="2fd4f54d-b75c-4ef1-bf58-4f551158db6c" containerID="3fda90bac2fd824136ed382b940686e06d1ab4415a5cee6de5e027083f08aa46" exitCode=0 Oct 09 16:48:00 crc kubenswrapper[4719]: I1009 16:48:00.587651 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ggt2r" Oct 09 16:48:00 crc kubenswrapper[4719]: I1009 16:48:00.587651 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ggt2r" event={"ID":"2fd4f54d-b75c-4ef1-bf58-4f551158db6c","Type":"ContainerDied","Data":"3fda90bac2fd824136ed382b940686e06d1ab4415a5cee6de5e027083f08aa46"} Oct 09 16:48:00 crc kubenswrapper[4719]: I1009 16:48:00.588646 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ggt2r" event={"ID":"2fd4f54d-b75c-4ef1-bf58-4f551158db6c","Type":"ContainerDied","Data":"6b386eb16b99d92d8daf8ffc65958ef0773fa6a4f10eb71e12b76262dd5c2938"} Oct 09 16:48:00 crc kubenswrapper[4719]: I1009 16:48:00.588672 4719 scope.go:117] "RemoveContainer" containerID="3fda90bac2fd824136ed382b940686e06d1ab4415a5cee6de5e027083f08aa46" Oct 09 16:48:00 crc kubenswrapper[4719]: I1009 16:48:00.610552 4719 scope.go:117] "RemoveContainer" 
containerID="ca43b4c8c40e9511c0c8580ddd2dfc5999c93240d113db43ecfca1986e4be625" Oct 09 16:48:00 crc kubenswrapper[4719]: I1009 16:48:00.694163 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fd4f54d-b75c-4ef1-bf58-4f551158db6c-kube-api-access-6lmjv" (OuterVolumeSpecName: "kube-api-access-6lmjv") pod "2fd4f54d-b75c-4ef1-bf58-4f551158db6c" (UID: "2fd4f54d-b75c-4ef1-bf58-4f551158db6c"). InnerVolumeSpecName "kube-api-access-6lmjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 16:48:00 crc kubenswrapper[4719]: I1009 16:48:00.705491 4719 scope.go:117] "RemoveContainer" containerID="367adff7e0df3172115ff9029de44e70e959df4f1a81c267517d750fca703c3e" Oct 09 16:48:00 crc kubenswrapper[4719]: I1009 16:48:00.748134 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lmjv\" (UniqueName: \"kubernetes.io/projected/2fd4f54d-b75c-4ef1-bf58-4f551158db6c-kube-api-access-6lmjv\") on node \"crc\" DevicePath \"\"" Oct 09 16:48:00 crc kubenswrapper[4719]: I1009 16:48:00.924656 4719 scope.go:117] "RemoveContainer" containerID="3fda90bac2fd824136ed382b940686e06d1ab4415a5cee6de5e027083f08aa46" Oct 09 16:48:00 crc kubenswrapper[4719]: E1009 16:48:00.925010 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fda90bac2fd824136ed382b940686e06d1ab4415a5cee6de5e027083f08aa46\": container with ID starting with 3fda90bac2fd824136ed382b940686e06d1ab4415a5cee6de5e027083f08aa46 not found: ID does not exist" containerID="3fda90bac2fd824136ed382b940686e06d1ab4415a5cee6de5e027083f08aa46" Oct 09 16:48:00 crc kubenswrapper[4719]: I1009 16:48:00.925039 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fda90bac2fd824136ed382b940686e06d1ab4415a5cee6de5e027083f08aa46"} err="failed to get container status \"3fda90bac2fd824136ed382b940686e06d1ab4415a5cee6de5e027083f08aa46\": rpc error: code 
= NotFound desc = could not find container \"3fda90bac2fd824136ed382b940686e06d1ab4415a5cee6de5e027083f08aa46\": container with ID starting with 3fda90bac2fd824136ed382b940686e06d1ab4415a5cee6de5e027083f08aa46 not found: ID does not exist" Oct 09 16:48:00 crc kubenswrapper[4719]: I1009 16:48:00.925060 4719 scope.go:117] "RemoveContainer" containerID="ca43b4c8c40e9511c0c8580ddd2dfc5999c93240d113db43ecfca1986e4be625" Oct 09 16:48:00 crc kubenswrapper[4719]: E1009 16:48:00.925416 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca43b4c8c40e9511c0c8580ddd2dfc5999c93240d113db43ecfca1986e4be625\": container with ID starting with ca43b4c8c40e9511c0c8580ddd2dfc5999c93240d113db43ecfca1986e4be625 not found: ID does not exist" containerID="ca43b4c8c40e9511c0c8580ddd2dfc5999c93240d113db43ecfca1986e4be625" Oct 09 16:48:00 crc kubenswrapper[4719]: I1009 16:48:00.925436 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca43b4c8c40e9511c0c8580ddd2dfc5999c93240d113db43ecfca1986e4be625"} err="failed to get container status \"ca43b4c8c40e9511c0c8580ddd2dfc5999c93240d113db43ecfca1986e4be625\": rpc error: code = NotFound desc = could not find container \"ca43b4c8c40e9511c0c8580ddd2dfc5999c93240d113db43ecfca1986e4be625\": container with ID starting with ca43b4c8c40e9511c0c8580ddd2dfc5999c93240d113db43ecfca1986e4be625 not found: ID does not exist" Oct 09 16:48:00 crc kubenswrapper[4719]: I1009 16:48:00.925449 4719 scope.go:117] "RemoveContainer" containerID="367adff7e0df3172115ff9029de44e70e959df4f1a81c267517d750fca703c3e" Oct 09 16:48:00 crc kubenswrapper[4719]: E1009 16:48:00.925924 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"367adff7e0df3172115ff9029de44e70e959df4f1a81c267517d750fca703c3e\": container with ID starting with 
367adff7e0df3172115ff9029de44e70e959df4f1a81c267517d750fca703c3e not found: ID does not exist" containerID="367adff7e0df3172115ff9029de44e70e959df4f1a81c267517d750fca703c3e" Oct 09 16:48:00 crc kubenswrapper[4719]: I1009 16:48:00.925974 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"367adff7e0df3172115ff9029de44e70e959df4f1a81c267517d750fca703c3e"} err="failed to get container status \"367adff7e0df3172115ff9029de44e70e959df4f1a81c267517d750fca703c3e\": rpc error: code = NotFound desc = could not find container \"367adff7e0df3172115ff9029de44e70e959df4f1a81c267517d750fca703c3e\": container with ID starting with 367adff7e0df3172115ff9029de44e70e959df4f1a81c267517d750fca703c3e not found: ID does not exist" Oct 09 16:48:00 crc kubenswrapper[4719]: I1009 16:48:00.983747 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ggt2r"] Oct 09 16:48:00 crc kubenswrapper[4719]: I1009 16:48:00.995870 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ggt2r"] Oct 09 16:48:01 crc kubenswrapper[4719]: I1009 16:48:01.160854 4719 scope.go:117] "RemoveContainer" containerID="d7d38bf7ba8f9d934644b1093a0a8aa65e0c062854c0b93f0aea2fed354e8199" Oct 09 16:48:01 crc kubenswrapper[4719]: E1009 16:48:01.161373 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:48:01 crc kubenswrapper[4719]: I1009 16:48:01.172950 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fd4f54d-b75c-4ef1-bf58-4f551158db6c" 
path="/var/lib/kubelet/pods/2fd4f54d-b75c-4ef1-bf58-4f551158db6c/volumes" Oct 09 16:48:06 crc kubenswrapper[4719]: I1009 16:48:06.659971 4719 generic.go:334] "Generic (PLEG): container finished" podID="686b6761-868b-4fd2-a0e3-5959f37637ad" containerID="23c115c5b8b865c5806d9934b3eb59b4140b95599ec19e55b09e8127e2c7d639" exitCode=0 Oct 09 16:48:06 crc kubenswrapper[4719]: I1009 16:48:06.660059 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2mzl/crc-debug-x7566" event={"ID":"686b6761-868b-4fd2-a0e3-5959f37637ad","Type":"ContainerDied","Data":"23c115c5b8b865c5806d9934b3eb59b4140b95599ec19e55b09e8127e2c7d639"} Oct 09 16:48:07 crc kubenswrapper[4719]: I1009 16:48:07.774142 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d2mzl/crc-debug-x7566" Oct 09 16:48:07 crc kubenswrapper[4719]: I1009 16:48:07.795831 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qnpl\" (UniqueName: \"kubernetes.io/projected/686b6761-868b-4fd2-a0e3-5959f37637ad-kube-api-access-7qnpl\") pod \"686b6761-868b-4fd2-a0e3-5959f37637ad\" (UID: \"686b6761-868b-4fd2-a0e3-5959f37637ad\") " Oct 09 16:48:07 crc kubenswrapper[4719]: I1009 16:48:07.795877 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/686b6761-868b-4fd2-a0e3-5959f37637ad-host\") pod \"686b6761-868b-4fd2-a0e3-5959f37637ad\" (UID: \"686b6761-868b-4fd2-a0e3-5959f37637ad\") " Oct 09 16:48:07 crc kubenswrapper[4719]: I1009 16:48:07.796050 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/686b6761-868b-4fd2-a0e3-5959f37637ad-host" (OuterVolumeSpecName: "host") pod "686b6761-868b-4fd2-a0e3-5959f37637ad" (UID: "686b6761-868b-4fd2-a0e3-5959f37637ad"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 16:48:07 crc kubenswrapper[4719]: I1009 16:48:07.796465 4719 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/686b6761-868b-4fd2-a0e3-5959f37637ad-host\") on node \"crc\" DevicePath \"\"" Oct 09 16:48:07 crc kubenswrapper[4719]: I1009 16:48:07.815496 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/686b6761-868b-4fd2-a0e3-5959f37637ad-kube-api-access-7qnpl" (OuterVolumeSpecName: "kube-api-access-7qnpl") pod "686b6761-868b-4fd2-a0e3-5959f37637ad" (UID: "686b6761-868b-4fd2-a0e3-5959f37637ad"). InnerVolumeSpecName "kube-api-access-7qnpl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 16:48:07 crc kubenswrapper[4719]: I1009 16:48:07.824660 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-d2mzl/crc-debug-x7566"] Oct 09 16:48:07 crc kubenswrapper[4719]: I1009 16:48:07.845216 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-d2mzl/crc-debug-x7566"] Oct 09 16:48:07 crc kubenswrapper[4719]: I1009 16:48:07.898231 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qnpl\" (UniqueName: \"kubernetes.io/projected/686b6761-868b-4fd2-a0e3-5959f37637ad-kube-api-access-7qnpl\") on node \"crc\" DevicePath \"\"" Oct 09 16:48:08 crc kubenswrapper[4719]: I1009 16:48:08.678584 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="856d3729394ddebcbd079302c8ecddbd28cdd54b39a2b835872a7a29c82a6531" Oct 09 16:48:08 crc kubenswrapper[4719]: I1009 16:48:08.678649 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d2mzl/crc-debug-x7566" Oct 09 16:48:09 crc kubenswrapper[4719]: I1009 16:48:09.007037 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d2mzl/crc-debug-6h9qw"] Oct 09 16:48:09 crc kubenswrapper[4719]: E1009 16:48:09.007509 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fd4f54d-b75c-4ef1-bf58-4f551158db6c" containerName="extract-utilities" Oct 09 16:48:09 crc kubenswrapper[4719]: I1009 16:48:09.007526 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fd4f54d-b75c-4ef1-bf58-4f551158db6c" containerName="extract-utilities" Oct 09 16:48:09 crc kubenswrapper[4719]: E1009 16:48:09.007545 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fd4f54d-b75c-4ef1-bf58-4f551158db6c" containerName="extract-content" Oct 09 16:48:09 crc kubenswrapper[4719]: I1009 16:48:09.007552 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fd4f54d-b75c-4ef1-bf58-4f551158db6c" containerName="extract-content" Oct 09 16:48:09 crc kubenswrapper[4719]: E1009 16:48:09.007567 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="686b6761-868b-4fd2-a0e3-5959f37637ad" containerName="container-00" Oct 09 16:48:09 crc kubenswrapper[4719]: I1009 16:48:09.007575 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="686b6761-868b-4fd2-a0e3-5959f37637ad" containerName="container-00" Oct 09 16:48:09 crc kubenswrapper[4719]: E1009 16:48:09.007600 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fd4f54d-b75c-4ef1-bf58-4f551158db6c" containerName="registry-server" Oct 09 16:48:09 crc kubenswrapper[4719]: I1009 16:48:09.007608 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fd4f54d-b75c-4ef1-bf58-4f551158db6c" containerName="registry-server" Oct 09 16:48:09 crc kubenswrapper[4719]: I1009 16:48:09.007840 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="686b6761-868b-4fd2-a0e3-5959f37637ad" 
containerName="container-00" Oct 09 16:48:09 crc kubenswrapper[4719]: I1009 16:48:09.007871 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fd4f54d-b75c-4ef1-bf58-4f551158db6c" containerName="registry-server" Oct 09 16:48:09 crc kubenswrapper[4719]: I1009 16:48:09.008704 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d2mzl/crc-debug-6h9qw" Oct 09 16:48:09 crc kubenswrapper[4719]: I1009 16:48:09.121083 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/903e2931-9ed2-491e-a040-c42c900215c2-host\") pod \"crc-debug-6h9qw\" (UID: \"903e2931-9ed2-491e-a040-c42c900215c2\") " pod="openshift-must-gather-d2mzl/crc-debug-6h9qw" Oct 09 16:48:09 crc kubenswrapper[4719]: I1009 16:48:09.121766 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqv7c\" (UniqueName: \"kubernetes.io/projected/903e2931-9ed2-491e-a040-c42c900215c2-kube-api-access-jqv7c\") pod \"crc-debug-6h9qw\" (UID: \"903e2931-9ed2-491e-a040-c42c900215c2\") " pod="openshift-must-gather-d2mzl/crc-debug-6h9qw" Oct 09 16:48:09 crc kubenswrapper[4719]: I1009 16:48:09.173038 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="686b6761-868b-4fd2-a0e3-5959f37637ad" path="/var/lib/kubelet/pods/686b6761-868b-4fd2-a0e3-5959f37637ad/volumes" Oct 09 16:48:09 crc kubenswrapper[4719]: I1009 16:48:09.223117 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqv7c\" (UniqueName: \"kubernetes.io/projected/903e2931-9ed2-491e-a040-c42c900215c2-kube-api-access-jqv7c\") pod \"crc-debug-6h9qw\" (UID: \"903e2931-9ed2-491e-a040-c42c900215c2\") " pod="openshift-must-gather-d2mzl/crc-debug-6h9qw" Oct 09 16:48:09 crc kubenswrapper[4719]: I1009 16:48:09.223243 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"host\" (UniqueName: \"kubernetes.io/host-path/903e2931-9ed2-491e-a040-c42c900215c2-host\") pod \"crc-debug-6h9qw\" (UID: \"903e2931-9ed2-491e-a040-c42c900215c2\") " pod="openshift-must-gather-d2mzl/crc-debug-6h9qw" Oct 09 16:48:09 crc kubenswrapper[4719]: I1009 16:48:09.223503 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/903e2931-9ed2-491e-a040-c42c900215c2-host\") pod \"crc-debug-6h9qw\" (UID: \"903e2931-9ed2-491e-a040-c42c900215c2\") " pod="openshift-must-gather-d2mzl/crc-debug-6h9qw" Oct 09 16:48:09 crc kubenswrapper[4719]: I1009 16:48:09.243153 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqv7c\" (UniqueName: \"kubernetes.io/projected/903e2931-9ed2-491e-a040-c42c900215c2-kube-api-access-jqv7c\") pod \"crc-debug-6h9qw\" (UID: \"903e2931-9ed2-491e-a040-c42c900215c2\") " pod="openshift-must-gather-d2mzl/crc-debug-6h9qw" Oct 09 16:48:09 crc kubenswrapper[4719]: I1009 16:48:09.328952 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d2mzl/crc-debug-6h9qw" Oct 09 16:48:09 crc kubenswrapper[4719]: I1009 16:48:09.700462 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2mzl/crc-debug-6h9qw" event={"ID":"903e2931-9ed2-491e-a040-c42c900215c2","Type":"ContainerStarted","Data":"30dffeaaed714faadf04ee26e38b1aa2496be5d91df1d1696b070438d2bc5847"} Oct 09 16:48:09 crc kubenswrapper[4719]: I1009 16:48:09.700821 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2mzl/crc-debug-6h9qw" event={"ID":"903e2931-9ed2-491e-a040-c42c900215c2","Type":"ContainerStarted","Data":"951c497bd9b66d8bbfb6b740f8736e5e65f461e5dec2cab0d708d1e142f4028f"} Oct 09 16:48:09 crc kubenswrapper[4719]: I1009 16:48:09.723228 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-d2mzl/crc-debug-6h9qw" podStartSLOduration=1.723211206 podStartE2EDuration="1.723211206s" podCreationTimestamp="2025-10-09 16:48:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 16:48:09.721196832 +0000 UTC m=+5395.230908117" watchObservedRunningTime="2025-10-09 16:48:09.723211206 +0000 UTC m=+5395.232922491" Oct 09 16:48:10 crc kubenswrapper[4719]: I1009 16:48:10.711658 4719 generic.go:334] "Generic (PLEG): container finished" podID="903e2931-9ed2-491e-a040-c42c900215c2" containerID="30dffeaaed714faadf04ee26e38b1aa2496be5d91df1d1696b070438d2bc5847" exitCode=0 Oct 09 16:48:10 crc kubenswrapper[4719]: I1009 16:48:10.711766 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2mzl/crc-debug-6h9qw" event={"ID":"903e2931-9ed2-491e-a040-c42c900215c2","Type":"ContainerDied","Data":"30dffeaaed714faadf04ee26e38b1aa2496be5d91df1d1696b070438d2bc5847"} Oct 09 16:48:11 crc kubenswrapper[4719]: I1009 16:48:11.853380 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d2mzl/crc-debug-6h9qw" Oct 09 16:48:11 crc kubenswrapper[4719]: I1009 16:48:11.983538 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/903e2931-9ed2-491e-a040-c42c900215c2-host\") pod \"903e2931-9ed2-491e-a040-c42c900215c2\" (UID: \"903e2931-9ed2-491e-a040-c42c900215c2\") " Oct 09 16:48:11 crc kubenswrapper[4719]: I1009 16:48:11.983690 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/903e2931-9ed2-491e-a040-c42c900215c2-host" (OuterVolumeSpecName: "host") pod "903e2931-9ed2-491e-a040-c42c900215c2" (UID: "903e2931-9ed2-491e-a040-c42c900215c2"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 16:48:11 crc kubenswrapper[4719]: I1009 16:48:11.984111 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqv7c\" (UniqueName: \"kubernetes.io/projected/903e2931-9ed2-491e-a040-c42c900215c2-kube-api-access-jqv7c\") pod \"903e2931-9ed2-491e-a040-c42c900215c2\" (UID: \"903e2931-9ed2-491e-a040-c42c900215c2\") " Oct 09 16:48:11 crc kubenswrapper[4719]: I1009 16:48:11.984707 4719 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/903e2931-9ed2-491e-a040-c42c900215c2-host\") on node \"crc\" DevicePath \"\"" Oct 09 16:48:12 crc kubenswrapper[4719]: I1009 16:48:12.001089 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/903e2931-9ed2-491e-a040-c42c900215c2-kube-api-access-jqv7c" (OuterVolumeSpecName: "kube-api-access-jqv7c") pod "903e2931-9ed2-491e-a040-c42c900215c2" (UID: "903e2931-9ed2-491e-a040-c42c900215c2"). InnerVolumeSpecName "kube-api-access-jqv7c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 16:48:12 crc kubenswrapper[4719]: I1009 16:48:12.013778 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-d2mzl/crc-debug-6h9qw"] Oct 09 16:48:12 crc kubenswrapper[4719]: I1009 16:48:12.022181 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-d2mzl/crc-debug-6h9qw"] Oct 09 16:48:12 crc kubenswrapper[4719]: I1009 16:48:12.086683 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqv7c\" (UniqueName: \"kubernetes.io/projected/903e2931-9ed2-491e-a040-c42c900215c2-kube-api-access-jqv7c\") on node \"crc\" DevicePath \"\"" Oct 09 16:48:12 crc kubenswrapper[4719]: I1009 16:48:12.162433 4719 scope.go:117] "RemoveContainer" containerID="d7d38bf7ba8f9d934644b1093a0a8aa65e0c062854c0b93f0aea2fed354e8199" Oct 09 16:48:12 crc kubenswrapper[4719]: E1009 16:48:12.162818 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:48:12 crc kubenswrapper[4719]: I1009 16:48:12.732274 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="951c497bd9b66d8bbfb6b740f8736e5e65f461e5dec2cab0d708d1e142f4028f" Oct 09 16:48:12 crc kubenswrapper[4719]: I1009 16:48:12.732389 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d2mzl/crc-debug-6h9qw" Oct 09 16:48:13 crc kubenswrapper[4719]: I1009 16:48:13.173494 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="903e2931-9ed2-491e-a040-c42c900215c2" path="/var/lib/kubelet/pods/903e2931-9ed2-491e-a040-c42c900215c2/volumes" Oct 09 16:48:13 crc kubenswrapper[4719]: I1009 16:48:13.281754 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d2mzl/crc-debug-z8mbw"] Oct 09 16:48:13 crc kubenswrapper[4719]: E1009 16:48:13.282454 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="903e2931-9ed2-491e-a040-c42c900215c2" containerName="container-00" Oct 09 16:48:13 crc kubenswrapper[4719]: I1009 16:48:13.282470 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="903e2931-9ed2-491e-a040-c42c900215c2" containerName="container-00" Oct 09 16:48:13 crc kubenswrapper[4719]: I1009 16:48:13.282669 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="903e2931-9ed2-491e-a040-c42c900215c2" containerName="container-00" Oct 09 16:48:13 crc kubenswrapper[4719]: I1009 16:48:13.283437 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d2mzl/crc-debug-z8mbw" Oct 09 16:48:13 crc kubenswrapper[4719]: I1009 16:48:13.412116 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/52293f0f-00fb-4fc4-ab57-a1538fe759b9-host\") pod \"crc-debug-z8mbw\" (UID: \"52293f0f-00fb-4fc4-ab57-a1538fe759b9\") " pod="openshift-must-gather-d2mzl/crc-debug-z8mbw" Oct 09 16:48:13 crc kubenswrapper[4719]: I1009 16:48:13.412182 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bbn9\" (UniqueName: \"kubernetes.io/projected/52293f0f-00fb-4fc4-ab57-a1538fe759b9-kube-api-access-4bbn9\") pod \"crc-debug-z8mbw\" (UID: \"52293f0f-00fb-4fc4-ab57-a1538fe759b9\") " pod="openshift-must-gather-d2mzl/crc-debug-z8mbw" Oct 09 16:48:13 crc kubenswrapper[4719]: I1009 16:48:13.520689 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/52293f0f-00fb-4fc4-ab57-a1538fe759b9-host\") pod \"crc-debug-z8mbw\" (UID: \"52293f0f-00fb-4fc4-ab57-a1538fe759b9\") " pod="openshift-must-gather-d2mzl/crc-debug-z8mbw" Oct 09 16:48:13 crc kubenswrapper[4719]: I1009 16:48:13.520952 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bbn9\" (UniqueName: \"kubernetes.io/projected/52293f0f-00fb-4fc4-ab57-a1538fe759b9-kube-api-access-4bbn9\") pod \"crc-debug-z8mbw\" (UID: \"52293f0f-00fb-4fc4-ab57-a1538fe759b9\") " pod="openshift-must-gather-d2mzl/crc-debug-z8mbw" Oct 09 16:48:13 crc kubenswrapper[4719]: I1009 16:48:13.521204 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/52293f0f-00fb-4fc4-ab57-a1538fe759b9-host\") pod \"crc-debug-z8mbw\" (UID: \"52293f0f-00fb-4fc4-ab57-a1538fe759b9\") " pod="openshift-must-gather-d2mzl/crc-debug-z8mbw" Oct 09 16:48:13 crc 
kubenswrapper[4719]: I1009 16:48:13.544211 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bbn9\" (UniqueName: \"kubernetes.io/projected/52293f0f-00fb-4fc4-ab57-a1538fe759b9-kube-api-access-4bbn9\") pod \"crc-debug-z8mbw\" (UID: \"52293f0f-00fb-4fc4-ab57-a1538fe759b9\") " pod="openshift-must-gather-d2mzl/crc-debug-z8mbw" Oct 09 16:48:13 crc kubenswrapper[4719]: I1009 16:48:13.598585 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d2mzl/crc-debug-z8mbw" Oct 09 16:48:13 crc kubenswrapper[4719]: I1009 16:48:13.766421 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2mzl/crc-debug-z8mbw" event={"ID":"52293f0f-00fb-4fc4-ab57-a1538fe759b9","Type":"ContainerStarted","Data":"f552eabb3c079b610290be853f435183153be36822de72e8815a4758c435afbc"} Oct 09 16:48:14 crc kubenswrapper[4719]: I1009 16:48:14.778111 4719 generic.go:334] "Generic (PLEG): container finished" podID="52293f0f-00fb-4fc4-ab57-a1538fe759b9" containerID="def9d58b89986fb8c3b69f824245bae9cb4336409acbe75995a3b8f9e2315fcf" exitCode=0 Oct 09 16:48:14 crc kubenswrapper[4719]: I1009 16:48:14.778216 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2mzl/crc-debug-z8mbw" event={"ID":"52293f0f-00fb-4fc4-ab57-a1538fe759b9","Type":"ContainerDied","Data":"def9d58b89986fb8c3b69f824245bae9cb4336409acbe75995a3b8f9e2315fcf"} Oct 09 16:48:14 crc kubenswrapper[4719]: I1009 16:48:14.812855 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-d2mzl/crc-debug-z8mbw"] Oct 09 16:48:14 crc kubenswrapper[4719]: I1009 16:48:14.821800 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-d2mzl/crc-debug-z8mbw"] Oct 09 16:48:15 crc kubenswrapper[4719]: I1009 16:48:15.919882 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d2mzl/crc-debug-z8mbw" Oct 09 16:48:15 crc kubenswrapper[4719]: I1009 16:48:15.983519 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/52293f0f-00fb-4fc4-ab57-a1538fe759b9-host\") pod \"52293f0f-00fb-4fc4-ab57-a1538fe759b9\" (UID: \"52293f0f-00fb-4fc4-ab57-a1538fe759b9\") " Oct 09 16:48:15 crc kubenswrapper[4719]: I1009 16:48:15.983617 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bbn9\" (UniqueName: \"kubernetes.io/projected/52293f0f-00fb-4fc4-ab57-a1538fe759b9-kube-api-access-4bbn9\") pod \"52293f0f-00fb-4fc4-ab57-a1538fe759b9\" (UID: \"52293f0f-00fb-4fc4-ab57-a1538fe759b9\") " Oct 09 16:48:15 crc kubenswrapper[4719]: I1009 16:48:15.983645 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52293f0f-00fb-4fc4-ab57-a1538fe759b9-host" (OuterVolumeSpecName: "host") pod "52293f0f-00fb-4fc4-ab57-a1538fe759b9" (UID: "52293f0f-00fb-4fc4-ab57-a1538fe759b9"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 16:48:15 crc kubenswrapper[4719]: I1009 16:48:15.984071 4719 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/52293f0f-00fb-4fc4-ab57-a1538fe759b9-host\") on node \"crc\" DevicePath \"\"" Oct 09 16:48:15 crc kubenswrapper[4719]: I1009 16:48:15.989777 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52293f0f-00fb-4fc4-ab57-a1538fe759b9-kube-api-access-4bbn9" (OuterVolumeSpecName: "kube-api-access-4bbn9") pod "52293f0f-00fb-4fc4-ab57-a1538fe759b9" (UID: "52293f0f-00fb-4fc4-ab57-a1538fe759b9"). InnerVolumeSpecName "kube-api-access-4bbn9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 16:48:16 crc kubenswrapper[4719]: I1009 16:48:16.085921 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bbn9\" (UniqueName: \"kubernetes.io/projected/52293f0f-00fb-4fc4-ab57-a1538fe759b9-kube-api-access-4bbn9\") on node \"crc\" DevicePath \"\"" Oct 09 16:48:16 crc kubenswrapper[4719]: I1009 16:48:16.796374 4719 scope.go:117] "RemoveContainer" containerID="def9d58b89986fb8c3b69f824245bae9cb4336409acbe75995a3b8f9e2315fcf" Oct 09 16:48:16 crc kubenswrapper[4719]: I1009 16:48:16.796437 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d2mzl/crc-debug-z8mbw" Oct 09 16:48:17 crc kubenswrapper[4719]: I1009 16:48:17.172322 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52293f0f-00fb-4fc4-ab57-a1538fe759b9" path="/var/lib/kubelet/pods/52293f0f-00fb-4fc4-ab57-a1538fe759b9/volumes" Oct 09 16:48:24 crc kubenswrapper[4719]: I1009 16:48:24.160954 4719 scope.go:117] "RemoveContainer" containerID="d7d38bf7ba8f9d934644b1093a0a8aa65e0c062854c0b93f0aea2fed354e8199" Oct 09 16:48:24 crc kubenswrapper[4719]: E1009 16:48:24.162014 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:48:35 crc kubenswrapper[4719]: I1009 16:48:35.680107 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7b4958cb64-65wft_0a621c39-47f2-4b25-ac34-cf712d8b27c3/barbican-api/0.log" Oct 09 16:48:35 crc kubenswrapper[4719]: I1009 16:48:35.760272 4719 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-7b4958cb64-65wft_0a621c39-47f2-4b25-ac34-cf712d8b27c3/barbican-api-log/0.log" Oct 09 16:48:35 crc kubenswrapper[4719]: I1009 16:48:35.871420 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7ccff5b764-rskpw_276c7ea1-10eb-4a7d-9eb1-50c62518b5b4/barbican-keystone-listener/0.log" Oct 09 16:48:35 crc kubenswrapper[4719]: I1009 16:48:35.943632 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7ccff5b764-rskpw_276c7ea1-10eb-4a7d-9eb1-50c62518b5b4/barbican-keystone-listener-log/0.log" Oct 09 16:48:36 crc kubenswrapper[4719]: I1009 16:48:36.079594 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-68857c4d7f-ns5gc_e305acce-34be-4503-b643-b60e4201ecfa/barbican-worker/0.log" Oct 09 16:48:36 crc kubenswrapper[4719]: I1009 16:48:36.125313 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-68857c4d7f-ns5gc_e305acce-34be-4503-b643-b60e4201ecfa/barbican-worker-log/0.log" Oct 09 16:48:36 crc kubenswrapper[4719]: I1009 16:48:36.276057 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-jhrxh_75c7240d-03e4-40f9-a915-c85892b060d9/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 16:48:36 crc kubenswrapper[4719]: I1009 16:48:36.387567 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c2a64def-d060-46e7-8792-835bb734a809/ceilometer-notification-agent/0.log" Oct 09 16:48:36 crc kubenswrapper[4719]: I1009 16:48:36.489209 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c2a64def-d060-46e7-8792-835bb734a809/ceilometer-central-agent/0.log" Oct 09 16:48:36 crc kubenswrapper[4719]: I1009 16:48:36.519210 4719 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_c2a64def-d060-46e7-8792-835bb734a809/proxy-httpd/0.log" Oct 09 16:48:36 crc kubenswrapper[4719]: I1009 16:48:36.602139 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c2a64def-d060-46e7-8792-835bb734a809/sg-core/0.log" Oct 09 16:48:36 crc kubenswrapper[4719]: I1009 16:48:36.916000 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_f5b332c9-c154-4ef0-8921-4e329b4b504a/cinder-api-log/0.log" Oct 09 16:48:37 crc kubenswrapper[4719]: I1009 16:48:37.225907 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_7fd9ad9a-1651-46cc-9c22-adae6a548ef8/probe/0.log" Oct 09 16:48:37 crc kubenswrapper[4719]: I1009 16:48:37.326567 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_f5b332c9-c154-4ef0-8921-4e329b4b504a/cinder-api/0.log" Oct 09 16:48:37 crc kubenswrapper[4719]: I1009 16:48:37.405033 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_7fd9ad9a-1651-46cc-9c22-adae6a548ef8/cinder-backup/0.log" Oct 09 16:48:37 crc kubenswrapper[4719]: I1009 16:48:37.469343 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_46e46828-5596-4987-8998-c52dbaf93086/cinder-scheduler/0.log" Oct 09 16:48:37 crc kubenswrapper[4719]: I1009 16:48:37.612289 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_46e46828-5596-4987-8998-c52dbaf93086/probe/0.log" Oct 09 16:48:37 crc kubenswrapper[4719]: I1009 16:48:37.776654 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_b95abe4c-159b-460a-b238-3be4b341ccc2/cinder-volume/0.log" Oct 09 16:48:37 crc kubenswrapper[4719]: I1009 16:48:37.790875 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_b95abe4c-159b-460a-b238-3be4b341ccc2/probe/0.log" Oct 09 16:48:37 crc 
kubenswrapper[4719]: I1009 16:48:37.997187 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_c3633b1f-2c6e-4483-8255-71551f8a25db/cinder-volume/0.log" Oct 09 16:48:38 crc kubenswrapper[4719]: I1009 16:48:38.085762 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_c3633b1f-2c6e-4483-8255-71551f8a25db/probe/0.log" Oct 09 16:48:38 crc kubenswrapper[4719]: I1009 16:48:38.242564 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-fstf7_6368a031-4a2d-43bd-a289-fd9966d38182/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 16:48:38 crc kubenswrapper[4719]: I1009 16:48:38.328835 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-b2llw_6fa2621b-c679-4391-9058-cd2a871264df/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 16:48:38 crc kubenswrapper[4719]: I1009 16:48:38.859890 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-756dcbffdc-f275x_78bd028a-6324-402f-80e4-5a712e07bfb6/init/0.log" Oct 09 16:48:38 crc kubenswrapper[4719]: I1009 16:48:38.922389 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-blvrz_c2ecd37c-0c41-4b8f-8072-c690aa729218/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 16:48:39 crc kubenswrapper[4719]: I1009 16:48:39.085458 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-756dcbffdc-f275x_78bd028a-6324-402f-80e4-5a712e07bfb6/init/0.log" Oct 09 16:48:39 crc kubenswrapper[4719]: I1009 16:48:39.162800 4719 scope.go:117] "RemoveContainer" containerID="d7d38bf7ba8f9d934644b1093a0a8aa65e0c062854c0b93f0aea2fed354e8199" Oct 09 16:48:39 crc kubenswrapper[4719]: E1009 16:48:39.163523 4719 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:48:39 crc kubenswrapper[4719]: I1009 16:48:39.240612 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-r8nbl_49f3b180-01ca-489f-9a12-5e22d186b1b7/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 16:48:39 crc kubenswrapper[4719]: I1009 16:48:39.298957 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-756dcbffdc-f275x_78bd028a-6324-402f-80e4-5a712e07bfb6/dnsmasq-dns/0.log" Oct 09 16:48:39 crc kubenswrapper[4719]: I1009 16:48:39.350865 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_b3bab132-2f43-4321-99c6-6164f0f93e86/glance-httpd/0.log" Oct 09 16:48:39 crc kubenswrapper[4719]: I1009 16:48:39.479446 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_de5609da-6273-4076-9f02-b6c4614ebd07/glance-httpd/0.log" Oct 09 16:48:39 crc kubenswrapper[4719]: I1009 16:48:39.481807 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_b3bab132-2f43-4321-99c6-6164f0f93e86/glance-log/0.log" Oct 09 16:48:39 crc kubenswrapper[4719]: I1009 16:48:39.576156 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_de5609da-6273-4076-9f02-b6c4614ebd07/glance-log/0.log" Oct 09 16:48:39 crc kubenswrapper[4719]: I1009 16:48:39.758863 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6f5bc696cd-sgqb2_a66fd9c2-b3cc-43db-b520-6972ce53871f/horizon/0.log" Oct 09 
16:48:39 crc kubenswrapper[4719]: I1009 16:48:39.855942 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw_976acf87-d11d-47a4-ad0d-2119fc70504c/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 16:48:40 crc kubenswrapper[4719]: I1009 16:48:40.074531 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-fm7mp_1f37f188-1b48-4b10-a085-e6a44d7e16d5/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 16:48:40 crc kubenswrapper[4719]: I1009 16:48:40.488319 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6f5bc696cd-sgqb2_a66fd9c2-b3cc-43db-b520-6972ce53871f/horizon-log/0.log" Oct 09 16:48:40 crc kubenswrapper[4719]: I1009 16:48:40.700082 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-69dbc5fbc7-t286g_ad01be3d-57da-4019-8059-f0a78501266b/keystone-api/0.log" Oct 09 16:48:40 crc kubenswrapper[4719]: I1009 16:48:40.781586 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29333761-x56tx_88187911-7d06-4147-97ad-9279f3e101e0/keystone-cron/0.log" Oct 09 16:48:40 crc kubenswrapper[4719]: I1009 16:48:40.852417 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_1f083e47-9fa6-462e-b596-8665719a2e4f/kube-state-metrics/0.log" Oct 09 16:48:40 crc kubenswrapper[4719]: I1009 16:48:40.984314 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-7mv72_2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 16:48:41 crc kubenswrapper[4719]: I1009 16:48:41.375486 4719 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg_d36d0870-b55a-4791-9554-11d38e304e92/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 16:48:41 crc kubenswrapper[4719]: I1009 16:48:41.401037 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-57c55b4b47-8npb9_f4a6c362-de01-454a-a0d8-7ea4c677720c/neutron-api/0.log" Oct 09 16:48:41 crc kubenswrapper[4719]: I1009 16:48:41.453812 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-57c55b4b47-8npb9_f4a6c362-de01-454a-a0d8-7ea4c677720c/neutron-httpd/0.log" Oct 09 16:48:41 crc kubenswrapper[4719]: I1009 16:48:41.971813 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_1a46282a-5af1-483f-97a4-b96fd855dc00/nova-cell0-conductor-conductor/0.log" Oct 09 16:48:42 crc kubenswrapper[4719]: I1009 16:48:42.344503 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_d0cbbab9-8de8-43f9-bf34-b235d2fb4400/nova-cell1-conductor-conductor/0.log" Oct 09 16:48:42 crc kubenswrapper[4719]: I1009 16:48:42.575611 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_0ff2cdae-bf76-4452-9d8d-26560a89a2da/nova-cell1-novncproxy-novncproxy/0.log" Oct 09 16:48:42 crc kubenswrapper[4719]: I1009 16:48:42.902139 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-vc5ss_23a47423-b3ad-4ba3-b0ab-9a452d485f2b/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 16:48:43 crc kubenswrapper[4719]: I1009 16:48:43.110557 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_c3ae86dd-d13b-46b0-8f6f-a913c783a884/nova-api-log/0.log" Oct 09 16:48:43 crc kubenswrapper[4719]: I1009 16:48:43.162636 4719 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-metadata-0_c1cdf53c-cd57-4e4c-85b0-178a7bc15043/nova-metadata-log/0.log" Oct 09 16:48:43 crc kubenswrapper[4719]: I1009 16:48:43.445326 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_c3ae86dd-d13b-46b0-8f6f-a913c783a884/nova-api-api/0.log" Oct 09 16:48:43 crc kubenswrapper[4719]: I1009 16:48:43.646516 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d6970b67-4ebd-401d-838b-8be92b8ba72f/mysql-bootstrap/0.log" Oct 09 16:48:43 crc kubenswrapper[4719]: I1009 16:48:43.676934 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_c8b2b868-83d7-496b-8036-a10584724f35/nova-scheduler-scheduler/0.log" Oct 09 16:48:43 crc kubenswrapper[4719]: I1009 16:48:43.853019 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d6970b67-4ebd-401d-838b-8be92b8ba72f/mysql-bootstrap/0.log" Oct 09 16:48:43 crc kubenswrapper[4719]: I1009 16:48:43.865835 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d6970b67-4ebd-401d-838b-8be92b8ba72f/galera/0.log" Oct 09 16:48:44 crc kubenswrapper[4719]: I1009 16:48:44.062480 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_05ff8a95-a910-4095-930b-e42c575bf4b8/mysql-bootstrap/0.log" Oct 09 16:48:44 crc kubenswrapper[4719]: I1009 16:48:44.290529 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_05ff8a95-a910-4095-930b-e42c575bf4b8/mysql-bootstrap/0.log" Oct 09 16:48:44 crc kubenswrapper[4719]: I1009 16:48:44.339967 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_05ff8a95-a910-4095-930b-e42c575bf4b8/galera/0.log" Oct 09 16:48:44 crc kubenswrapper[4719]: I1009 16:48:44.561645 4719 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstackclient_893b05af-4bf3-4c76-940c-3ed1cceb7e18/openstackclient/0.log" Oct 09 16:48:44 crc kubenswrapper[4719]: I1009 16:48:44.615064 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-skb56_a6539f12-5508-4c6d-870a-d19815ba3120/openstack-network-exporter/0.log" Oct 09 16:48:44 crc kubenswrapper[4719]: I1009 16:48:44.779282 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hqfvq_07b112ef-0e6a-4927-93e4-d5fc023e495f/ovsdb-server-init/0.log" Oct 09 16:48:44 crc kubenswrapper[4719]: I1009 16:48:44.973493 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hqfvq_07b112ef-0e6a-4927-93e4-d5fc023e495f/ovsdb-server-init/0.log" Oct 09 16:48:45 crc kubenswrapper[4719]: I1009 16:48:45.078884 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hqfvq_07b112ef-0e6a-4927-93e4-d5fc023e495f/ovsdb-server/0.log" Oct 09 16:48:45 crc kubenswrapper[4719]: I1009 16:48:45.271578 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-p4t6l_f0151a18-0608-47b9-b58a-7eef9dfaf31b/ovn-controller/0.log" Oct 09 16:48:45 crc kubenswrapper[4719]: I1009 16:48:45.441117 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hqfvq_07b112ef-0e6a-4927-93e4-d5fc023e495f/ovs-vswitchd/0.log" Oct 09 16:48:45 crc kubenswrapper[4719]: I1009 16:48:45.466619 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_c1cdf53c-cd57-4e4c-85b0-178a7bc15043/nova-metadata-metadata/0.log" Oct 09 16:48:45 crc kubenswrapper[4719]: I1009 16:48:45.548254 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-7f2xg_a768f51e-2990-40f5-84df-13c410d05385/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 16:48:45 crc kubenswrapper[4719]: I1009 16:48:45.633209 
4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_4276fa06-e8dc-40e0-8276-eaf58420e0ca/openstack-network-exporter/0.log" Oct 09 16:48:45 crc kubenswrapper[4719]: I1009 16:48:45.722740 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_4276fa06-e8dc-40e0-8276-eaf58420e0ca/ovn-northd/0.log" Oct 09 16:48:46 crc kubenswrapper[4719]: I1009 16:48:46.118645 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_29d7fec9-be2c-4fa8-9191-5ffaf287f825/openstack-network-exporter/0.log" Oct 09 16:48:46 crc kubenswrapper[4719]: I1009 16:48:46.195173 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_29d7fec9-be2c-4fa8-9191-5ffaf287f825/ovsdbserver-nb/0.log" Oct 09 16:48:46 crc kubenswrapper[4719]: I1009 16:48:46.277413 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c/openstack-network-exporter/0.log" Oct 09 16:48:46 crc kubenswrapper[4719]: I1009 16:48:46.385171 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c/ovsdbserver-sb/0.log" Oct 09 16:48:46 crc kubenswrapper[4719]: I1009 16:48:46.732893 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5db5d6b746-l6xlx_494a5aaa-f833-4429-bb35-d745fcdf4ad1/placement-api/0.log" Oct 09 16:48:46 crc kubenswrapper[4719]: I1009 16:48:46.792708 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_9cebaae5-69d4-4429-a062-aef6cafb9f4a/init-config-reloader/0.log" Oct 09 16:48:46 crc kubenswrapper[4719]: I1009 16:48:46.817555 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5db5d6b746-l6xlx_494a5aaa-f833-4429-bb35-d745fcdf4ad1/placement-log/0.log" Oct 09 16:48:46 crc kubenswrapper[4719]: I1009 16:48:46.985300 4719 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_9cebaae5-69d4-4429-a062-aef6cafb9f4a/init-config-reloader/0.log" Oct 09 16:48:47 crc kubenswrapper[4719]: I1009 16:48:47.027926 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_9cebaae5-69d4-4429-a062-aef6cafb9f4a/config-reloader/0.log" Oct 09 16:48:47 crc kubenswrapper[4719]: I1009 16:48:47.108338 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_9cebaae5-69d4-4429-a062-aef6cafb9f4a/prometheus/0.log" Oct 09 16:48:47 crc kubenswrapper[4719]: I1009 16:48:47.130100 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_9cebaae5-69d4-4429-a062-aef6cafb9f4a/thanos-sidecar/0.log" Oct 09 16:48:47 crc kubenswrapper[4719]: I1009 16:48:47.317818 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8/setup-container/0.log" Oct 09 16:48:47 crc kubenswrapper[4719]: I1009 16:48:47.555254 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8/rabbitmq/0.log" Oct 09 16:48:47 crc kubenswrapper[4719]: I1009 16:48:47.569670 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8/setup-container/0.log" Oct 09 16:48:47 crc kubenswrapper[4719]: I1009 16:48:47.588022 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_1df540c9-8b54-44a5-9c5d-03cf736ee67a/setup-container/0.log" Oct 09 16:48:47 crc kubenswrapper[4719]: I1009 16:48:47.848503 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_1df540c9-8b54-44a5-9c5d-03cf736ee67a/setup-container/0.log" Oct 09 16:48:47 crc kubenswrapper[4719]: I1009 16:48:47.855798 
4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_cf67e8f5-acbb-4033-bcca-d9c86d2be88c/setup-container/0.log" Oct 09 16:48:47 crc kubenswrapper[4719]: I1009 16:48:47.884193 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_1df540c9-8b54-44a5-9c5d-03cf736ee67a/rabbitmq/0.log" Oct 09 16:48:48 crc kubenswrapper[4719]: I1009 16:48:48.208561 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_cf67e8f5-acbb-4033-bcca-d9c86d2be88c/rabbitmq/0.log" Oct 09 16:48:48 crc kubenswrapper[4719]: I1009 16:48:48.210984 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_cf67e8f5-acbb-4033-bcca-d9c86d2be88c/setup-container/0.log" Oct 09 16:48:48 crc kubenswrapper[4719]: I1009 16:48:48.292594 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-s5j4v_39e39eb0-02e7-46b7-82be-38cbb9e1bf19/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 16:48:48 crc kubenswrapper[4719]: I1009 16:48:48.514280 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-g2mb5_2cbe17ac-7862-4175-9d90-10fe6c51cfb4/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 16:48:48 crc kubenswrapper[4719]: I1009 16:48:48.569448 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-48j65_35cce4cf-e1ff-44fb-9f62-887951a77275/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 16:48:48 crc kubenswrapper[4719]: I1009 16:48:48.769828 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-h4kn5_39fd920e-4d39-4926-b9d2-3c3c02ebb9ed/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 16:48:48 crc kubenswrapper[4719]: I1009 16:48:48.821872 4719 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-hfwbm_7df6a296-587c-407c-b2b4-ec923cd05cda/ssh-known-hosts-edpm-deployment/0.log" Oct 09 16:48:49 crc kubenswrapper[4719]: I1009 16:48:49.275209 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-556bc79449-9bdkb_6cbef595-0a78-4655-85ca-b329f51067af/proxy-httpd/0.log" Oct 09 16:48:49 crc kubenswrapper[4719]: I1009 16:48:49.302360 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-556bc79449-9bdkb_6cbef595-0a78-4655-85ca-b329f51067af/proxy-server/0.log" Oct 09 16:48:49 crc kubenswrapper[4719]: I1009 16:48:49.427054 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-dd7hf_5685c463-d342-436a-a619-f809a2559691/swift-ring-rebalance/0.log" Oct 09 16:48:49 crc kubenswrapper[4719]: I1009 16:48:49.507665 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0358ac72-8c58-4e63-843e-b9eaa35aefdf/account-auditor/0.log" Oct 09 16:48:49 crc kubenswrapper[4719]: I1009 16:48:49.620191 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0358ac72-8c58-4e63-843e-b9eaa35aefdf/account-reaper/0.log" Oct 09 16:48:49 crc kubenswrapper[4719]: I1009 16:48:49.685092 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0358ac72-8c58-4e63-843e-b9eaa35aefdf/account-replicator/0.log" Oct 09 16:48:49 crc kubenswrapper[4719]: I1009 16:48:49.742619 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0358ac72-8c58-4e63-843e-b9eaa35aefdf/account-server/0.log" Oct 09 16:48:49 crc kubenswrapper[4719]: I1009 16:48:49.930776 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0358ac72-8c58-4e63-843e-b9eaa35aefdf/container-auditor/0.log" Oct 09 16:48:49 crc kubenswrapper[4719]: I1009 16:48:49.936621 4719 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_swift-storage-0_0358ac72-8c58-4e63-843e-b9eaa35aefdf/container-replicator/0.log" Oct 09 16:48:49 crc kubenswrapper[4719]: I1009 16:48:49.953992 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0358ac72-8c58-4e63-843e-b9eaa35aefdf/container-server/0.log" Oct 09 16:48:50 crc kubenswrapper[4719]: I1009 16:48:50.039937 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0358ac72-8c58-4e63-843e-b9eaa35aefdf/container-updater/0.log" Oct 09 16:48:50 crc kubenswrapper[4719]: I1009 16:48:50.135086 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0358ac72-8c58-4e63-843e-b9eaa35aefdf/object-expirer/0.log" Oct 09 16:48:50 crc kubenswrapper[4719]: I1009 16:48:50.195590 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0358ac72-8c58-4e63-843e-b9eaa35aefdf/object-auditor/0.log" Oct 09 16:48:50 crc kubenswrapper[4719]: I1009 16:48:50.208964 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0358ac72-8c58-4e63-843e-b9eaa35aefdf/object-replicator/0.log" Oct 09 16:48:50 crc kubenswrapper[4719]: I1009 16:48:50.260382 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0358ac72-8c58-4e63-843e-b9eaa35aefdf/object-server/0.log" Oct 09 16:48:50 crc kubenswrapper[4719]: I1009 16:48:50.418608 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0358ac72-8c58-4e63-843e-b9eaa35aefdf/object-updater/0.log" Oct 09 16:48:50 crc kubenswrapper[4719]: I1009 16:48:50.420360 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0358ac72-8c58-4e63-843e-b9eaa35aefdf/swift-recon-cron/0.log" Oct 09 16:48:50 crc kubenswrapper[4719]: I1009 16:48:50.464786 4719 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_0358ac72-8c58-4e63-843e-b9eaa35aefdf/rsync/0.log" Oct 09 16:48:50 crc kubenswrapper[4719]: I1009 16:48:50.748120 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt_0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 16:48:50 crc kubenswrapper[4719]: I1009 16:48:50.753271 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_8905824a-8f15-4df7-b938-b63f2a5aebb1/tempest-tests-tempest-tests-runner/0.log" Oct 09 16:48:50 crc kubenswrapper[4719]: I1009 16:48:50.961043 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_0677443b-23bd-4727-a28c-34f602835052/test-operator-logs-container/0.log" Oct 09 16:48:51 crc kubenswrapper[4719]: I1009 16:48:51.084945 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-mbmpb_506813a5-78ae-4083-8d8f-27f6a46858c8/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 16:48:51 crc kubenswrapper[4719]: I1009 16:48:51.993572 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_b9629af3-81da-4d90-a2a8-735ac9bdaeb2/watcher-applier/0.log" Oct 09 16:48:52 crc kubenswrapper[4719]: I1009 16:48:52.403426 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_ead85b36-611c-47d5-8eb2-cddfecffaa77/watcher-api-log/0.log" Oct 09 16:48:54 crc kubenswrapper[4719]: I1009 16:48:54.164981 4719 scope.go:117] "RemoveContainer" containerID="d7d38bf7ba8f9d934644b1093a0a8aa65e0c062854c0b93f0aea2fed354e8199" Oct 09 16:48:54 crc kubenswrapper[4719]: E1009 16:48:54.165537 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:48:55 crc kubenswrapper[4719]: I1009 16:48:55.245889 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_c5d4f8ef-6b73-4d97-9899-49865bf6d744/watcher-decision-engine/0.log" Oct 09 16:48:56 crc kubenswrapper[4719]: I1009 16:48:56.273813 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_ead85b36-611c-47d5-8eb2-cddfecffaa77/watcher-api/0.log" Oct 09 16:49:01 crc kubenswrapper[4719]: I1009 16:49:01.032668 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_c7495027-5c56-46e2-9947-1ad2d6bcaf28/memcached/0.log" Oct 09 16:49:01 crc kubenswrapper[4719]: I1009 16:49:01.918332 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ks5dc"] Oct 09 16:49:01 crc kubenswrapper[4719]: E1009 16:49:01.919057 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52293f0f-00fb-4fc4-ab57-a1538fe759b9" containerName="container-00" Oct 09 16:49:01 crc kubenswrapper[4719]: I1009 16:49:01.919072 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="52293f0f-00fb-4fc4-ab57-a1538fe759b9" containerName="container-00" Oct 09 16:49:01 crc kubenswrapper[4719]: I1009 16:49:01.919281 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="52293f0f-00fb-4fc4-ab57-a1538fe759b9" containerName="container-00" Oct 09 16:49:01 crc kubenswrapper[4719]: I1009 16:49:01.920718 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ks5dc" Oct 09 16:49:01 crc kubenswrapper[4719]: I1009 16:49:01.932492 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ks5dc"] Oct 09 16:49:02 crc kubenswrapper[4719]: I1009 16:49:02.031006 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccpvg\" (UniqueName: \"kubernetes.io/projected/b03be8d2-2a64-413e-84bc-ea2dbb249d44-kube-api-access-ccpvg\") pod \"certified-operators-ks5dc\" (UID: \"b03be8d2-2a64-413e-84bc-ea2dbb249d44\") " pod="openshift-marketplace/certified-operators-ks5dc" Oct 09 16:49:02 crc kubenswrapper[4719]: I1009 16:49:02.031107 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b03be8d2-2a64-413e-84bc-ea2dbb249d44-catalog-content\") pod \"certified-operators-ks5dc\" (UID: \"b03be8d2-2a64-413e-84bc-ea2dbb249d44\") " pod="openshift-marketplace/certified-operators-ks5dc" Oct 09 16:49:02 crc kubenswrapper[4719]: I1009 16:49:02.031204 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b03be8d2-2a64-413e-84bc-ea2dbb249d44-utilities\") pod \"certified-operators-ks5dc\" (UID: \"b03be8d2-2a64-413e-84bc-ea2dbb249d44\") " pod="openshift-marketplace/certified-operators-ks5dc" Oct 09 16:49:02 crc kubenswrapper[4719]: I1009 16:49:02.133066 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b03be8d2-2a64-413e-84bc-ea2dbb249d44-utilities\") pod \"certified-operators-ks5dc\" (UID: \"b03be8d2-2a64-413e-84bc-ea2dbb249d44\") " pod="openshift-marketplace/certified-operators-ks5dc" Oct 09 16:49:02 crc kubenswrapper[4719]: I1009 16:49:02.133263 4719 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ccpvg\" (UniqueName: \"kubernetes.io/projected/b03be8d2-2a64-413e-84bc-ea2dbb249d44-kube-api-access-ccpvg\") pod \"certified-operators-ks5dc\" (UID: \"b03be8d2-2a64-413e-84bc-ea2dbb249d44\") " pod="openshift-marketplace/certified-operators-ks5dc" Oct 09 16:49:02 crc kubenswrapper[4719]: I1009 16:49:02.133334 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b03be8d2-2a64-413e-84bc-ea2dbb249d44-catalog-content\") pod \"certified-operators-ks5dc\" (UID: \"b03be8d2-2a64-413e-84bc-ea2dbb249d44\") " pod="openshift-marketplace/certified-operators-ks5dc" Oct 09 16:49:02 crc kubenswrapper[4719]: I1009 16:49:02.133809 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b03be8d2-2a64-413e-84bc-ea2dbb249d44-utilities\") pod \"certified-operators-ks5dc\" (UID: \"b03be8d2-2a64-413e-84bc-ea2dbb249d44\") " pod="openshift-marketplace/certified-operators-ks5dc" Oct 09 16:49:02 crc kubenswrapper[4719]: I1009 16:49:02.134109 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b03be8d2-2a64-413e-84bc-ea2dbb249d44-catalog-content\") pod \"certified-operators-ks5dc\" (UID: \"b03be8d2-2a64-413e-84bc-ea2dbb249d44\") " pod="openshift-marketplace/certified-operators-ks5dc" Oct 09 16:49:02 crc kubenswrapper[4719]: I1009 16:49:02.153502 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccpvg\" (UniqueName: \"kubernetes.io/projected/b03be8d2-2a64-413e-84bc-ea2dbb249d44-kube-api-access-ccpvg\") pod \"certified-operators-ks5dc\" (UID: \"b03be8d2-2a64-413e-84bc-ea2dbb249d44\") " pod="openshift-marketplace/certified-operators-ks5dc" Oct 09 16:49:02 crc kubenswrapper[4719]: I1009 16:49:02.241374 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ks5dc" Oct 09 16:49:02 crc kubenswrapper[4719]: I1009 16:49:02.841816 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ks5dc"] Oct 09 16:49:03 crc kubenswrapper[4719]: I1009 16:49:03.285668 4719 generic.go:334] "Generic (PLEG): container finished" podID="b03be8d2-2a64-413e-84bc-ea2dbb249d44" containerID="fa4a27f79844d505f5009956910759499fcbaf9959ed14b64af173af6b908b81" exitCode=0 Oct 09 16:49:03 crc kubenswrapper[4719]: I1009 16:49:03.285716 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ks5dc" event={"ID":"b03be8d2-2a64-413e-84bc-ea2dbb249d44","Type":"ContainerDied","Data":"fa4a27f79844d505f5009956910759499fcbaf9959ed14b64af173af6b908b81"} Oct 09 16:49:03 crc kubenswrapper[4719]: I1009 16:49:03.286960 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ks5dc" event={"ID":"b03be8d2-2a64-413e-84bc-ea2dbb249d44","Type":"ContainerStarted","Data":"27bd96db9cd7a204fe4d64a822b66e5d67ff57d7604c5bd6d723f3e19df46365"} Oct 09 16:49:05 crc kubenswrapper[4719]: I1009 16:49:05.312695 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ks5dc" event={"ID":"b03be8d2-2a64-413e-84bc-ea2dbb249d44","Type":"ContainerStarted","Data":"a2abfb50afc7c7abe30d833b9edc5667230c919eec77d2b60437b29c84506d3d"} Oct 09 16:49:06 crc kubenswrapper[4719]: I1009 16:49:06.322792 4719 generic.go:334] "Generic (PLEG): container finished" podID="b03be8d2-2a64-413e-84bc-ea2dbb249d44" containerID="a2abfb50afc7c7abe30d833b9edc5667230c919eec77d2b60437b29c84506d3d" exitCode=0 Oct 09 16:49:06 crc kubenswrapper[4719]: I1009 16:49:06.322894 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ks5dc" 
event={"ID":"b03be8d2-2a64-413e-84bc-ea2dbb249d44","Type":"ContainerDied","Data":"a2abfb50afc7c7abe30d833b9edc5667230c919eec77d2b60437b29c84506d3d"} Oct 09 16:49:07 crc kubenswrapper[4719]: I1009 16:49:07.161871 4719 scope.go:117] "RemoveContainer" containerID="d7d38bf7ba8f9d934644b1093a0a8aa65e0c062854c0b93f0aea2fed354e8199" Oct 09 16:49:07 crc kubenswrapper[4719]: E1009 16:49:07.162366 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:49:08 crc kubenswrapper[4719]: I1009 16:49:08.341581 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ks5dc" event={"ID":"b03be8d2-2a64-413e-84bc-ea2dbb249d44","Type":"ContainerStarted","Data":"35dee1e2a0623ab189c843dcc22306f7b0b7f09c47c3460371d2428a724c862f"} Oct 09 16:49:08 crc kubenswrapper[4719]: I1009 16:49:08.371305 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ks5dc" podStartSLOduration=3.179187378 podStartE2EDuration="7.371284907s" podCreationTimestamp="2025-10-09 16:49:01 +0000 UTC" firstStartedPulling="2025-10-09 16:49:03.288214044 +0000 UTC m=+5448.797925329" lastFinishedPulling="2025-10-09 16:49:07.480311573 +0000 UTC m=+5452.990022858" observedRunningTime="2025-10-09 16:49:08.367342592 +0000 UTC m=+5453.877053877" watchObservedRunningTime="2025-10-09 16:49:08.371284907 +0000 UTC m=+5453.880996182" Oct 09 16:49:12 crc kubenswrapper[4719]: I1009 16:49:12.241771 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ks5dc" Oct 09 16:49:12 crc 
kubenswrapper[4719]: I1009 16:49:12.242105 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ks5dc" Oct 09 16:49:12 crc kubenswrapper[4719]: I1009 16:49:12.316807 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ks5dc" Oct 09 16:49:12 crc kubenswrapper[4719]: I1009 16:49:12.449320 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ks5dc" Oct 09 16:49:12 crc kubenswrapper[4719]: I1009 16:49:12.561801 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ks5dc"] Oct 09 16:49:14 crc kubenswrapper[4719]: I1009 16:49:14.388785 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ks5dc" podUID="b03be8d2-2a64-413e-84bc-ea2dbb249d44" containerName="registry-server" containerID="cri-o://35dee1e2a0623ab189c843dcc22306f7b0b7f09c47c3460371d2428a724c862f" gracePeriod=2 Oct 09 16:49:14 crc kubenswrapper[4719]: I1009 16:49:14.850895 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ks5dc" Oct 09 16:49:14 crc kubenswrapper[4719]: I1009 16:49:14.999095 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccpvg\" (UniqueName: \"kubernetes.io/projected/b03be8d2-2a64-413e-84bc-ea2dbb249d44-kube-api-access-ccpvg\") pod \"b03be8d2-2a64-413e-84bc-ea2dbb249d44\" (UID: \"b03be8d2-2a64-413e-84bc-ea2dbb249d44\") " Oct 09 16:49:14 crc kubenswrapper[4719]: I1009 16:49:14.999185 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b03be8d2-2a64-413e-84bc-ea2dbb249d44-utilities\") pod \"b03be8d2-2a64-413e-84bc-ea2dbb249d44\" (UID: \"b03be8d2-2a64-413e-84bc-ea2dbb249d44\") " Oct 09 16:49:14 crc kubenswrapper[4719]: I1009 16:49:14.999452 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b03be8d2-2a64-413e-84bc-ea2dbb249d44-catalog-content\") pod \"b03be8d2-2a64-413e-84bc-ea2dbb249d44\" (UID: \"b03be8d2-2a64-413e-84bc-ea2dbb249d44\") " Oct 09 16:49:15 crc kubenswrapper[4719]: I1009 16:49:15.000095 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b03be8d2-2a64-413e-84bc-ea2dbb249d44-utilities" (OuterVolumeSpecName: "utilities") pod "b03be8d2-2a64-413e-84bc-ea2dbb249d44" (UID: "b03be8d2-2a64-413e-84bc-ea2dbb249d44"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:49:15 crc kubenswrapper[4719]: I1009 16:49:15.015279 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b03be8d2-2a64-413e-84bc-ea2dbb249d44-kube-api-access-ccpvg" (OuterVolumeSpecName: "kube-api-access-ccpvg") pod "b03be8d2-2a64-413e-84bc-ea2dbb249d44" (UID: "b03be8d2-2a64-413e-84bc-ea2dbb249d44"). InnerVolumeSpecName "kube-api-access-ccpvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 16:49:15 crc kubenswrapper[4719]: I1009 16:49:15.043057 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b03be8d2-2a64-413e-84bc-ea2dbb249d44-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b03be8d2-2a64-413e-84bc-ea2dbb249d44" (UID: "b03be8d2-2a64-413e-84bc-ea2dbb249d44"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:49:15 crc kubenswrapper[4719]: I1009 16:49:15.102165 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b03be8d2-2a64-413e-84bc-ea2dbb249d44-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 16:49:15 crc kubenswrapper[4719]: I1009 16:49:15.102210 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccpvg\" (UniqueName: \"kubernetes.io/projected/b03be8d2-2a64-413e-84bc-ea2dbb249d44-kube-api-access-ccpvg\") on node \"crc\" DevicePath \"\"" Oct 09 16:49:15 crc kubenswrapper[4719]: I1009 16:49:15.102224 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b03be8d2-2a64-413e-84bc-ea2dbb249d44-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 16:49:15 crc kubenswrapper[4719]: I1009 16:49:15.402614 4719 generic.go:334] "Generic (PLEG): container finished" podID="b03be8d2-2a64-413e-84bc-ea2dbb249d44" containerID="35dee1e2a0623ab189c843dcc22306f7b0b7f09c47c3460371d2428a724c862f" exitCode=0 Oct 09 16:49:15 crc kubenswrapper[4719]: I1009 16:49:15.402677 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ks5dc" event={"ID":"b03be8d2-2a64-413e-84bc-ea2dbb249d44","Type":"ContainerDied","Data":"35dee1e2a0623ab189c843dcc22306f7b0b7f09c47c3460371d2428a724c862f"} Oct 09 16:49:15 crc kubenswrapper[4719]: I1009 16:49:15.402727 4719 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-ks5dc" event={"ID":"b03be8d2-2a64-413e-84bc-ea2dbb249d44","Type":"ContainerDied","Data":"27bd96db9cd7a204fe4d64a822b66e5d67ff57d7604c5bd6d723f3e19df46365"} Oct 09 16:49:15 crc kubenswrapper[4719]: I1009 16:49:15.402729 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ks5dc" Oct 09 16:49:15 crc kubenswrapper[4719]: I1009 16:49:15.402745 4719 scope.go:117] "RemoveContainer" containerID="35dee1e2a0623ab189c843dcc22306f7b0b7f09c47c3460371d2428a724c862f" Oct 09 16:49:15 crc kubenswrapper[4719]: I1009 16:49:15.426989 4719 scope.go:117] "RemoveContainer" containerID="a2abfb50afc7c7abe30d833b9edc5667230c919eec77d2b60437b29c84506d3d" Oct 09 16:49:15 crc kubenswrapper[4719]: I1009 16:49:15.439898 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ks5dc"] Oct 09 16:49:15 crc kubenswrapper[4719]: I1009 16:49:15.449498 4719 scope.go:117] "RemoveContainer" containerID="fa4a27f79844d505f5009956910759499fcbaf9959ed14b64af173af6b908b81" Oct 09 16:49:15 crc kubenswrapper[4719]: I1009 16:49:15.451525 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ks5dc"] Oct 09 16:49:15 crc kubenswrapper[4719]: I1009 16:49:15.502607 4719 scope.go:117] "RemoveContainer" containerID="35dee1e2a0623ab189c843dcc22306f7b0b7f09c47c3460371d2428a724c862f" Oct 09 16:49:15 crc kubenswrapper[4719]: E1009 16:49:15.504408 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35dee1e2a0623ab189c843dcc22306f7b0b7f09c47c3460371d2428a724c862f\": container with ID starting with 35dee1e2a0623ab189c843dcc22306f7b0b7f09c47c3460371d2428a724c862f not found: ID does not exist" containerID="35dee1e2a0623ab189c843dcc22306f7b0b7f09c47c3460371d2428a724c862f" Oct 09 16:49:15 crc kubenswrapper[4719]: I1009 
16:49:15.504448 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35dee1e2a0623ab189c843dcc22306f7b0b7f09c47c3460371d2428a724c862f"} err="failed to get container status \"35dee1e2a0623ab189c843dcc22306f7b0b7f09c47c3460371d2428a724c862f\": rpc error: code = NotFound desc = could not find container \"35dee1e2a0623ab189c843dcc22306f7b0b7f09c47c3460371d2428a724c862f\": container with ID starting with 35dee1e2a0623ab189c843dcc22306f7b0b7f09c47c3460371d2428a724c862f not found: ID does not exist" Oct 09 16:49:15 crc kubenswrapper[4719]: I1009 16:49:15.504473 4719 scope.go:117] "RemoveContainer" containerID="a2abfb50afc7c7abe30d833b9edc5667230c919eec77d2b60437b29c84506d3d" Oct 09 16:49:15 crc kubenswrapper[4719]: E1009 16:49:15.506477 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2abfb50afc7c7abe30d833b9edc5667230c919eec77d2b60437b29c84506d3d\": container with ID starting with a2abfb50afc7c7abe30d833b9edc5667230c919eec77d2b60437b29c84506d3d not found: ID does not exist" containerID="a2abfb50afc7c7abe30d833b9edc5667230c919eec77d2b60437b29c84506d3d" Oct 09 16:49:15 crc kubenswrapper[4719]: I1009 16:49:15.506510 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2abfb50afc7c7abe30d833b9edc5667230c919eec77d2b60437b29c84506d3d"} err="failed to get container status \"a2abfb50afc7c7abe30d833b9edc5667230c919eec77d2b60437b29c84506d3d\": rpc error: code = NotFound desc = could not find container \"a2abfb50afc7c7abe30d833b9edc5667230c919eec77d2b60437b29c84506d3d\": container with ID starting with a2abfb50afc7c7abe30d833b9edc5667230c919eec77d2b60437b29c84506d3d not found: ID does not exist" Oct 09 16:49:15 crc kubenswrapper[4719]: I1009 16:49:15.506547 4719 scope.go:117] "RemoveContainer" containerID="fa4a27f79844d505f5009956910759499fcbaf9959ed14b64af173af6b908b81" Oct 09 16:49:15 crc 
kubenswrapper[4719]: E1009 16:49:15.506889 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa4a27f79844d505f5009956910759499fcbaf9959ed14b64af173af6b908b81\": container with ID starting with fa4a27f79844d505f5009956910759499fcbaf9959ed14b64af173af6b908b81 not found: ID does not exist" containerID="fa4a27f79844d505f5009956910759499fcbaf9959ed14b64af173af6b908b81" Oct 09 16:49:15 crc kubenswrapper[4719]: I1009 16:49:15.507102 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa4a27f79844d505f5009956910759499fcbaf9959ed14b64af173af6b908b81"} err="failed to get container status \"fa4a27f79844d505f5009956910759499fcbaf9959ed14b64af173af6b908b81\": rpc error: code = NotFound desc = could not find container \"fa4a27f79844d505f5009956910759499fcbaf9959ed14b64af173af6b908b81\": container with ID starting with fa4a27f79844d505f5009956910759499fcbaf9959ed14b64af173af6b908b81 not found: ID does not exist" Oct 09 16:49:15 crc kubenswrapper[4719]: E1009 16:49:15.639526 4719 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb03be8d2_2a64_413e_84bc_ea2dbb249d44.slice/crio-27bd96db9cd7a204fe4d64a822b66e5d67ff57d7604c5bd6d723f3e19df46365\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb03be8d2_2a64_413e_84bc_ea2dbb249d44.slice\": RecentStats: unable to find data in memory cache]" Oct 09 16:49:17 crc kubenswrapper[4719]: I1009 16:49:17.177957 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b03be8d2-2a64-413e-84bc-ea2dbb249d44" path="/var/lib/kubelet/pods/b03be8d2-2a64-413e-84bc-ea2dbb249d44/volumes" Oct 09 16:49:20 crc kubenswrapper[4719]: I1009 16:49:20.360342 4719 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt_57fba553-2ed7-4d57-94af-f8322ebd87d3/util/0.log" Oct 09 16:49:20 crc kubenswrapper[4719]: I1009 16:49:20.575677 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt_57fba553-2ed7-4d57-94af-f8322ebd87d3/pull/0.log" Oct 09 16:49:20 crc kubenswrapper[4719]: I1009 16:49:20.579972 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt_57fba553-2ed7-4d57-94af-f8322ebd87d3/pull/0.log" Oct 09 16:49:20 crc kubenswrapper[4719]: I1009 16:49:20.585298 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt_57fba553-2ed7-4d57-94af-f8322ebd87d3/util/0.log" Oct 09 16:49:20 crc kubenswrapper[4719]: I1009 16:49:20.749441 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt_57fba553-2ed7-4d57-94af-f8322ebd87d3/pull/0.log" Oct 09 16:49:20 crc kubenswrapper[4719]: I1009 16:49:20.752652 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt_57fba553-2ed7-4d57-94af-f8322ebd87d3/extract/0.log" Oct 09 16:49:20 crc kubenswrapper[4719]: I1009 16:49:20.755394 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt_57fba553-2ed7-4d57-94af-f8322ebd87d3/util/0.log" Oct 09 16:49:20 crc kubenswrapper[4719]: I1009 16:49:20.978116 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-kxkjp_ab93ff28-c8ec-4514-bd82-dbab0fe25cee/kube-rbac-proxy/0.log" Oct 09 16:49:21 crc 
kubenswrapper[4719]: I1009 16:49:21.008478 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-kxkjp_ab93ff28-c8ec-4514-bd82-dbab0fe25cee/manager/0.log" Oct 09 16:49:21 crc kubenswrapper[4719]: I1009 16:49:21.064239 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-pnb2g_f4b78ea6-51d8-4a7a-b5d3-cc4bdc3b5ba4/kube-rbac-proxy/0.log" Oct 09 16:49:21 crc kubenswrapper[4719]: I1009 16:49:21.194721 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-pnb2g_f4b78ea6-51d8-4a7a-b5d3-cc4bdc3b5ba4/manager/0.log" Oct 09 16:49:21 crc kubenswrapper[4719]: I1009 16:49:21.222056 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-wzn6r_f013ff43-3cb6-47f5-bc35-a4bf02143db0/kube-rbac-proxy/0.log" Oct 09 16:49:21 crc kubenswrapper[4719]: I1009 16:49:21.278796 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-wzn6r_f013ff43-3cb6-47f5-bc35-a4bf02143db0/manager/0.log" Oct 09 16:49:21 crc kubenswrapper[4719]: I1009 16:49:21.368512 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-pvjzc_d5e1695b-e7fb-4c23-9848-c6abacde588c/kube-rbac-proxy/0.log" Oct 09 16:49:21 crc kubenswrapper[4719]: I1009 16:49:21.510996 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-pvjzc_d5e1695b-e7fb-4c23-9848-c6abacde588c/manager/0.log" Oct 09 16:49:21 crc kubenswrapper[4719]: I1009 16:49:21.537524 4719 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-z4mpg_582c5c2a-a5b2-43bf-bbdb-4c3fb1b21c09/kube-rbac-proxy/0.log" Oct 09 16:49:21 crc kubenswrapper[4719]: I1009 16:49:21.588204 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-z4mpg_582c5c2a-a5b2-43bf-bbdb-4c3fb1b21c09/manager/0.log" Oct 09 16:49:21 crc kubenswrapper[4719]: I1009 16:49:21.693148 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-jg6r2_64ce70f3-641d-4dfd-811e-c786365c9859/kube-rbac-proxy/0.log" Oct 09 16:49:21 crc kubenswrapper[4719]: I1009 16:49:21.748426 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-jg6r2_64ce70f3-641d-4dfd-811e-c786365c9859/manager/0.log" Oct 09 16:49:21 crc kubenswrapper[4719]: I1009 16:49:21.903679 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-jzwqf_3bc5e8dd-bc95-4b65-afda-a821512a89dd/kube-rbac-proxy/0.log" Oct 09 16:49:21 crc kubenswrapper[4719]: I1009 16:49:21.983174 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-j945k_8b59c5dc-f309-48cc-9c66-7a5c42050f8e/kube-rbac-proxy/0.log" Oct 09 16:49:22 crc kubenswrapper[4719]: I1009 16:49:22.097304 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-j945k_8b59c5dc-f309-48cc-9c66-7a5c42050f8e/manager/0.log" Oct 09 16:49:22 crc kubenswrapper[4719]: I1009 16:49:22.102237 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-jzwqf_3bc5e8dd-bc95-4b65-afda-a821512a89dd/manager/0.log" Oct 09 16:49:22 crc kubenswrapper[4719]: I1009 16:49:22.161800 4719 
scope.go:117] "RemoveContainer" containerID="d7d38bf7ba8f9d934644b1093a0a8aa65e0c062854c0b93f0aea2fed354e8199" Oct 09 16:49:22 crc kubenswrapper[4719]: E1009 16:49:22.162033 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:49:22 crc kubenswrapper[4719]: I1009 16:49:22.201075 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-6zjhd_651b9dd5-bce9-4ca0-b6f7-cca0c3fb30eb/kube-rbac-proxy/0.log" Oct 09 16:49:22 crc kubenswrapper[4719]: I1009 16:49:22.325123 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-6zjhd_651b9dd5-bce9-4ca0-b6f7-cca0c3fb30eb/manager/0.log" Oct 09 16:49:22 crc kubenswrapper[4719]: I1009 16:49:22.352217 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-tfk7f_2292f494-d606-40b2-bb8b-7dcc6e9dfeb4/kube-rbac-proxy/0.log" Oct 09 16:49:22 crc kubenswrapper[4719]: I1009 16:49:22.434523 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-tfk7f_2292f494-d606-40b2-bb8b-7dcc6e9dfeb4/manager/0.log" Oct 09 16:49:22 crc kubenswrapper[4719]: I1009 16:49:22.538814 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-zj22f_bc8d9b2a-7a74-40f1-8a70-e8f0013fad38/kube-rbac-proxy/0.log" Oct 09 16:49:22 crc kubenswrapper[4719]: I1009 16:49:22.549189 4719 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-zj22f_bc8d9b2a-7a74-40f1-8a70-e8f0013fad38/manager/0.log"
Oct 09 16:49:22 crc kubenswrapper[4719]: I1009 16:49:22.687366 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-qbtsl_14a3f87a-25c5-476e-8379-0b15d3511315/kube-rbac-proxy/0.log"
Oct 09 16:49:22 crc kubenswrapper[4719]: I1009 16:49:22.769196 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-qbtsl_14a3f87a-25c5-476e-8379-0b15d3511315/manager/0.log"
Oct 09 16:49:22 crc kubenswrapper[4719]: I1009 16:49:22.838823 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-tmxgv_6b82d858-736f-487f-ba35-c1478301b229/kube-rbac-proxy/0.log"
Oct 09 16:49:22 crc kubenswrapper[4719]: I1009 16:49:22.960668 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-tmxgv_6b82d858-736f-487f-ba35-c1478301b229/manager/0.log"
Oct 09 16:49:23 crc kubenswrapper[4719]: I1009 16:49:23.001660 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-d284h_308fe096-8aff-4a3b-a83a-bb2b1ef8c5df/kube-rbac-proxy/0.log"
Oct 09 16:49:23 crc kubenswrapper[4719]: I1009 16:49:23.023090 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-d284h_308fe096-8aff-4a3b-a83a-bb2b1ef8c5df/manager/0.log"
Oct 09 16:49:23 crc kubenswrapper[4719]: I1009 16:49:23.186301 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757dx5hrk_5584dd28-d59b-41bf-b24a-ec18d01029e1/kube-rbac-proxy/0.log"
Oct 09 16:49:23 crc kubenswrapper[4719]: I1009 16:49:23.192132 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757dx5hrk_5584dd28-d59b-41bf-b24a-ec18d01029e1/manager/0.log"
Oct 09 16:49:23 crc kubenswrapper[4719]: I1009 16:49:23.431563 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-67f69c4d95-5p5fq_406c7514-3092-45dc-abde-352acbfa0108/kube-rbac-proxy/0.log"
Oct 09 16:49:23 crc kubenswrapper[4719]: I1009 16:49:23.563082 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6cf9d6bfd4-rw5j8_a64c087b-46a2-4c1b-abf9-ce21ce6f9688/kube-rbac-proxy/0.log"
Oct 09 16:49:23 crc kubenswrapper[4719]: I1009 16:49:23.845322 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-cmjrg_21fdeec4-a518-4f1f-a27d-50d49e078d3d/registry-server/0.log"
Oct 09 16:49:23 crc kubenswrapper[4719]: I1009 16:49:23.914722 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6cf9d6bfd4-rw5j8_a64c087b-46a2-4c1b-abf9-ce21ce6f9688/operator/0.log"
Oct 09 16:49:24 crc kubenswrapper[4719]: I1009 16:49:24.076706 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-869cc7797f-k8dck_607972ec-63ef-43a7-a1ed-0aab9fffc680/kube-rbac-proxy/0.log"
Oct 09 16:49:24 crc kubenswrapper[4719]: I1009 16:49:24.264160 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-869cc7797f-k8dck_607972ec-63ef-43a7-a1ed-0aab9fffc680/manager/0.log"
Oct 09 16:49:24 crc kubenswrapper[4719]: I1009 16:49:24.312735 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-2td8x_08380711-65b1-4957-80ba-36c2f064e618/kube-rbac-proxy/0.log"
Oct 09 16:49:24 crc kubenswrapper[4719]: I1009 16:49:24.402604 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-2td8x_08380711-65b1-4957-80ba-36c2f064e618/manager/0.log"
Oct 09 16:49:24 crc kubenswrapper[4719]: I1009 16:49:24.487463 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-r4m97_6776ccc8-9114-46e5-a2a2-699f8917bfac/operator/0.log"
Oct 09 16:49:24 crc kubenswrapper[4719]: I1009 16:49:24.630029 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-lh9st_6d60ce50-53c4-47c1-b222-88b92c43fd4d/kube-rbac-proxy/0.log"
Oct 09 16:49:24 crc kubenswrapper[4719]: I1009 16:49:24.771857 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-lh9st_6d60ce50-53c4-47c1-b222-88b92c43fd4d/manager/0.log"
Oct 09 16:49:24 crc kubenswrapper[4719]: I1009 16:49:24.775848 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-67f69c4d95-5p5fq_406c7514-3092-45dc-abde-352acbfa0108/manager/0.log"
Oct 09 16:49:24 crc kubenswrapper[4719]: I1009 16:49:24.821895 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-578874c84d-w6fj7_c56b1641-8023-4761-a55f-763dfe5f7c4f/kube-rbac-proxy/0.log"
Oct 09 16:49:24 crc kubenswrapper[4719]: I1009 16:49:24.967827 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-ffcdd6c94-q9cm2_288a232e-38ff-44b7-9fda-738becefc8d7/kube-rbac-proxy/0.log"
Oct 09 16:49:25 crc kubenswrapper[4719]: I1009 16:49:25.008163 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-ffcdd6c94-q9cm2_288a232e-38ff-44b7-9fda-738becefc8d7/manager/0.log"
Oct 09 16:49:25 crc kubenswrapper[4719]: I1009 16:49:25.095152 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-578874c84d-w6fj7_c56b1641-8023-4761-a55f-763dfe5f7c4f/manager/0.log"
Oct 09 16:49:25 crc kubenswrapper[4719]: I1009 16:49:25.179002 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-cc79478c-885gj_44ee4b27-7bdd-4a5e-98ed-1b8b5f01b54f/kube-rbac-proxy/0.log"
Oct 09 16:49:25 crc kubenswrapper[4719]: I1009 16:49:25.247788 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-cc79478c-885gj_44ee4b27-7bdd-4a5e-98ed-1b8b5f01b54f/manager/0.log"
Oct 09 16:49:25 crc kubenswrapper[4719]: E1009 16:49:25.904111 4719 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb03be8d2_2a64_413e_84bc_ea2dbb249d44.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb03be8d2_2a64_413e_84bc_ea2dbb249d44.slice/crio-27bd96db9cd7a204fe4d64a822b66e5d67ff57d7604c5bd6d723f3e19df46365\": RecentStats: unable to find data in memory cache]"
Oct 09 16:49:36 crc kubenswrapper[4719]: E1009 16:49:36.197518 4719 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb03be8d2_2a64_413e_84bc_ea2dbb249d44.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb03be8d2_2a64_413e_84bc_ea2dbb249d44.slice/crio-27bd96db9cd7a204fe4d64a822b66e5d67ff57d7604c5bd6d723f3e19df46365\": RecentStats: unable to find data in memory cache]"
Oct 09 16:49:37 crc kubenswrapper[4719]: I1009 16:49:37.161666 4719 scope.go:117] "RemoveContainer" containerID="d7d38bf7ba8f9d934644b1093a0a8aa65e0c062854c0b93f0aea2fed354e8199"
Oct 09 16:49:37 crc kubenswrapper[4719]: E1009 16:49:37.162150 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71"
Oct 09 16:49:40 crc kubenswrapper[4719]: I1009 16:49:40.019781 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-2d4s7_4b4724c8-6007-4df3-b822-42d08ea33fde/control-plane-machine-set-operator/0.log"
Oct 09 16:49:40 crc kubenswrapper[4719]: I1009 16:49:40.182438 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-75q5v_f07d2126-7037-4b5c-aa67-4d09bf873e07/kube-rbac-proxy/0.log"
Oct 09 16:49:40 crc kubenswrapper[4719]: I1009 16:49:40.222547 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-75q5v_f07d2126-7037-4b5c-aa67-4d09bf873e07/machine-api-operator/0.log"
Oct 09 16:49:46 crc kubenswrapper[4719]: E1009 16:49:46.467369 4719 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb03be8d2_2a64_413e_84bc_ea2dbb249d44.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb03be8d2_2a64_413e_84bc_ea2dbb249d44.slice/crio-27bd96db9cd7a204fe4d64a822b66e5d67ff57d7604c5bd6d723f3e19df46365\": RecentStats: unable to find data in memory cache]"
Oct 09 16:49:50 crc kubenswrapper[4719]: I1009 16:49:50.971933 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-2kc72_419868d8-7886-45fb-be57-2c476ba8d305/cert-manager-controller/0.log"
Oct 09 16:49:51 crc kubenswrapper[4719]: I1009 16:49:51.161391 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-mr6c9_a2b1f94e-9754-4aeb-9d99-a5c2258290ca/cert-manager-cainjector/0.log"
Oct 09 16:49:51 crc kubenswrapper[4719]: I1009 16:49:51.163827 4719 scope.go:117] "RemoveContainer" containerID="d7d38bf7ba8f9d934644b1093a0a8aa65e0c062854c0b93f0aea2fed354e8199"
Oct 09 16:49:51 crc kubenswrapper[4719]: E1009 16:49:51.164323 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71"
Oct 09 16:49:51 crc kubenswrapper[4719]: I1009 16:49:51.183299 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-s42jm_79d0f7b5-f165-44ee-8220-f31bcc6df1fd/cert-manager-webhook/0.log"
Oct 09 16:49:56 crc kubenswrapper[4719]: E1009 16:49:56.738814 4719 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb03be8d2_2a64_413e_84bc_ea2dbb249d44.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb03be8d2_2a64_413e_84bc_ea2dbb249d44.slice/crio-27bd96db9cd7a204fe4d64a822b66e5d67ff57d7604c5bd6d723f3e19df46365\": RecentStats: unable to find data in memory cache]"
Oct 09 16:50:02 crc kubenswrapper[4719]: I1009 16:50:02.328730 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-jz9ct_db938ad9-d041-4874-855a-83d6fa385b3e/nmstate-console-plugin/0.log"
Oct 09 16:50:02 crc kubenswrapper[4719]: I1009 16:50:02.465152 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-xwghc_feb926a1-9332-41f0-80b7-b100b62f8664/nmstate-handler/0.log"
Oct 09 16:50:02 crc kubenswrapper[4719]: I1009 16:50:02.521432 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-wgq7f_07cbbe5f-3176-4cfa-97e0-a7b3e6613c7b/kube-rbac-proxy/0.log"
Oct 09 16:50:02 crc kubenswrapper[4719]: I1009 16:50:02.561084 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-wgq7f_07cbbe5f-3176-4cfa-97e0-a7b3e6613c7b/nmstate-metrics/0.log"
Oct 09 16:50:02 crc kubenswrapper[4719]: I1009 16:50:02.705935 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-l4s9n_15fe7687-ec6e-42eb-9131-980871159a78/nmstate-operator/0.log"
Oct 09 16:50:02 crc kubenswrapper[4719]: I1009 16:50:02.755245 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-m6lsq_036064c5-e3a3-49a7-b457-5e64df820401/nmstate-webhook/0.log"
Oct 09 16:50:04 crc kubenswrapper[4719]: I1009 16:50:04.161491 4719 scope.go:117] "RemoveContainer" containerID="d7d38bf7ba8f9d934644b1093a0a8aa65e0c062854c0b93f0aea2fed354e8199"
Oct 09 16:50:04 crc kubenswrapper[4719]: E1009 16:50:04.162073 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71"
Oct 09 16:50:07 crc kubenswrapper[4719]: E1009 16:50:07.022492 4719 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb03be8d2_2a64_413e_84bc_ea2dbb249d44.slice/crio-27bd96db9cd7a204fe4d64a822b66e5d67ff57d7604c5bd6d723f3e19df46365\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb03be8d2_2a64_413e_84bc_ea2dbb249d44.slice\": RecentStats: unable to find data in memory cache]"
Oct 09 16:50:15 crc kubenswrapper[4719]: I1009 16:50:15.173603 4719 scope.go:117] "RemoveContainer" containerID="d7d38bf7ba8f9d934644b1093a0a8aa65e0c062854c0b93f0aea2fed354e8199"
Oct 09 16:50:15 crc kubenswrapper[4719]: E1009 16:50:15.174507 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71"
Oct 09 16:50:15 crc kubenswrapper[4719]: I1009 16:50:15.446828 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-42kr4_bee3449b-a86a-4db6-9e57-233f95dfbad0/kube-rbac-proxy/0.log"
Oct 09 16:50:15 crc kubenswrapper[4719]: I1009 16:50:15.584984 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-42kr4_bee3449b-a86a-4db6-9e57-233f95dfbad0/controller/0.log"
Oct 09 16:50:15 crc kubenswrapper[4719]: I1009 16:50:15.628793 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r8hz9_f1c10c76-5d7a-4dbc-8688-2017821c1872/cp-frr-files/0.log"
Oct 09 16:50:15 crc kubenswrapper[4719]: I1009 16:50:15.781210 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r8hz9_f1c10c76-5d7a-4dbc-8688-2017821c1872/cp-frr-files/0.log"
Oct 09 16:50:15 crc kubenswrapper[4719]: I1009 16:50:15.832324 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r8hz9_f1c10c76-5d7a-4dbc-8688-2017821c1872/cp-reloader/0.log"
Oct 09 16:50:15 crc kubenswrapper[4719]: I1009 16:50:15.832876 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r8hz9_f1c10c76-5d7a-4dbc-8688-2017821c1872/cp-reloader/0.log"
Oct 09 16:50:15 crc kubenswrapper[4719]: I1009 16:50:15.862222 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r8hz9_f1c10c76-5d7a-4dbc-8688-2017821c1872/cp-metrics/0.log"
Oct 09 16:50:16 crc kubenswrapper[4719]: I1009 16:50:16.034293 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r8hz9_f1c10c76-5d7a-4dbc-8688-2017821c1872/cp-frr-files/0.log"
Oct 09 16:50:16 crc kubenswrapper[4719]: I1009 16:50:16.035120 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r8hz9_f1c10c76-5d7a-4dbc-8688-2017821c1872/cp-reloader/0.log"
Oct 09 16:50:16 crc kubenswrapper[4719]: I1009 16:50:16.079782 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r8hz9_f1c10c76-5d7a-4dbc-8688-2017821c1872/cp-metrics/0.log"
Oct 09 16:50:16 crc kubenswrapper[4719]: I1009 16:50:16.142634 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r8hz9_f1c10c76-5d7a-4dbc-8688-2017821c1872/cp-metrics/0.log"
Oct 09 16:50:16 crc kubenswrapper[4719]: I1009 16:50:16.230556 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r8hz9_f1c10c76-5d7a-4dbc-8688-2017821c1872/cp-frr-files/0.log"
Oct 09 16:50:16 crc kubenswrapper[4719]: I1009 16:50:16.261801 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r8hz9_f1c10c76-5d7a-4dbc-8688-2017821c1872/cp-metrics/0.log"
Oct 09 16:50:16 crc kubenswrapper[4719]: I1009 16:50:16.268229 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r8hz9_f1c10c76-5d7a-4dbc-8688-2017821c1872/cp-reloader/0.log"
Oct 09 16:50:16 crc kubenswrapper[4719]: I1009 16:50:16.341013 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r8hz9_f1c10c76-5d7a-4dbc-8688-2017821c1872/controller/0.log"
Oct 09 16:50:16 crc kubenswrapper[4719]: I1009 16:50:16.466376 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r8hz9_f1c10c76-5d7a-4dbc-8688-2017821c1872/frr-metrics/0.log"
Oct 09 16:50:16 crc kubenswrapper[4719]: I1009 16:50:16.477794 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r8hz9_f1c10c76-5d7a-4dbc-8688-2017821c1872/kube-rbac-proxy/0.log"
Oct 09 16:50:16 crc kubenswrapper[4719]: I1009 16:50:16.628533 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r8hz9_f1c10c76-5d7a-4dbc-8688-2017821c1872/kube-rbac-proxy-frr/0.log"
Oct 09 16:50:16 crc kubenswrapper[4719]: I1009 16:50:16.659469 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r8hz9_f1c10c76-5d7a-4dbc-8688-2017821c1872/reloader/0.log"
Oct 09 16:50:16 crc kubenswrapper[4719]: I1009 16:50:16.833330 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-wpvvd_1f525824-038f-49b3-9410-10b49819ee01/frr-k8s-webhook-server/0.log"
Oct 09 16:50:17 crc kubenswrapper[4719]: I1009 16:50:17.002976 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-54fb4b8cc7-c4j9j_7d280e60-020e-43c0-a430-fb220b1d8354/manager/0.log"
Oct 09 16:50:17 crc kubenswrapper[4719]: I1009 16:50:17.098977 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-645dbdc857-kl4lw_e3a6e576-c324-4600-b0e9-4a83cd64d478/webhook-server/0.log"
Oct 09 16:50:17 crc kubenswrapper[4719]: I1009 16:50:17.295718 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-9qhkk_6298dd42-080d-4d5e-bf61-c798382943a7/kube-rbac-proxy/0.log"
Oct 09 16:50:17 crc kubenswrapper[4719]: I1009 16:50:17.898599 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-9qhkk_6298dd42-080d-4d5e-bf61-c798382943a7/speaker/0.log"
Oct 09 16:50:18 crc kubenswrapper[4719]: I1009 16:50:18.142570 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r8hz9_f1c10c76-5d7a-4dbc-8688-2017821c1872/frr/0.log"
Oct 09 16:50:26 crc kubenswrapper[4719]: I1009 16:50:26.161555 4719 scope.go:117] "RemoveContainer" containerID="d7d38bf7ba8f9d934644b1093a0a8aa65e0c062854c0b93f0aea2fed354e8199"
Oct 09 16:50:26 crc kubenswrapper[4719]: E1009 16:50:26.162299 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71"
Oct 09 16:50:29 crc kubenswrapper[4719]: I1009 16:50:29.039778 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46_4b460c47-e24c-46c7-bf23-0e5b5d6819bd/util/0.log"
Oct 09 16:50:29 crc kubenswrapper[4719]: I1009 16:50:29.169298 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46_4b460c47-e24c-46c7-bf23-0e5b5d6819bd/util/0.log"
Oct 09 16:50:29 crc kubenswrapper[4719]: I1009 16:50:29.207968 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46_4b460c47-e24c-46c7-bf23-0e5b5d6819bd/pull/0.log"
Oct 09 16:50:29 crc kubenswrapper[4719]: I1009 16:50:29.227282 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46_4b460c47-e24c-46c7-bf23-0e5b5d6819bd/pull/0.log"
Oct 09 16:50:29 crc kubenswrapper[4719]: I1009 16:50:29.424823 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46_4b460c47-e24c-46c7-bf23-0e5b5d6819bd/util/0.log"
Oct 09 16:50:29 crc kubenswrapper[4719]: I1009 16:50:29.432988 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46_4b460c47-e24c-46c7-bf23-0e5b5d6819bd/extract/0.log"
Oct 09 16:50:29 crc kubenswrapper[4719]: I1009 16:50:29.456334 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46_4b460c47-e24c-46c7-bf23-0e5b5d6819bd/pull/0.log"
Oct 09 16:50:29 crc kubenswrapper[4719]: I1009 16:50:29.583824 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4_2c118d4a-6a5b-4138-90ff-2270ea2dabd9/util/0.log"
Oct 09 16:50:29 crc kubenswrapper[4719]: I1009 16:50:29.784406 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4_2c118d4a-6a5b-4138-90ff-2270ea2dabd9/pull/0.log"
Oct 09 16:50:29 crc kubenswrapper[4719]: I1009 16:50:29.790057 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4_2c118d4a-6a5b-4138-90ff-2270ea2dabd9/util/0.log"
Oct 09 16:50:29 crc kubenswrapper[4719]: I1009 16:50:29.792912 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4_2c118d4a-6a5b-4138-90ff-2270ea2dabd9/pull/0.log"
Oct 09 16:50:29 crc kubenswrapper[4719]: I1009 16:50:29.971761 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4_2c118d4a-6a5b-4138-90ff-2270ea2dabd9/util/0.log"
Oct 09 16:50:29 crc kubenswrapper[4719]: I1009 16:50:29.982762 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4_2c118d4a-6a5b-4138-90ff-2270ea2dabd9/pull/0.log"
Oct 09 16:50:30 crc kubenswrapper[4719]: I1009 16:50:30.000314 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4_2c118d4a-6a5b-4138-90ff-2270ea2dabd9/extract/0.log"
Oct 09 16:50:30 crc kubenswrapper[4719]: I1009 16:50:30.150815 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zjkt4_430dd9b6-25a9-482d-8fa6-d2dec5d84507/extract-utilities/0.log"
Oct 09 16:50:30 crc kubenswrapper[4719]: I1009 16:50:30.532866 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zjkt4_430dd9b6-25a9-482d-8fa6-d2dec5d84507/extract-content/0.log"
Oct 09 16:50:30 crc kubenswrapper[4719]: I1009 16:50:30.553938 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zjkt4_430dd9b6-25a9-482d-8fa6-d2dec5d84507/extract-content/0.log"
Oct 09 16:50:30 crc kubenswrapper[4719]: I1009 16:50:30.554803 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zjkt4_430dd9b6-25a9-482d-8fa6-d2dec5d84507/extract-utilities/0.log"
Oct 09 16:50:30 crc kubenswrapper[4719]: I1009 16:50:30.694335 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zjkt4_430dd9b6-25a9-482d-8fa6-d2dec5d84507/extract-utilities/0.log"
Oct 09 16:50:30 crc kubenswrapper[4719]: I1009 16:50:30.712101 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zjkt4_430dd9b6-25a9-482d-8fa6-d2dec5d84507/extract-content/0.log"
Oct 09 16:50:30 crc kubenswrapper[4719]: I1009 16:50:30.928411 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f56vs_400debb1-678f-4731-84d3-8d0b3c455305/extract-utilities/0.log"
Oct 09 16:50:31 crc kubenswrapper[4719]: I1009 16:50:31.222509 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f56vs_400debb1-678f-4731-84d3-8d0b3c455305/extract-content/0.log"
Oct 09 16:50:31 crc kubenswrapper[4719]: I1009 16:50:31.283099 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f56vs_400debb1-678f-4731-84d3-8d0b3c455305/extract-utilities/0.log"
Oct 09 16:50:31 crc kubenswrapper[4719]: I1009 16:50:31.285462 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f56vs_400debb1-678f-4731-84d3-8d0b3c455305/extract-content/0.log"
Oct 09 16:50:31 crc kubenswrapper[4719]: I1009 16:50:31.445853 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zjkt4_430dd9b6-25a9-482d-8fa6-d2dec5d84507/registry-server/0.log"
Oct 09 16:50:31 crc kubenswrapper[4719]: I1009 16:50:31.471830 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f56vs_400debb1-678f-4731-84d3-8d0b3c455305/extract-utilities/0.log"
Oct 09 16:50:31 crc kubenswrapper[4719]: I1009 16:50:31.500911 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f56vs_400debb1-678f-4731-84d3-8d0b3c455305/extract-content/0.log"
Oct 09 16:50:31 crc kubenswrapper[4719]: I1009 16:50:31.739103 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz_57298b91-7d64-40fd-be0e-c400cdfd9b93/util/0.log"
Oct 09 16:50:31 crc kubenswrapper[4719]: I1009 16:50:31.937596 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz_57298b91-7d64-40fd-be0e-c400cdfd9b93/pull/0.log"
Oct 09 16:50:31 crc kubenswrapper[4719]: I1009 16:50:31.969088 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz_57298b91-7d64-40fd-be0e-c400cdfd9b93/pull/0.log"
Oct 09 16:50:31 crc kubenswrapper[4719]: I1009 16:50:31.984111 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz_57298b91-7d64-40fd-be0e-c400cdfd9b93/util/0.log"
Oct 09 16:50:32 crc kubenswrapper[4719]: I1009 16:50:32.283545 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz_57298b91-7d64-40fd-be0e-c400cdfd9b93/pull/0.log"
Oct 09 16:50:32 crc kubenswrapper[4719]: I1009 16:50:32.292675 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz_57298b91-7d64-40fd-be0e-c400cdfd9b93/util/0.log"
Oct 09 16:50:32 crc kubenswrapper[4719]: I1009 16:50:32.351168 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz_57298b91-7d64-40fd-be0e-c400cdfd9b93/extract/0.log"
Oct 09 16:50:32 crc kubenswrapper[4719]: I1009 16:50:32.580643 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f56vs_400debb1-678f-4731-84d3-8d0b3c455305/registry-server/0.log"
Oct 09 16:50:32 crc kubenswrapper[4719]: I1009 16:50:32.584904 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-gj4pz_9080569c-497b-4281-a120-7c538380a16c/marketplace-operator/0.log"
Oct 09 16:50:32 crc kubenswrapper[4719]: I1009 16:50:32.718137 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8d5wd_f4c82774-ac3f-4330-b575-1cfb72f5dbf7/extract-utilities/0.log"
Oct 09 16:50:32 crc kubenswrapper[4719]: I1009 16:50:32.886610 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8d5wd_f4c82774-ac3f-4330-b575-1cfb72f5dbf7/extract-utilities/0.log"
Oct 09 16:50:32 crc kubenswrapper[4719]: I1009 16:50:32.913998 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8d5wd_f4c82774-ac3f-4330-b575-1cfb72f5dbf7/extract-content/0.log"
Oct 09 16:50:32 crc kubenswrapper[4719]: I1009 16:50:32.940980 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8d5wd_f4c82774-ac3f-4330-b575-1cfb72f5dbf7/extract-content/0.log"
Oct 09 16:50:33 crc kubenswrapper[4719]: I1009 16:50:33.092960 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8d5wd_f4c82774-ac3f-4330-b575-1cfb72f5dbf7/extract-utilities/0.log"
Oct 09 16:50:33 crc kubenswrapper[4719]: I1009 16:50:33.183271 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8d5wd_f4c82774-ac3f-4330-b575-1cfb72f5dbf7/extract-content/0.log"
Oct 09 16:50:33 crc kubenswrapper[4719]: I1009 16:50:33.209251 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vk29b_0ad43ecb-75f5-4453-89e5-2c7891c537a7/extract-utilities/0.log"
Oct 09 16:50:33 crc kubenswrapper[4719]: I1009 16:50:33.285622 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8d5wd_f4c82774-ac3f-4330-b575-1cfb72f5dbf7/registry-server/0.log"
Oct 09 16:50:33 crc kubenswrapper[4719]: I1009 16:50:33.388443 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vk29b_0ad43ecb-75f5-4453-89e5-2c7891c537a7/extract-utilities/0.log"
Oct 09 16:50:33 crc kubenswrapper[4719]: I1009 16:50:33.396690 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vk29b_0ad43ecb-75f5-4453-89e5-2c7891c537a7/extract-content/0.log"
Oct 09 16:50:33 crc kubenswrapper[4719]: I1009 16:50:33.427046 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vk29b_0ad43ecb-75f5-4453-89e5-2c7891c537a7/extract-content/0.log"
Oct 09 16:50:33 crc kubenswrapper[4719]: I1009 16:50:33.577845 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vk29b_0ad43ecb-75f5-4453-89e5-2c7891c537a7/extract-content/0.log"
Oct 09 16:50:33 crc kubenswrapper[4719]: I1009 16:50:33.584240 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vk29b_0ad43ecb-75f5-4453-89e5-2c7891c537a7/extract-utilities/0.log"
Oct 09 16:50:34 crc kubenswrapper[4719]: I1009 16:50:34.251474 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vk29b_0ad43ecb-75f5-4453-89e5-2c7891c537a7/registry-server/0.log"
Oct 09 16:50:41 crc kubenswrapper[4719]: I1009 16:50:41.161273 4719 scope.go:117] "RemoveContainer" containerID="d7d38bf7ba8f9d934644b1093a0a8aa65e0c062854c0b93f0aea2fed354e8199"
Oct 09 16:50:41 crc kubenswrapper[4719]: E1009 16:50:41.162050 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71"
Oct 09 16:50:45 crc kubenswrapper[4719]: I1009 16:50:45.100476 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-z6j76_50a5cc44-22d4-4ef1-a265-800ebc36afd4/prometheus-operator/0.log"
Oct 09 16:50:45 crc kubenswrapper[4719]: I1009 16:50:45.309951 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5bfc464c98-6px5r_db3811fa-89b7-44a6-94e8-92ca398d8d2c/prometheus-operator-admission-webhook/0.log"
Oct 09 16:50:45 crc kubenswrapper[4719]: I1009 16:50:45.315845 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5bfc464c98-2f97q_1ae73acd-0d93-4281-b807-4798a207506b/prometheus-operator-admission-webhook/0.log"
Oct 09 16:50:45 crc kubenswrapper[4719]: I1009 16:50:45.570862 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-kc8zw_4eb8b96a-c47f-424d-bcbc-20ff193b8d7f/perses-operator/0.log"
Oct 09 16:50:45 crc kubenswrapper[4719]: I1009 16:50:45.588173 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-ff5rt_efab9597-d673-43a0-bedd-f1ec483ae194/operator/0.log"
Oct 09 16:50:54 crc kubenswrapper[4719]: I1009 16:50:54.161034 4719 scope.go:117] "RemoveContainer" containerID="d7d38bf7ba8f9d934644b1093a0a8aa65e0c062854c0b93f0aea2fed354e8199"
Oct 09 16:50:54 crc kubenswrapper[4719]: E1009 16:50:54.162165 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71"
Oct 09 16:51:08 crc kubenswrapper[4719]: I1009 16:51:08.161888 4719 scope.go:117] "RemoveContainer" containerID="d7d38bf7ba8f9d934644b1093a0a8aa65e0c062854c0b93f0aea2fed354e8199"
Oct 09 16:51:08 crc kubenswrapper[4719]: I1009 16:51:08.469241 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerStarted","Data":"cb433a3bb385cb61f77433e06e45a3b88f2918e28fef2f3286d20b0b2ec3257e"}
Oct 09 16:52:38 crc kubenswrapper[4719]: I1009 16:52:38.378653 4719 generic.go:334] "Generic (PLEG): container finished" podID="3c04f13f-99d3-45a7-afdc-c8c75afe6737" containerID="008d4782160db9493449fe21d6a79efca2293ac2da9381ded8ac84c29fd0fb09" exitCode=0
Oct 09 16:52:38 crc kubenswrapper[4719]: I1009 16:52:38.378856 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2mzl/must-gather-pnnpc" event={"ID":"3c04f13f-99d3-45a7-afdc-c8c75afe6737","Type":"ContainerDied","Data":"008d4782160db9493449fe21d6a79efca2293ac2da9381ded8ac84c29fd0fb09"}
Oct 09 16:52:38 crc kubenswrapper[4719]: I1009 16:52:38.380036 4719 scope.go:117] "RemoveContainer" containerID="008d4782160db9493449fe21d6a79efca2293ac2da9381ded8ac84c29fd0fb09"
Oct 09 16:52:38 crc kubenswrapper[4719]: I1009 16:52:38.475582 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-d2mzl_must-gather-pnnpc_3c04f13f-99d3-45a7-afdc-c8c75afe6737/gather/0.log"
Oct 09 16:52:46 crc kubenswrapper[4719]: I1009 16:52:46.499744 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-d2mzl/must-gather-pnnpc"]
Oct 09 16:52:46 crc kubenswrapper[4719]: I1009 16:52:46.500742 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-d2mzl/must-gather-pnnpc" podUID="3c04f13f-99d3-45a7-afdc-c8c75afe6737" containerName="copy" containerID="cri-o://c7f39fecf4987a694e7120a018f1a5978607ead4f98d5eb4b8140ed893279338" gracePeriod=2
Oct 09 16:52:46 crc kubenswrapper[4719]: I1009 16:52:46.511519 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-d2mzl/must-gather-pnnpc"]
Oct 09 16:52:46 crc kubenswrapper[4719]: I1009 16:52:46.988210 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-d2mzl_must-gather-pnnpc_3c04f13f-99d3-45a7-afdc-c8c75afe6737/copy/0.log"
Oct 09 16:52:46 crc kubenswrapper[4719]: I1009 16:52:46.989009 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d2mzl/must-gather-pnnpc"
Oct 09 16:52:47 crc kubenswrapper[4719]: I1009 16:52:47.153211 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xsx2\" (UniqueName: \"kubernetes.io/projected/3c04f13f-99d3-45a7-afdc-c8c75afe6737-kube-api-access-8xsx2\") pod \"3c04f13f-99d3-45a7-afdc-c8c75afe6737\" (UID: \"3c04f13f-99d3-45a7-afdc-c8c75afe6737\") "
Oct 09 16:52:47 crc kubenswrapper[4719]: I1009 16:52:47.153805 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3c04f13f-99d3-45a7-afdc-c8c75afe6737-must-gather-output\") pod \"3c04f13f-99d3-45a7-afdc-c8c75afe6737\" (UID: \"3c04f13f-99d3-45a7-afdc-c8c75afe6737\") "
Oct 09 16:52:47 crc kubenswrapper[4719]: I1009 16:52:47.158930 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c04f13f-99d3-45a7-afdc-c8c75afe6737-kube-api-access-8xsx2" (OuterVolumeSpecName: "kube-api-access-8xsx2") pod "3c04f13f-99d3-45a7-afdc-c8c75afe6737" (UID: "3c04f13f-99d3-45a7-afdc-c8c75afe6737"). InnerVolumeSpecName "kube-api-access-8xsx2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 16:52:47 crc kubenswrapper[4719]: I1009 16:52:47.258213 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xsx2\" (UniqueName: \"kubernetes.io/projected/3c04f13f-99d3-45a7-afdc-c8c75afe6737-kube-api-access-8xsx2\") on node \"crc\" DevicePath \"\""
Oct 09 16:52:47 crc kubenswrapper[4719]: I1009 16:52:47.324181 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c04f13f-99d3-45a7-afdc-c8c75afe6737-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "3c04f13f-99d3-45a7-afdc-c8c75afe6737" (UID: "3c04f13f-99d3-45a7-afdc-c8c75afe6737"). InnerVolumeSpecName "must-gather-output".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:52:47 crc kubenswrapper[4719]: I1009 16:52:47.359932 4719 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3c04f13f-99d3-45a7-afdc-c8c75afe6737-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 09 16:52:47 crc kubenswrapper[4719]: I1009 16:52:47.464466 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-d2mzl_must-gather-pnnpc_3c04f13f-99d3-45a7-afdc-c8c75afe6737/copy/0.log" Oct 09 16:52:47 crc kubenswrapper[4719]: I1009 16:52:47.464935 4719 generic.go:334] "Generic (PLEG): container finished" podID="3c04f13f-99d3-45a7-afdc-c8c75afe6737" containerID="c7f39fecf4987a694e7120a018f1a5978607ead4f98d5eb4b8140ed893279338" exitCode=143 Oct 09 16:52:47 crc kubenswrapper[4719]: I1009 16:52:47.464984 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d2mzl/must-gather-pnnpc" Oct 09 16:52:47 crc kubenswrapper[4719]: I1009 16:52:47.465015 4719 scope.go:117] "RemoveContainer" containerID="c7f39fecf4987a694e7120a018f1a5978607ead4f98d5eb4b8140ed893279338" Oct 09 16:52:47 crc kubenswrapper[4719]: I1009 16:52:47.495892 4719 scope.go:117] "RemoveContainer" containerID="008d4782160db9493449fe21d6a79efca2293ac2da9381ded8ac84c29fd0fb09" Oct 09 16:52:47 crc kubenswrapper[4719]: I1009 16:52:47.559956 4719 scope.go:117] "RemoveContainer" containerID="c7f39fecf4987a694e7120a018f1a5978607ead4f98d5eb4b8140ed893279338" Oct 09 16:52:47 crc kubenswrapper[4719]: E1009 16:52:47.560410 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7f39fecf4987a694e7120a018f1a5978607ead4f98d5eb4b8140ed893279338\": container with ID starting with c7f39fecf4987a694e7120a018f1a5978607ead4f98d5eb4b8140ed893279338 not found: ID does not exist" 
containerID="c7f39fecf4987a694e7120a018f1a5978607ead4f98d5eb4b8140ed893279338" Oct 09 16:52:47 crc kubenswrapper[4719]: I1009 16:52:47.560447 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7f39fecf4987a694e7120a018f1a5978607ead4f98d5eb4b8140ed893279338"} err="failed to get container status \"c7f39fecf4987a694e7120a018f1a5978607ead4f98d5eb4b8140ed893279338\": rpc error: code = NotFound desc = could not find container \"c7f39fecf4987a694e7120a018f1a5978607ead4f98d5eb4b8140ed893279338\": container with ID starting with c7f39fecf4987a694e7120a018f1a5978607ead4f98d5eb4b8140ed893279338 not found: ID does not exist" Oct 09 16:52:47 crc kubenswrapper[4719]: I1009 16:52:47.560472 4719 scope.go:117] "RemoveContainer" containerID="008d4782160db9493449fe21d6a79efca2293ac2da9381ded8ac84c29fd0fb09" Oct 09 16:52:47 crc kubenswrapper[4719]: E1009 16:52:47.560699 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"008d4782160db9493449fe21d6a79efca2293ac2da9381ded8ac84c29fd0fb09\": container with ID starting with 008d4782160db9493449fe21d6a79efca2293ac2da9381ded8ac84c29fd0fb09 not found: ID does not exist" containerID="008d4782160db9493449fe21d6a79efca2293ac2da9381ded8ac84c29fd0fb09" Oct 09 16:52:47 crc kubenswrapper[4719]: I1009 16:52:47.560730 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"008d4782160db9493449fe21d6a79efca2293ac2da9381ded8ac84c29fd0fb09"} err="failed to get container status \"008d4782160db9493449fe21d6a79efca2293ac2da9381ded8ac84c29fd0fb09\": rpc error: code = NotFound desc = could not find container \"008d4782160db9493449fe21d6a79efca2293ac2da9381ded8ac84c29fd0fb09\": container with ID starting with 008d4782160db9493449fe21d6a79efca2293ac2da9381ded8ac84c29fd0fb09 not found: ID does not exist" Oct 09 16:52:49 crc kubenswrapper[4719]: I1009 16:52:49.172116 4719 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c04f13f-99d3-45a7-afdc-c8c75afe6737" path="/var/lib/kubelet/pods/3c04f13f-99d3-45a7-afdc-c8c75afe6737/volumes" Oct 09 16:52:53 crc kubenswrapper[4719]: I1009 16:52:53.742575 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dfxfv"] Oct 09 16:52:53 crc kubenswrapper[4719]: E1009 16:52:53.743738 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c04f13f-99d3-45a7-afdc-c8c75afe6737" containerName="gather" Oct 09 16:52:53 crc kubenswrapper[4719]: I1009 16:52:53.743757 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c04f13f-99d3-45a7-afdc-c8c75afe6737" containerName="gather" Oct 09 16:52:53 crc kubenswrapper[4719]: E1009 16:52:53.743787 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c04f13f-99d3-45a7-afdc-c8c75afe6737" containerName="copy" Oct 09 16:52:53 crc kubenswrapper[4719]: I1009 16:52:53.743796 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c04f13f-99d3-45a7-afdc-c8c75afe6737" containerName="copy" Oct 09 16:52:53 crc kubenswrapper[4719]: E1009 16:52:53.743822 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b03be8d2-2a64-413e-84bc-ea2dbb249d44" containerName="registry-server" Oct 09 16:52:53 crc kubenswrapper[4719]: I1009 16:52:53.743833 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="b03be8d2-2a64-413e-84bc-ea2dbb249d44" containerName="registry-server" Oct 09 16:52:53 crc kubenswrapper[4719]: E1009 16:52:53.743866 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b03be8d2-2a64-413e-84bc-ea2dbb249d44" containerName="extract-utilities" Oct 09 16:52:53 crc kubenswrapper[4719]: I1009 16:52:53.743883 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="b03be8d2-2a64-413e-84bc-ea2dbb249d44" containerName="extract-utilities" Oct 09 16:52:53 crc kubenswrapper[4719]: E1009 16:52:53.743916 4719 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="b03be8d2-2a64-413e-84bc-ea2dbb249d44" containerName="extract-content" Oct 09 16:52:53 crc kubenswrapper[4719]: I1009 16:52:53.743924 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="b03be8d2-2a64-413e-84bc-ea2dbb249d44" containerName="extract-content" Oct 09 16:52:53 crc kubenswrapper[4719]: I1009 16:52:53.744238 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c04f13f-99d3-45a7-afdc-c8c75afe6737" containerName="gather" Oct 09 16:52:53 crc kubenswrapper[4719]: I1009 16:52:53.744261 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="b03be8d2-2a64-413e-84bc-ea2dbb249d44" containerName="registry-server" Oct 09 16:52:53 crc kubenswrapper[4719]: I1009 16:52:53.744290 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c04f13f-99d3-45a7-afdc-c8c75afe6737" containerName="copy" Oct 09 16:52:53 crc kubenswrapper[4719]: I1009 16:52:53.746234 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dfxfv" Oct 09 16:52:53 crc kubenswrapper[4719]: I1009 16:52:53.760150 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dfxfv"] Oct 09 16:52:53 crc kubenswrapper[4719]: I1009 16:52:53.900295 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv4nc\" (UniqueName: \"kubernetes.io/projected/f300df4d-8af5-4100-a31c-7f3b2779abc6-kube-api-access-hv4nc\") pod \"redhat-marketplace-dfxfv\" (UID: \"f300df4d-8af5-4100-a31c-7f3b2779abc6\") " pod="openshift-marketplace/redhat-marketplace-dfxfv" Oct 09 16:52:53 crc kubenswrapper[4719]: I1009 16:52:53.900442 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f300df4d-8af5-4100-a31c-7f3b2779abc6-catalog-content\") pod \"redhat-marketplace-dfxfv\" (UID: \"f300df4d-8af5-4100-a31c-7f3b2779abc6\") " pod="openshift-marketplace/redhat-marketplace-dfxfv" Oct 09 16:52:53 crc kubenswrapper[4719]: I1009 16:52:53.900471 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f300df4d-8af5-4100-a31c-7f3b2779abc6-utilities\") pod \"redhat-marketplace-dfxfv\" (UID: \"f300df4d-8af5-4100-a31c-7f3b2779abc6\") " pod="openshift-marketplace/redhat-marketplace-dfxfv" Oct 09 16:52:54 crc kubenswrapper[4719]: I1009 16:52:54.002297 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f300df4d-8af5-4100-a31c-7f3b2779abc6-catalog-content\") pod \"redhat-marketplace-dfxfv\" (UID: \"f300df4d-8af5-4100-a31c-7f3b2779abc6\") " pod="openshift-marketplace/redhat-marketplace-dfxfv" Oct 09 16:52:54 crc kubenswrapper[4719]: I1009 16:52:54.002392 4719 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f300df4d-8af5-4100-a31c-7f3b2779abc6-utilities\") pod \"redhat-marketplace-dfxfv\" (UID: \"f300df4d-8af5-4100-a31c-7f3b2779abc6\") " pod="openshift-marketplace/redhat-marketplace-dfxfv" Oct 09 16:52:54 crc kubenswrapper[4719]: I1009 16:52:54.002544 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv4nc\" (UniqueName: \"kubernetes.io/projected/f300df4d-8af5-4100-a31c-7f3b2779abc6-kube-api-access-hv4nc\") pod \"redhat-marketplace-dfxfv\" (UID: \"f300df4d-8af5-4100-a31c-7f3b2779abc6\") " pod="openshift-marketplace/redhat-marketplace-dfxfv" Oct 09 16:52:54 crc kubenswrapper[4719]: I1009 16:52:54.003190 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f300df4d-8af5-4100-a31c-7f3b2779abc6-utilities\") pod \"redhat-marketplace-dfxfv\" (UID: \"f300df4d-8af5-4100-a31c-7f3b2779abc6\") " pod="openshift-marketplace/redhat-marketplace-dfxfv" Oct 09 16:52:54 crc kubenswrapper[4719]: I1009 16:52:54.003317 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f300df4d-8af5-4100-a31c-7f3b2779abc6-catalog-content\") pod \"redhat-marketplace-dfxfv\" (UID: \"f300df4d-8af5-4100-a31c-7f3b2779abc6\") " pod="openshift-marketplace/redhat-marketplace-dfxfv" Oct 09 16:52:54 crc kubenswrapper[4719]: I1009 16:52:54.030813 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv4nc\" (UniqueName: \"kubernetes.io/projected/f300df4d-8af5-4100-a31c-7f3b2779abc6-kube-api-access-hv4nc\") pod \"redhat-marketplace-dfxfv\" (UID: \"f300df4d-8af5-4100-a31c-7f3b2779abc6\") " pod="openshift-marketplace/redhat-marketplace-dfxfv" Oct 09 16:52:54 crc kubenswrapper[4719]: I1009 16:52:54.087648 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dfxfv" Oct 09 16:52:54 crc kubenswrapper[4719]: I1009 16:52:54.586308 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dfxfv"] Oct 09 16:52:55 crc kubenswrapper[4719]: I1009 16:52:55.543017 4719 generic.go:334] "Generic (PLEG): container finished" podID="f300df4d-8af5-4100-a31c-7f3b2779abc6" containerID="a7d304fb365a61b5cad8488dd1421e0c32b2da783bc95f236fdd22ac9b57ca10" exitCode=0 Oct 09 16:52:55 crc kubenswrapper[4719]: I1009 16:52:55.543099 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dfxfv" event={"ID":"f300df4d-8af5-4100-a31c-7f3b2779abc6","Type":"ContainerDied","Data":"a7d304fb365a61b5cad8488dd1421e0c32b2da783bc95f236fdd22ac9b57ca10"} Oct 09 16:52:55 crc kubenswrapper[4719]: I1009 16:52:55.543334 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dfxfv" event={"ID":"f300df4d-8af5-4100-a31c-7f3b2779abc6","Type":"ContainerStarted","Data":"2d7f008c45cc100ddd9da5ae0353714e46483caad729463280926c7c6bfd80af"} Oct 09 16:52:55 crc kubenswrapper[4719]: I1009 16:52:55.546196 4719 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 16:52:56 crc kubenswrapper[4719]: I1009 16:52:56.553959 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dfxfv" event={"ID":"f300df4d-8af5-4100-a31c-7f3b2779abc6","Type":"ContainerStarted","Data":"afc25d212c02d77fdea596edc98f21210cf2cb49fdd593017b21a6bba7094e67"} Oct 09 16:52:57 crc kubenswrapper[4719]: I1009 16:52:57.566673 4719 generic.go:334] "Generic (PLEG): container finished" podID="f300df4d-8af5-4100-a31c-7f3b2779abc6" containerID="afc25d212c02d77fdea596edc98f21210cf2cb49fdd593017b21a6bba7094e67" exitCode=0 Oct 09 16:52:57 crc kubenswrapper[4719]: I1009 16:52:57.566799 4719 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-dfxfv" event={"ID":"f300df4d-8af5-4100-a31c-7f3b2779abc6","Type":"ContainerDied","Data":"afc25d212c02d77fdea596edc98f21210cf2cb49fdd593017b21a6bba7094e67"} Oct 09 16:52:58 crc kubenswrapper[4719]: I1009 16:52:58.578075 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dfxfv" event={"ID":"f300df4d-8af5-4100-a31c-7f3b2779abc6","Type":"ContainerStarted","Data":"da096276eb7a5c0391c22e9dea3568b6a88da4edda1a212146e2e51e1a680491"} Oct 09 16:52:58 crc kubenswrapper[4719]: I1009 16:52:58.602035 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dfxfv" podStartSLOduration=3.116907118 podStartE2EDuration="5.602018413s" podCreationTimestamp="2025-10-09 16:52:53 +0000 UTC" firstStartedPulling="2025-10-09 16:52:55.54593171 +0000 UTC m=+5681.055642995" lastFinishedPulling="2025-10-09 16:52:58.031043005 +0000 UTC m=+5683.540754290" observedRunningTime="2025-10-09 16:52:58.593488671 +0000 UTC m=+5684.103199956" watchObservedRunningTime="2025-10-09 16:52:58.602018413 +0000 UTC m=+5684.111729698" Oct 09 16:53:04 crc kubenswrapper[4719]: I1009 16:53:04.088412 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dfxfv" Oct 09 16:53:04 crc kubenswrapper[4719]: I1009 16:53:04.088957 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dfxfv" Oct 09 16:53:04 crc kubenswrapper[4719]: I1009 16:53:04.136803 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dfxfv" Oct 09 16:53:04 crc kubenswrapper[4719]: I1009 16:53:04.695114 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dfxfv" Oct 09 16:53:04 crc kubenswrapper[4719]: I1009 16:53:04.748331 4719 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dfxfv"] Oct 09 16:53:06 crc kubenswrapper[4719]: I1009 16:53:06.663794 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dfxfv" podUID="f300df4d-8af5-4100-a31c-7f3b2779abc6" containerName="registry-server" containerID="cri-o://da096276eb7a5c0391c22e9dea3568b6a88da4edda1a212146e2e51e1a680491" gracePeriod=2 Oct 09 16:53:07 crc kubenswrapper[4719]: I1009 16:53:07.186870 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dfxfv" Oct 09 16:53:07 crc kubenswrapper[4719]: I1009 16:53:07.277515 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f300df4d-8af5-4100-a31c-7f3b2779abc6-utilities\") pod \"f300df4d-8af5-4100-a31c-7f3b2779abc6\" (UID: \"f300df4d-8af5-4100-a31c-7f3b2779abc6\") " Oct 09 16:53:07 crc kubenswrapper[4719]: I1009 16:53:07.277744 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f300df4d-8af5-4100-a31c-7f3b2779abc6-catalog-content\") pod \"f300df4d-8af5-4100-a31c-7f3b2779abc6\" (UID: \"f300df4d-8af5-4100-a31c-7f3b2779abc6\") " Oct 09 16:53:07 crc kubenswrapper[4719]: I1009 16:53:07.277872 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hv4nc\" (UniqueName: \"kubernetes.io/projected/f300df4d-8af5-4100-a31c-7f3b2779abc6-kube-api-access-hv4nc\") pod \"f300df4d-8af5-4100-a31c-7f3b2779abc6\" (UID: \"f300df4d-8af5-4100-a31c-7f3b2779abc6\") " Oct 09 16:53:07 crc kubenswrapper[4719]: I1009 16:53:07.278428 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f300df4d-8af5-4100-a31c-7f3b2779abc6-utilities" (OuterVolumeSpecName: "utilities") pod 
"f300df4d-8af5-4100-a31c-7f3b2779abc6" (UID: "f300df4d-8af5-4100-a31c-7f3b2779abc6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:53:07 crc kubenswrapper[4719]: I1009 16:53:07.278812 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f300df4d-8af5-4100-a31c-7f3b2779abc6-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 16:53:07 crc kubenswrapper[4719]: I1009 16:53:07.283495 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f300df4d-8af5-4100-a31c-7f3b2779abc6-kube-api-access-hv4nc" (OuterVolumeSpecName: "kube-api-access-hv4nc") pod "f300df4d-8af5-4100-a31c-7f3b2779abc6" (UID: "f300df4d-8af5-4100-a31c-7f3b2779abc6"). InnerVolumeSpecName "kube-api-access-hv4nc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 16:53:07 crc kubenswrapper[4719]: I1009 16:53:07.294598 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f300df4d-8af5-4100-a31c-7f3b2779abc6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f300df4d-8af5-4100-a31c-7f3b2779abc6" (UID: "f300df4d-8af5-4100-a31c-7f3b2779abc6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:53:07 crc kubenswrapper[4719]: I1009 16:53:07.380563 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f300df4d-8af5-4100-a31c-7f3b2779abc6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 16:53:07 crc kubenswrapper[4719]: I1009 16:53:07.380607 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hv4nc\" (UniqueName: \"kubernetes.io/projected/f300df4d-8af5-4100-a31c-7f3b2779abc6-kube-api-access-hv4nc\") on node \"crc\" DevicePath \"\"" Oct 09 16:53:07 crc kubenswrapper[4719]: I1009 16:53:07.674881 4719 generic.go:334] "Generic (PLEG): container finished" podID="f300df4d-8af5-4100-a31c-7f3b2779abc6" containerID="da096276eb7a5c0391c22e9dea3568b6a88da4edda1a212146e2e51e1a680491" exitCode=0 Oct 09 16:53:07 crc kubenswrapper[4719]: I1009 16:53:07.674929 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dfxfv" event={"ID":"f300df4d-8af5-4100-a31c-7f3b2779abc6","Type":"ContainerDied","Data":"da096276eb7a5c0391c22e9dea3568b6a88da4edda1a212146e2e51e1a680491"} Oct 09 16:53:07 crc kubenswrapper[4719]: I1009 16:53:07.674959 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dfxfv" event={"ID":"f300df4d-8af5-4100-a31c-7f3b2779abc6","Type":"ContainerDied","Data":"2d7f008c45cc100ddd9da5ae0353714e46483caad729463280926c7c6bfd80af"} Oct 09 16:53:07 crc kubenswrapper[4719]: I1009 16:53:07.674978 4719 scope.go:117] "RemoveContainer" containerID="da096276eb7a5c0391c22e9dea3568b6a88da4edda1a212146e2e51e1a680491" Oct 09 16:53:07 crc kubenswrapper[4719]: I1009 16:53:07.675103 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dfxfv" Oct 09 16:53:07 crc kubenswrapper[4719]: I1009 16:53:07.702043 4719 scope.go:117] "RemoveContainer" containerID="afc25d212c02d77fdea596edc98f21210cf2cb49fdd593017b21a6bba7094e67" Oct 09 16:53:07 crc kubenswrapper[4719]: I1009 16:53:07.712788 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dfxfv"] Oct 09 16:53:07 crc kubenswrapper[4719]: I1009 16:53:07.725843 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dfxfv"] Oct 09 16:53:07 crc kubenswrapper[4719]: I1009 16:53:07.747067 4719 scope.go:117] "RemoveContainer" containerID="a7d304fb365a61b5cad8488dd1421e0c32b2da783bc95f236fdd22ac9b57ca10" Oct 09 16:53:07 crc kubenswrapper[4719]: I1009 16:53:07.789313 4719 scope.go:117] "RemoveContainer" containerID="da096276eb7a5c0391c22e9dea3568b6a88da4edda1a212146e2e51e1a680491" Oct 09 16:53:07 crc kubenswrapper[4719]: E1009 16:53:07.790078 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da096276eb7a5c0391c22e9dea3568b6a88da4edda1a212146e2e51e1a680491\": container with ID starting with da096276eb7a5c0391c22e9dea3568b6a88da4edda1a212146e2e51e1a680491 not found: ID does not exist" containerID="da096276eb7a5c0391c22e9dea3568b6a88da4edda1a212146e2e51e1a680491" Oct 09 16:53:07 crc kubenswrapper[4719]: I1009 16:53:07.790123 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da096276eb7a5c0391c22e9dea3568b6a88da4edda1a212146e2e51e1a680491"} err="failed to get container status \"da096276eb7a5c0391c22e9dea3568b6a88da4edda1a212146e2e51e1a680491\": rpc error: code = NotFound desc = could not find container \"da096276eb7a5c0391c22e9dea3568b6a88da4edda1a212146e2e51e1a680491\": container with ID starting with da096276eb7a5c0391c22e9dea3568b6a88da4edda1a212146e2e51e1a680491 not found: 
ID does not exist" Oct 09 16:53:07 crc kubenswrapper[4719]: I1009 16:53:07.790151 4719 scope.go:117] "RemoveContainer" containerID="afc25d212c02d77fdea596edc98f21210cf2cb49fdd593017b21a6bba7094e67" Oct 09 16:53:07 crc kubenswrapper[4719]: E1009 16:53:07.790506 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afc25d212c02d77fdea596edc98f21210cf2cb49fdd593017b21a6bba7094e67\": container with ID starting with afc25d212c02d77fdea596edc98f21210cf2cb49fdd593017b21a6bba7094e67 not found: ID does not exist" containerID="afc25d212c02d77fdea596edc98f21210cf2cb49fdd593017b21a6bba7094e67" Oct 09 16:53:07 crc kubenswrapper[4719]: I1009 16:53:07.790546 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afc25d212c02d77fdea596edc98f21210cf2cb49fdd593017b21a6bba7094e67"} err="failed to get container status \"afc25d212c02d77fdea596edc98f21210cf2cb49fdd593017b21a6bba7094e67\": rpc error: code = NotFound desc = could not find container \"afc25d212c02d77fdea596edc98f21210cf2cb49fdd593017b21a6bba7094e67\": container with ID starting with afc25d212c02d77fdea596edc98f21210cf2cb49fdd593017b21a6bba7094e67 not found: ID does not exist" Oct 09 16:53:07 crc kubenswrapper[4719]: I1009 16:53:07.790564 4719 scope.go:117] "RemoveContainer" containerID="a7d304fb365a61b5cad8488dd1421e0c32b2da783bc95f236fdd22ac9b57ca10" Oct 09 16:53:07 crc kubenswrapper[4719]: E1009 16:53:07.790861 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7d304fb365a61b5cad8488dd1421e0c32b2da783bc95f236fdd22ac9b57ca10\": container with ID starting with a7d304fb365a61b5cad8488dd1421e0c32b2da783bc95f236fdd22ac9b57ca10 not found: ID does not exist" containerID="a7d304fb365a61b5cad8488dd1421e0c32b2da783bc95f236fdd22ac9b57ca10" Oct 09 16:53:07 crc kubenswrapper[4719]: I1009 16:53:07.790954 4719 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7d304fb365a61b5cad8488dd1421e0c32b2da783bc95f236fdd22ac9b57ca10"} err="failed to get container status \"a7d304fb365a61b5cad8488dd1421e0c32b2da783bc95f236fdd22ac9b57ca10\": rpc error: code = NotFound desc = could not find container \"a7d304fb365a61b5cad8488dd1421e0c32b2da783bc95f236fdd22ac9b57ca10\": container with ID starting with a7d304fb365a61b5cad8488dd1421e0c32b2da783bc95f236fdd22ac9b57ca10 not found: ID does not exist" Oct 09 16:53:09 crc kubenswrapper[4719]: I1009 16:53:09.172727 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f300df4d-8af5-4100-a31c-7f3b2779abc6" path="/var/lib/kubelet/pods/f300df4d-8af5-4100-a31c-7f3b2779abc6/volumes" Oct 09 16:53:28 crc kubenswrapper[4719]: I1009 16:53:28.437613 4719 scope.go:117] "RemoveContainer" containerID="23c115c5b8b865c5806d9934b3eb59b4140b95599ec19e55b09e8127e2c7d639" Oct 09 16:53:33 crc kubenswrapper[4719]: I1009 16:53:33.896955 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tcx8h/must-gather-f8j9c"] Oct 09 16:53:33 crc kubenswrapper[4719]: E1009 16:53:33.898237 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f300df4d-8af5-4100-a31c-7f3b2779abc6" containerName="extract-utilities" Oct 09 16:53:33 crc kubenswrapper[4719]: I1009 16:53:33.898257 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="f300df4d-8af5-4100-a31c-7f3b2779abc6" containerName="extract-utilities" Oct 09 16:53:33 crc kubenswrapper[4719]: E1009 16:53:33.898277 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f300df4d-8af5-4100-a31c-7f3b2779abc6" containerName="extract-content" Oct 09 16:53:33 crc kubenswrapper[4719]: I1009 16:53:33.898285 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="f300df4d-8af5-4100-a31c-7f3b2779abc6" containerName="extract-content" Oct 09 16:53:33 crc kubenswrapper[4719]: E1009 16:53:33.898304 4719 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="f300df4d-8af5-4100-a31c-7f3b2779abc6" containerName="registry-server" Oct 09 16:53:33 crc kubenswrapper[4719]: I1009 16:53:33.898312 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="f300df4d-8af5-4100-a31c-7f3b2779abc6" containerName="registry-server" Oct 09 16:53:33 crc kubenswrapper[4719]: I1009 16:53:33.898570 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="f300df4d-8af5-4100-a31c-7f3b2779abc6" containerName="registry-server" Oct 09 16:53:33 crc kubenswrapper[4719]: I1009 16:53:33.899917 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tcx8h/must-gather-f8j9c" Oct 09 16:53:33 crc kubenswrapper[4719]: I1009 16:53:33.903568 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-tcx8h"/"openshift-service-ca.crt" Oct 09 16:53:33 crc kubenswrapper[4719]: I1009 16:53:33.903928 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-tcx8h"/"kube-root-ca.crt" Oct 09 16:53:33 crc kubenswrapper[4719]: I1009 16:53:33.932175 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tcx8h/must-gather-f8j9c"] Oct 09 16:53:33 crc kubenswrapper[4719]: I1009 16:53:33.956901 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/86f1c527-f139-431a-ba26-ad6bdc2adefb-must-gather-output\") pod \"must-gather-f8j9c\" (UID: \"86f1c527-f139-431a-ba26-ad6bdc2adefb\") " pod="openshift-must-gather-tcx8h/must-gather-f8j9c" Oct 09 16:53:33 crc kubenswrapper[4719]: I1009 16:53:33.957106 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnn8p\" (UniqueName: \"kubernetes.io/projected/86f1c527-f139-431a-ba26-ad6bdc2adefb-kube-api-access-cnn8p\") pod \"must-gather-f8j9c\" (UID: 
\"86f1c527-f139-431a-ba26-ad6bdc2adefb\") " pod="openshift-must-gather-tcx8h/must-gather-f8j9c" Oct 09 16:53:34 crc kubenswrapper[4719]: I1009 16:53:34.060325 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/86f1c527-f139-431a-ba26-ad6bdc2adefb-must-gather-output\") pod \"must-gather-f8j9c\" (UID: \"86f1c527-f139-431a-ba26-ad6bdc2adefb\") " pod="openshift-must-gather-tcx8h/must-gather-f8j9c" Oct 09 16:53:34 crc kubenswrapper[4719]: I1009 16:53:34.060470 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnn8p\" (UniqueName: \"kubernetes.io/projected/86f1c527-f139-431a-ba26-ad6bdc2adefb-kube-api-access-cnn8p\") pod \"must-gather-f8j9c\" (UID: \"86f1c527-f139-431a-ba26-ad6bdc2adefb\") " pod="openshift-must-gather-tcx8h/must-gather-f8j9c" Oct 09 16:53:34 crc kubenswrapper[4719]: I1009 16:53:34.060977 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/86f1c527-f139-431a-ba26-ad6bdc2adefb-must-gather-output\") pod \"must-gather-f8j9c\" (UID: \"86f1c527-f139-431a-ba26-ad6bdc2adefb\") " pod="openshift-must-gather-tcx8h/must-gather-f8j9c" Oct 09 16:53:34 crc kubenswrapper[4719]: I1009 16:53:34.082804 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnn8p\" (UniqueName: \"kubernetes.io/projected/86f1c527-f139-431a-ba26-ad6bdc2adefb-kube-api-access-cnn8p\") pod \"must-gather-f8j9c\" (UID: \"86f1c527-f139-431a-ba26-ad6bdc2adefb\") " pod="openshift-must-gather-tcx8h/must-gather-f8j9c" Oct 09 16:53:34 crc kubenswrapper[4719]: I1009 16:53:34.224072 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tcx8h/must-gather-f8j9c" Oct 09 16:53:34 crc kubenswrapper[4719]: I1009 16:53:34.727112 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tcx8h/must-gather-f8j9c"] Oct 09 16:53:34 crc kubenswrapper[4719]: I1009 16:53:34.931027 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tcx8h/must-gather-f8j9c" event={"ID":"86f1c527-f139-431a-ba26-ad6bdc2adefb","Type":"ContainerStarted","Data":"709b8788107a5097b3e731d9a263b8e632160e121a1e8d0c075269281f63a184"} Oct 09 16:53:35 crc kubenswrapper[4719]: I1009 16:53:35.940077 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tcx8h/must-gather-f8j9c" event={"ID":"86f1c527-f139-431a-ba26-ad6bdc2adefb","Type":"ContainerStarted","Data":"e994fff5bdc677790c92bc69e21439b8ee8442aaf37eeb5c6059f1198ab58162"} Oct 09 16:53:35 crc kubenswrapper[4719]: I1009 16:53:35.940393 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tcx8h/must-gather-f8j9c" event={"ID":"86f1c527-f139-431a-ba26-ad6bdc2adefb","Type":"ContainerStarted","Data":"a1d984721b5db8deb4ca01d5bb3c88f4b40ef38da180b1f1f8f0a013f5794d8b"} Oct 09 16:53:35 crc kubenswrapper[4719]: I1009 16:53:35.962392 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tcx8h/must-gather-f8j9c" podStartSLOduration=2.962371577 podStartE2EDuration="2.962371577s" podCreationTimestamp="2025-10-09 16:53:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 16:53:35.959617479 +0000 UTC m=+5721.469328774" watchObservedRunningTime="2025-10-09 16:53:35.962371577 +0000 UTC m=+5721.472082862" Oct 09 16:53:36 crc kubenswrapper[4719]: I1009 16:53:36.976748 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 16:53:36 crc kubenswrapper[4719]: I1009 16:53:36.980573 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 16:53:38 crc kubenswrapper[4719]: I1009 16:53:38.989977 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tcx8h/crc-debug-xvqnk"] Oct 09 16:53:38 crc kubenswrapper[4719]: I1009 16:53:38.991863 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tcx8h/crc-debug-xvqnk" Oct 09 16:53:38 crc kubenswrapper[4719]: I1009 16:53:38.993718 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-tcx8h"/"default-dockercfg-ssw7t" Oct 09 16:53:39 crc kubenswrapper[4719]: I1009 16:53:39.074030 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a1240174-4af8-4feb-9ed9-79d76537c172-host\") pod \"crc-debug-xvqnk\" (UID: \"a1240174-4af8-4feb-9ed9-79d76537c172\") " pod="openshift-must-gather-tcx8h/crc-debug-xvqnk" Oct 09 16:53:39 crc kubenswrapper[4719]: I1009 16:53:39.074093 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc22s\" (UniqueName: \"kubernetes.io/projected/a1240174-4af8-4feb-9ed9-79d76537c172-kube-api-access-zc22s\") pod \"crc-debug-xvqnk\" (UID: \"a1240174-4af8-4feb-9ed9-79d76537c172\") " pod="openshift-must-gather-tcx8h/crc-debug-xvqnk" Oct 09 16:53:39 crc kubenswrapper[4719]: I1009 16:53:39.176546 4719 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-zc22s\" (UniqueName: \"kubernetes.io/projected/a1240174-4af8-4feb-9ed9-79d76537c172-kube-api-access-zc22s\") pod \"crc-debug-xvqnk\" (UID: \"a1240174-4af8-4feb-9ed9-79d76537c172\") " pod="openshift-must-gather-tcx8h/crc-debug-xvqnk" Oct 09 16:53:39 crc kubenswrapper[4719]: I1009 16:53:39.176774 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a1240174-4af8-4feb-9ed9-79d76537c172-host\") pod \"crc-debug-xvqnk\" (UID: \"a1240174-4af8-4feb-9ed9-79d76537c172\") " pod="openshift-must-gather-tcx8h/crc-debug-xvqnk" Oct 09 16:53:39 crc kubenswrapper[4719]: I1009 16:53:39.176862 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a1240174-4af8-4feb-9ed9-79d76537c172-host\") pod \"crc-debug-xvqnk\" (UID: \"a1240174-4af8-4feb-9ed9-79d76537c172\") " pod="openshift-must-gather-tcx8h/crc-debug-xvqnk" Oct 09 16:53:39 crc kubenswrapper[4719]: I1009 16:53:39.196514 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc22s\" (UniqueName: \"kubernetes.io/projected/a1240174-4af8-4feb-9ed9-79d76537c172-kube-api-access-zc22s\") pod \"crc-debug-xvqnk\" (UID: \"a1240174-4af8-4feb-9ed9-79d76537c172\") " pod="openshift-must-gather-tcx8h/crc-debug-xvqnk" Oct 09 16:53:39 crc kubenswrapper[4719]: I1009 16:53:39.309395 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tcx8h/crc-debug-xvqnk" Oct 09 16:53:39 crc kubenswrapper[4719]: I1009 16:53:39.975910 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tcx8h/crc-debug-xvqnk" event={"ID":"a1240174-4af8-4feb-9ed9-79d76537c172","Type":"ContainerStarted","Data":"c4d100e05f09736d5e4dcb0fe7569d9dcd1504e3a9da744b3ac0bcb00f7d76d5"} Oct 09 16:53:39 crc kubenswrapper[4719]: I1009 16:53:39.976582 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tcx8h/crc-debug-xvqnk" event={"ID":"a1240174-4af8-4feb-9ed9-79d76537c172","Type":"ContainerStarted","Data":"b85a3956b9c9edf69ade2331341d15d2884162c1ca42c79db66cadbf2f9a8840"} Oct 09 16:53:39 crc kubenswrapper[4719]: I1009 16:53:39.996525 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tcx8h/crc-debug-xvqnk" podStartSLOduration=1.9965049540000002 podStartE2EDuration="1.996504954s" podCreationTimestamp="2025-10-09 16:53:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 16:53:39.995408449 +0000 UTC m=+5725.505119744" watchObservedRunningTime="2025-10-09 16:53:39.996504954 +0000 UTC m=+5725.506216229" Oct 09 16:54:06 crc kubenswrapper[4719]: I1009 16:54:06.976803 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 16:54:06 crc kubenswrapper[4719]: I1009 16:54:06.977442 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Oct 09 16:54:19 crc kubenswrapper[4719]: I1009 16:54:19.366249 4719 generic.go:334] "Generic (PLEG): container finished" podID="a1240174-4af8-4feb-9ed9-79d76537c172" containerID="c4d100e05f09736d5e4dcb0fe7569d9dcd1504e3a9da744b3ac0bcb00f7d76d5" exitCode=0 Oct 09 16:54:19 crc kubenswrapper[4719]: I1009 16:54:19.366364 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tcx8h/crc-debug-xvqnk" event={"ID":"a1240174-4af8-4feb-9ed9-79d76537c172","Type":"ContainerDied","Data":"c4d100e05f09736d5e4dcb0fe7569d9dcd1504e3a9da744b3ac0bcb00f7d76d5"} Oct 09 16:54:20 crc kubenswrapper[4719]: I1009 16:54:20.507278 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tcx8h/crc-debug-xvqnk" Oct 09 16:54:20 crc kubenswrapper[4719]: I1009 16:54:20.547203 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-tcx8h/crc-debug-xvqnk"] Oct 09 16:54:20 crc kubenswrapper[4719]: I1009 16:54:20.557961 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-tcx8h/crc-debug-xvqnk"] Oct 09 16:54:20 crc kubenswrapper[4719]: I1009 16:54:20.633584 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a1240174-4af8-4feb-9ed9-79d76537c172-host\") pod \"a1240174-4af8-4feb-9ed9-79d76537c172\" (UID: \"a1240174-4af8-4feb-9ed9-79d76537c172\") " Oct 09 16:54:20 crc kubenswrapper[4719]: I1009 16:54:20.633644 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zc22s\" (UniqueName: \"kubernetes.io/projected/a1240174-4af8-4feb-9ed9-79d76537c172-kube-api-access-zc22s\") pod \"a1240174-4af8-4feb-9ed9-79d76537c172\" (UID: \"a1240174-4af8-4feb-9ed9-79d76537c172\") " Oct 09 16:54:20 crc kubenswrapper[4719]: I1009 16:54:20.633727 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/a1240174-4af8-4feb-9ed9-79d76537c172-host" (OuterVolumeSpecName: "host") pod "a1240174-4af8-4feb-9ed9-79d76537c172" (UID: "a1240174-4af8-4feb-9ed9-79d76537c172"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 16:54:20 crc kubenswrapper[4719]: I1009 16:54:20.634211 4719 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a1240174-4af8-4feb-9ed9-79d76537c172-host\") on node \"crc\" DevicePath \"\"" Oct 09 16:54:20 crc kubenswrapper[4719]: I1009 16:54:20.639394 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1240174-4af8-4feb-9ed9-79d76537c172-kube-api-access-zc22s" (OuterVolumeSpecName: "kube-api-access-zc22s") pod "a1240174-4af8-4feb-9ed9-79d76537c172" (UID: "a1240174-4af8-4feb-9ed9-79d76537c172"). InnerVolumeSpecName "kube-api-access-zc22s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 16:54:20 crc kubenswrapper[4719]: I1009 16:54:20.736516 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zc22s\" (UniqueName: \"kubernetes.io/projected/a1240174-4af8-4feb-9ed9-79d76537c172-kube-api-access-zc22s\") on node \"crc\" DevicePath \"\"" Oct 09 16:54:21 crc kubenswrapper[4719]: I1009 16:54:21.172535 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1240174-4af8-4feb-9ed9-79d76537c172" path="/var/lib/kubelet/pods/a1240174-4af8-4feb-9ed9-79d76537c172/volumes" Oct 09 16:54:21 crc kubenswrapper[4719]: I1009 16:54:21.390290 4719 scope.go:117] "RemoveContainer" containerID="c4d100e05f09736d5e4dcb0fe7569d9dcd1504e3a9da744b3ac0bcb00f7d76d5" Oct 09 16:54:21 crc kubenswrapper[4719]: I1009 16:54:21.390331 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tcx8h/crc-debug-xvqnk" Oct 09 16:54:21 crc kubenswrapper[4719]: I1009 16:54:21.712947 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tcx8h/crc-debug-dkh4q"] Oct 09 16:54:21 crc kubenswrapper[4719]: E1009 16:54:21.713680 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1240174-4af8-4feb-9ed9-79d76537c172" containerName="container-00" Oct 09 16:54:21 crc kubenswrapper[4719]: I1009 16:54:21.713693 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1240174-4af8-4feb-9ed9-79d76537c172" containerName="container-00" Oct 09 16:54:21 crc kubenswrapper[4719]: I1009 16:54:21.713967 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1240174-4af8-4feb-9ed9-79d76537c172" containerName="container-00" Oct 09 16:54:21 crc kubenswrapper[4719]: I1009 16:54:21.714668 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tcx8h/crc-debug-dkh4q" Oct 09 16:54:21 crc kubenswrapper[4719]: I1009 16:54:21.716658 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-tcx8h"/"default-dockercfg-ssw7t" Oct 09 16:54:21 crc kubenswrapper[4719]: I1009 16:54:21.857513 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6flb\" (UniqueName: \"kubernetes.io/projected/11b20235-4190-4026-bc8d-59c16f8c9715-kube-api-access-c6flb\") pod \"crc-debug-dkh4q\" (UID: \"11b20235-4190-4026-bc8d-59c16f8c9715\") " pod="openshift-must-gather-tcx8h/crc-debug-dkh4q" Oct 09 16:54:21 crc kubenswrapper[4719]: I1009 16:54:21.857686 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/11b20235-4190-4026-bc8d-59c16f8c9715-host\") pod \"crc-debug-dkh4q\" (UID: \"11b20235-4190-4026-bc8d-59c16f8c9715\") " 
pod="openshift-must-gather-tcx8h/crc-debug-dkh4q" Oct 09 16:54:21 crc kubenswrapper[4719]: I1009 16:54:21.959499 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6flb\" (UniqueName: \"kubernetes.io/projected/11b20235-4190-4026-bc8d-59c16f8c9715-kube-api-access-c6flb\") pod \"crc-debug-dkh4q\" (UID: \"11b20235-4190-4026-bc8d-59c16f8c9715\") " pod="openshift-must-gather-tcx8h/crc-debug-dkh4q" Oct 09 16:54:21 crc kubenswrapper[4719]: I1009 16:54:21.959673 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/11b20235-4190-4026-bc8d-59c16f8c9715-host\") pod \"crc-debug-dkh4q\" (UID: \"11b20235-4190-4026-bc8d-59c16f8c9715\") " pod="openshift-must-gather-tcx8h/crc-debug-dkh4q" Oct 09 16:54:21 crc kubenswrapper[4719]: I1009 16:54:21.959792 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/11b20235-4190-4026-bc8d-59c16f8c9715-host\") pod \"crc-debug-dkh4q\" (UID: \"11b20235-4190-4026-bc8d-59c16f8c9715\") " pod="openshift-must-gather-tcx8h/crc-debug-dkh4q" Oct 09 16:54:21 crc kubenswrapper[4719]: I1009 16:54:21.980120 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6flb\" (UniqueName: \"kubernetes.io/projected/11b20235-4190-4026-bc8d-59c16f8c9715-kube-api-access-c6flb\") pod \"crc-debug-dkh4q\" (UID: \"11b20235-4190-4026-bc8d-59c16f8c9715\") " pod="openshift-must-gather-tcx8h/crc-debug-dkh4q" Oct 09 16:54:22 crc kubenswrapper[4719]: I1009 16:54:22.031874 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tcx8h/crc-debug-dkh4q" Oct 09 16:54:22 crc kubenswrapper[4719]: I1009 16:54:22.425934 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tcx8h/crc-debug-dkh4q" event={"ID":"11b20235-4190-4026-bc8d-59c16f8c9715","Type":"ContainerStarted","Data":"727fb0b38ff655544391d4a197df0e79ffb279d26207aeab24bcd8bb20ed9e84"} Oct 09 16:54:22 crc kubenswrapper[4719]: I1009 16:54:22.426002 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tcx8h/crc-debug-dkh4q" event={"ID":"11b20235-4190-4026-bc8d-59c16f8c9715","Type":"ContainerStarted","Data":"c575768b849e444a7747e5ccf5202756647fa81f287115ce341f24775151b3b4"} Oct 09 16:54:22 crc kubenswrapper[4719]: I1009 16:54:22.457847 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tcx8h/crc-debug-dkh4q" podStartSLOduration=1.457823502 podStartE2EDuration="1.457823502s" podCreationTimestamp="2025-10-09 16:54:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 16:54:22.439940942 +0000 UTC m=+5767.949652227" watchObservedRunningTime="2025-10-09 16:54:22.457823502 +0000 UTC m=+5767.967534787" Oct 09 16:54:23 crc kubenswrapper[4719]: I1009 16:54:23.447472 4719 generic.go:334] "Generic (PLEG): container finished" podID="11b20235-4190-4026-bc8d-59c16f8c9715" containerID="727fb0b38ff655544391d4a197df0e79ffb279d26207aeab24bcd8bb20ed9e84" exitCode=0 Oct 09 16:54:23 crc kubenswrapper[4719]: I1009 16:54:23.447603 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tcx8h/crc-debug-dkh4q" event={"ID":"11b20235-4190-4026-bc8d-59c16f8c9715","Type":"ContainerDied","Data":"727fb0b38ff655544391d4a197df0e79ffb279d26207aeab24bcd8bb20ed9e84"} Oct 09 16:54:24 crc kubenswrapper[4719]: I1009 16:54:24.569840 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tcx8h/crc-debug-dkh4q" Oct 09 16:54:24 crc kubenswrapper[4719]: I1009 16:54:24.621483 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-tcx8h/crc-debug-dkh4q"] Oct 09 16:54:24 crc kubenswrapper[4719]: I1009 16:54:24.629712 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-tcx8h/crc-debug-dkh4q"] Oct 09 16:54:24 crc kubenswrapper[4719]: I1009 16:54:24.712941 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6flb\" (UniqueName: \"kubernetes.io/projected/11b20235-4190-4026-bc8d-59c16f8c9715-kube-api-access-c6flb\") pod \"11b20235-4190-4026-bc8d-59c16f8c9715\" (UID: \"11b20235-4190-4026-bc8d-59c16f8c9715\") " Oct 09 16:54:24 crc kubenswrapper[4719]: I1009 16:54:24.713375 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/11b20235-4190-4026-bc8d-59c16f8c9715-host\") pod \"11b20235-4190-4026-bc8d-59c16f8c9715\" (UID: \"11b20235-4190-4026-bc8d-59c16f8c9715\") " Oct 09 16:54:24 crc kubenswrapper[4719]: I1009 16:54:24.713494 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/11b20235-4190-4026-bc8d-59c16f8c9715-host" (OuterVolumeSpecName: "host") pod "11b20235-4190-4026-bc8d-59c16f8c9715" (UID: "11b20235-4190-4026-bc8d-59c16f8c9715"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 16:54:24 crc kubenswrapper[4719]: I1009 16:54:24.714192 4719 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/11b20235-4190-4026-bc8d-59c16f8c9715-host\") on node \"crc\" DevicePath \"\"" Oct 09 16:54:24 crc kubenswrapper[4719]: I1009 16:54:24.730559 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11b20235-4190-4026-bc8d-59c16f8c9715-kube-api-access-c6flb" (OuterVolumeSpecName: "kube-api-access-c6flb") pod "11b20235-4190-4026-bc8d-59c16f8c9715" (UID: "11b20235-4190-4026-bc8d-59c16f8c9715"). InnerVolumeSpecName "kube-api-access-c6flb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 16:54:24 crc kubenswrapper[4719]: I1009 16:54:24.816663 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6flb\" (UniqueName: \"kubernetes.io/projected/11b20235-4190-4026-bc8d-59c16f8c9715-kube-api-access-c6flb\") on node \"crc\" DevicePath \"\"" Oct 09 16:54:25 crc kubenswrapper[4719]: I1009 16:54:25.201216 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11b20235-4190-4026-bc8d-59c16f8c9715" path="/var/lib/kubelet/pods/11b20235-4190-4026-bc8d-59c16f8c9715/volumes" Oct 09 16:54:25 crc kubenswrapper[4719]: I1009 16:54:25.467228 4719 scope.go:117] "RemoveContainer" containerID="727fb0b38ff655544391d4a197df0e79ffb279d26207aeab24bcd8bb20ed9e84" Oct 09 16:54:25 crc kubenswrapper[4719]: I1009 16:54:25.467260 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tcx8h/crc-debug-dkh4q" Oct 09 16:54:25 crc kubenswrapper[4719]: I1009 16:54:25.792419 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tcx8h/crc-debug-l2vj8"] Oct 09 16:54:25 crc kubenswrapper[4719]: E1009 16:54:25.792863 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11b20235-4190-4026-bc8d-59c16f8c9715" containerName="container-00" Oct 09 16:54:25 crc kubenswrapper[4719]: I1009 16:54:25.792874 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="11b20235-4190-4026-bc8d-59c16f8c9715" containerName="container-00" Oct 09 16:54:25 crc kubenswrapper[4719]: I1009 16:54:25.793087 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="11b20235-4190-4026-bc8d-59c16f8c9715" containerName="container-00" Oct 09 16:54:25 crc kubenswrapper[4719]: I1009 16:54:25.793757 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tcx8h/crc-debug-l2vj8" Oct 09 16:54:25 crc kubenswrapper[4719]: I1009 16:54:25.795812 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-tcx8h"/"default-dockercfg-ssw7t" Oct 09 16:54:25 crc kubenswrapper[4719]: I1009 16:54:25.836857 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4a651056-92ee-471d-b978-0fff9a179a8f-host\") pod \"crc-debug-l2vj8\" (UID: \"4a651056-92ee-471d-b978-0fff9a179a8f\") " pod="openshift-must-gather-tcx8h/crc-debug-l2vj8" Oct 09 16:54:25 crc kubenswrapper[4719]: I1009 16:54:25.837133 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvmbb\" (UniqueName: \"kubernetes.io/projected/4a651056-92ee-471d-b978-0fff9a179a8f-kube-api-access-zvmbb\") pod \"crc-debug-l2vj8\" (UID: \"4a651056-92ee-471d-b978-0fff9a179a8f\") " 
pod="openshift-must-gather-tcx8h/crc-debug-l2vj8" Oct 09 16:54:25 crc kubenswrapper[4719]: I1009 16:54:25.939338 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvmbb\" (UniqueName: \"kubernetes.io/projected/4a651056-92ee-471d-b978-0fff9a179a8f-kube-api-access-zvmbb\") pod \"crc-debug-l2vj8\" (UID: \"4a651056-92ee-471d-b978-0fff9a179a8f\") " pod="openshift-must-gather-tcx8h/crc-debug-l2vj8" Oct 09 16:54:25 crc kubenswrapper[4719]: I1009 16:54:25.939524 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4a651056-92ee-471d-b978-0fff9a179a8f-host\") pod \"crc-debug-l2vj8\" (UID: \"4a651056-92ee-471d-b978-0fff9a179a8f\") " pod="openshift-must-gather-tcx8h/crc-debug-l2vj8" Oct 09 16:54:25 crc kubenswrapper[4719]: I1009 16:54:25.939611 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4a651056-92ee-471d-b978-0fff9a179a8f-host\") pod \"crc-debug-l2vj8\" (UID: \"4a651056-92ee-471d-b978-0fff9a179a8f\") " pod="openshift-must-gather-tcx8h/crc-debug-l2vj8" Oct 09 16:54:25 crc kubenswrapper[4719]: I1009 16:54:25.964205 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvmbb\" (UniqueName: \"kubernetes.io/projected/4a651056-92ee-471d-b978-0fff9a179a8f-kube-api-access-zvmbb\") pod \"crc-debug-l2vj8\" (UID: \"4a651056-92ee-471d-b978-0fff9a179a8f\") " pod="openshift-must-gather-tcx8h/crc-debug-l2vj8" Oct 09 16:54:26 crc kubenswrapper[4719]: I1009 16:54:26.110047 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tcx8h/crc-debug-l2vj8" Oct 09 16:54:26 crc kubenswrapper[4719]: I1009 16:54:26.481906 4719 generic.go:334] "Generic (PLEG): container finished" podID="4a651056-92ee-471d-b978-0fff9a179a8f" containerID="0843449edee7b53f8dc766e9b79e0312a25356a76a62519886708cd1d2f67a09" exitCode=0 Oct 09 16:54:26 crc kubenswrapper[4719]: I1009 16:54:26.482189 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tcx8h/crc-debug-l2vj8" event={"ID":"4a651056-92ee-471d-b978-0fff9a179a8f","Type":"ContainerDied","Data":"0843449edee7b53f8dc766e9b79e0312a25356a76a62519886708cd1d2f67a09"} Oct 09 16:54:26 crc kubenswrapper[4719]: I1009 16:54:26.482614 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tcx8h/crc-debug-l2vj8" event={"ID":"4a651056-92ee-471d-b978-0fff9a179a8f","Type":"ContainerStarted","Data":"0b7efd20eee040882ab867d14515e9681bbf574576ef66a73b8415b6e403b4ae"} Oct 09 16:54:26 crc kubenswrapper[4719]: I1009 16:54:26.529466 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-tcx8h/crc-debug-l2vj8"] Oct 09 16:54:26 crc kubenswrapper[4719]: I1009 16:54:26.539617 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-tcx8h/crc-debug-l2vj8"] Oct 09 16:54:27 crc kubenswrapper[4719]: I1009 16:54:27.612251 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tcx8h/crc-debug-l2vj8" Oct 09 16:54:27 crc kubenswrapper[4719]: I1009 16:54:27.671232 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4a651056-92ee-471d-b978-0fff9a179a8f-host\") pod \"4a651056-92ee-471d-b978-0fff9a179a8f\" (UID: \"4a651056-92ee-471d-b978-0fff9a179a8f\") " Oct 09 16:54:27 crc kubenswrapper[4719]: I1009 16:54:27.671317 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvmbb\" (UniqueName: \"kubernetes.io/projected/4a651056-92ee-471d-b978-0fff9a179a8f-kube-api-access-zvmbb\") pod \"4a651056-92ee-471d-b978-0fff9a179a8f\" (UID: \"4a651056-92ee-471d-b978-0fff9a179a8f\") " Oct 09 16:54:27 crc kubenswrapper[4719]: I1009 16:54:27.671344 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4a651056-92ee-471d-b978-0fff9a179a8f-host" (OuterVolumeSpecName: "host") pod "4a651056-92ee-471d-b978-0fff9a179a8f" (UID: "4a651056-92ee-471d-b978-0fff9a179a8f"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 16:54:27 crc kubenswrapper[4719]: I1009 16:54:27.671854 4719 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4a651056-92ee-471d-b978-0fff9a179a8f-host\") on node \"crc\" DevicePath \"\"" Oct 09 16:54:27 crc kubenswrapper[4719]: I1009 16:54:27.676779 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a651056-92ee-471d-b978-0fff9a179a8f-kube-api-access-zvmbb" (OuterVolumeSpecName: "kube-api-access-zvmbb") pod "4a651056-92ee-471d-b978-0fff9a179a8f" (UID: "4a651056-92ee-471d-b978-0fff9a179a8f"). InnerVolumeSpecName "kube-api-access-zvmbb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 16:54:27 crc kubenswrapper[4719]: I1009 16:54:27.774050 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvmbb\" (UniqueName: \"kubernetes.io/projected/4a651056-92ee-471d-b978-0fff9a179a8f-kube-api-access-zvmbb\") on node \"crc\" DevicePath \"\"" Oct 09 16:54:28 crc kubenswrapper[4719]: I1009 16:54:28.504250 4719 scope.go:117] "RemoveContainer" containerID="0843449edee7b53f8dc766e9b79e0312a25356a76a62519886708cd1d2f67a09" Oct 09 16:54:28 crc kubenswrapper[4719]: I1009 16:54:28.504321 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tcx8h/crc-debug-l2vj8" Oct 09 16:54:28 crc kubenswrapper[4719]: I1009 16:54:28.528678 4719 scope.go:117] "RemoveContainer" containerID="30dffeaaed714faadf04ee26e38b1aa2496be5d91df1d1696b070438d2bc5847" Oct 09 16:54:29 crc kubenswrapper[4719]: I1009 16:54:29.188421 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a651056-92ee-471d-b978-0fff9a179a8f" path="/var/lib/kubelet/pods/4a651056-92ee-471d-b978-0fff9a179a8f/volumes" Oct 09 16:54:36 crc kubenswrapper[4719]: I1009 16:54:36.976548 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 16:54:36 crc kubenswrapper[4719]: I1009 16:54:36.977085 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 16:54:36 crc kubenswrapper[4719]: I1009 16:54:36.977127 4719 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" Oct 09 16:54:36 crc kubenswrapper[4719]: I1009 16:54:36.977913 4719 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cb433a3bb385cb61f77433e06e45a3b88f2918e28fef2f3286d20b0b2ec3257e"} pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 16:54:36 crc kubenswrapper[4719]: I1009 16:54:36.977959 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" containerID="cri-o://cb433a3bb385cb61f77433e06e45a3b88f2918e28fef2f3286d20b0b2ec3257e" gracePeriod=600 Oct 09 16:54:37 crc kubenswrapper[4719]: I1009 16:54:37.582591 4719 generic.go:334] "Generic (PLEG): container finished" podID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerID="cb433a3bb385cb61f77433e06e45a3b88f2918e28fef2f3286d20b0b2ec3257e" exitCode=0 Oct 09 16:54:37 crc kubenswrapper[4719]: I1009 16:54:37.582653 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerDied","Data":"cb433a3bb385cb61f77433e06e45a3b88f2918e28fef2f3286d20b0b2ec3257e"} Oct 09 16:54:37 crc kubenswrapper[4719]: I1009 16:54:37.583006 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerStarted","Data":"20566c16e7a94ebae587edc494a04ff82794280a90c676730ee03cece4dfb016"} Oct 09 16:54:37 crc kubenswrapper[4719]: I1009 16:54:37.583033 4719 scope.go:117] "RemoveContainer" 
containerID="d7d38bf7ba8f9d934644b1093a0a8aa65e0c062854c0b93f0aea2fed354e8199" Oct 09 16:54:55 crc kubenswrapper[4719]: I1009 16:54:55.546568 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7b4958cb64-65wft_0a621c39-47f2-4b25-ac34-cf712d8b27c3/barbican-api/0.log" Oct 09 16:54:55 crc kubenswrapper[4719]: I1009 16:54:55.674099 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7b4958cb64-65wft_0a621c39-47f2-4b25-ac34-cf712d8b27c3/barbican-api-log/0.log" Oct 09 16:54:55 crc kubenswrapper[4719]: I1009 16:54:55.735275 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7ccff5b764-rskpw_276c7ea1-10eb-4a7d-9eb1-50c62518b5b4/barbican-keystone-listener/0.log" Oct 09 16:54:55 crc kubenswrapper[4719]: I1009 16:54:55.854313 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7ccff5b764-rskpw_276c7ea1-10eb-4a7d-9eb1-50c62518b5b4/barbican-keystone-listener-log/0.log" Oct 09 16:54:55 crc kubenswrapper[4719]: I1009 16:54:55.916047 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-68857c4d7f-ns5gc_e305acce-34be-4503-b643-b60e4201ecfa/barbican-worker/0.log" Oct 09 16:54:55 crc kubenswrapper[4719]: I1009 16:54:55.951299 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-68857c4d7f-ns5gc_e305acce-34be-4503-b643-b60e4201ecfa/barbican-worker-log/0.log" Oct 09 16:54:56 crc kubenswrapper[4719]: I1009 16:54:56.155528 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-jhrxh_75c7240d-03e4-40f9-a915-c85892b060d9/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 16:54:56 crc kubenswrapper[4719]: I1009 16:54:56.273525 4719 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_c2a64def-d060-46e7-8792-835bb734a809/ceilometer-central-agent/0.log" Oct 09 16:54:56 crc kubenswrapper[4719]: I1009 16:54:56.361970 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c2a64def-d060-46e7-8792-835bb734a809/ceilometer-notification-agent/0.log" Oct 09 16:54:56 crc kubenswrapper[4719]: I1009 16:54:56.389313 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c2a64def-d060-46e7-8792-835bb734a809/proxy-httpd/0.log" Oct 09 16:54:56 crc kubenswrapper[4719]: I1009 16:54:56.390075 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c2a64def-d060-46e7-8792-835bb734a809/sg-core/0.log" Oct 09 16:54:56 crc kubenswrapper[4719]: I1009 16:54:56.598053 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_f5b332c9-c154-4ef0-8921-4e329b4b504a/cinder-api-log/0.log" Oct 09 16:54:56 crc kubenswrapper[4719]: I1009 16:54:56.849628 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_7fd9ad9a-1651-46cc-9c22-adae6a548ef8/probe/0.log" Oct 09 16:54:57 crc kubenswrapper[4719]: I1009 16:54:57.159081 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_46e46828-5596-4987-8998-c52dbaf93086/cinder-scheduler/0.log" Oct 09 16:54:57 crc kubenswrapper[4719]: I1009 16:54:57.198836 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_46e46828-5596-4987-8998-c52dbaf93086/probe/0.log" Oct 09 16:54:57 crc kubenswrapper[4719]: I1009 16:54:57.225943 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_7fd9ad9a-1651-46cc-9c22-adae6a548ef8/cinder-backup/0.log" Oct 09 16:54:57 crc kubenswrapper[4719]: I1009 16:54:57.316244 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_f5b332c9-c154-4ef0-8921-4e329b4b504a/cinder-api/0.log" Oct 
09 16:54:57 crc kubenswrapper[4719]: I1009 16:54:57.430123 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_b95abe4c-159b-460a-b238-3be4b341ccc2/probe/0.log" Oct 09 16:54:57 crc kubenswrapper[4719]: I1009 16:54:57.660237 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_b95abe4c-159b-460a-b238-3be4b341ccc2/cinder-volume/0.log" Oct 09 16:54:57 crc kubenswrapper[4719]: I1009 16:54:57.665683 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_c3633b1f-2c6e-4483-8255-71551f8a25db/probe/0.log" Oct 09 16:54:57 crc kubenswrapper[4719]: I1009 16:54:57.775692 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_c3633b1f-2c6e-4483-8255-71551f8a25db/cinder-volume/0.log" Oct 09 16:54:57 crc kubenswrapper[4719]: I1009 16:54:57.899592 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-fstf7_6368a031-4a2d-43bd-a289-fd9966d38182/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 16:54:58 crc kubenswrapper[4719]: I1009 16:54:57.999979 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-b2llw_6fa2621b-c679-4391-9058-cd2a871264df/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 16:54:58 crc kubenswrapper[4719]: I1009 16:54:58.083482 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-blvrz_c2ecd37c-0c41-4b8f-8072-c690aa729218/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 16:54:58 crc kubenswrapper[4719]: I1009 16:54:58.175270 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-756dcbffdc-f275x_78bd028a-6324-402f-80e4-5a712e07bfb6/init/0.log" Oct 09 16:54:58 crc kubenswrapper[4719]: I1009 16:54:58.428952 4719 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-756dcbffdc-f275x_78bd028a-6324-402f-80e4-5a712e07bfb6/init/0.log" Oct 09 16:54:58 crc kubenswrapper[4719]: I1009 16:54:58.538413 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-r8nbl_49f3b180-01ca-489f-9a12-5e22d186b1b7/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 16:54:58 crc kubenswrapper[4719]: I1009 16:54:58.569826 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-756dcbffdc-f275x_78bd028a-6324-402f-80e4-5a712e07bfb6/dnsmasq-dns/0.log" Oct 09 16:54:58 crc kubenswrapper[4719]: I1009 16:54:58.690662 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_b3bab132-2f43-4321-99c6-6164f0f93e86/glance-httpd/0.log" Oct 09 16:54:58 crc kubenswrapper[4719]: I1009 16:54:58.775020 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_b3bab132-2f43-4321-99c6-6164f0f93e86/glance-log/0.log" Oct 09 16:54:58 crc kubenswrapper[4719]: I1009 16:54:58.901841 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_de5609da-6273-4076-9f02-b6c4614ebd07/glance-httpd/0.log" Oct 09 16:54:58 crc kubenswrapper[4719]: I1009 16:54:58.914702 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_de5609da-6273-4076-9f02-b6c4614ebd07/glance-log/0.log" Oct 09 16:54:59 crc kubenswrapper[4719]: I1009 16:54:59.131185 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6f5bc696cd-sgqb2_a66fd9c2-b3cc-43db-b520-6972ce53871f/horizon/0.log" Oct 09 16:54:59 crc kubenswrapper[4719]: I1009 16:54:59.196159 4719 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-kgtnw_976acf87-d11d-47a4-ad0d-2119fc70504c/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 16:54:59 crc kubenswrapper[4719]: I1009 16:54:59.389856 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-fm7mp_1f37f188-1b48-4b10-a085-e6a44d7e16d5/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 16:54:59 crc kubenswrapper[4719]: I1009 16:54:59.793983 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29333761-x56tx_88187911-7d06-4147-97ad-9279f3e101e0/keystone-cron/0.log" Oct 09 16:54:59 crc kubenswrapper[4719]: I1009 16:54:59.979629 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_1f083e47-9fa6-462e-b596-8665719a2e4f/kube-state-metrics/0.log" Oct 09 16:55:00 crc kubenswrapper[4719]: I1009 16:55:00.137010 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6f5bc696cd-sgqb2_a66fd9c2-b3cc-43db-b520-6972ce53871f/horizon-log/0.log" Oct 09 16:55:00 crc kubenswrapper[4719]: I1009 16:55:00.248577 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-69dbc5fbc7-t286g_ad01be3d-57da-4019-8059-f0a78501266b/keystone-api/0.log" Oct 09 16:55:00 crc kubenswrapper[4719]: I1009 16:55:00.282822 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-7mv72_2e6147d7-8fb5-4ce2-a4b1-2227f85b2b2f/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 16:55:00 crc kubenswrapper[4719]: I1009 16:55:00.656548 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-jr9dg_d36d0870-b55a-4791-9554-11d38e304e92/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 16:55:00 crc kubenswrapper[4719]: I1009 16:55:00.845235 4719 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_neutron-57c55b4b47-8npb9_f4a6c362-de01-454a-a0d8-7ea4c677720c/neutron-httpd/0.log" Oct 09 16:55:00 crc kubenswrapper[4719]: I1009 16:55:00.960542 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-57c55b4b47-8npb9_f4a6c362-de01-454a-a0d8-7ea4c677720c/neutron-api/0.log" Oct 09 16:55:01 crc kubenswrapper[4719]: I1009 16:55:01.513074 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_1a46282a-5af1-483f-97a4-b96fd855dc00/nova-cell0-conductor-conductor/0.log" Oct 09 16:55:01 crc kubenswrapper[4719]: I1009 16:55:01.724243 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_d0cbbab9-8de8-43f9-bf34-b235d2fb4400/nova-cell1-conductor-conductor/0.log" Oct 09 16:55:02 crc kubenswrapper[4719]: I1009 16:55:02.171605 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_0ff2cdae-bf76-4452-9d8d-26560a89a2da/nova-cell1-novncproxy-novncproxy/0.log" Oct 09 16:55:02 crc kubenswrapper[4719]: I1009 16:55:02.262071 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-vc5ss_23a47423-b3ad-4ba3-b0ab-9a452d485f2b/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 16:55:02 crc kubenswrapper[4719]: I1009 16:55:02.473231 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_c3ae86dd-d13b-46b0-8f6f-a913c783a884/nova-api-log/0.log" Oct 09 16:55:02 crc kubenswrapper[4719]: I1009 16:55:02.564079 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_c1cdf53c-cd57-4e4c-85b0-178a7bc15043/nova-metadata-log/0.log" Oct 09 16:55:02 crc kubenswrapper[4719]: I1009 16:55:02.992495 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_c3ae86dd-d13b-46b0-8f6f-a913c783a884/nova-api-api/0.log" Oct 09 16:55:03 crc 
kubenswrapper[4719]: I1009 16:55:03.151719 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d6970b67-4ebd-401d-838b-8be92b8ba72f/mysql-bootstrap/0.log" Oct 09 16:55:03 crc kubenswrapper[4719]: I1009 16:55:03.247763 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_c8b2b868-83d7-496b-8036-a10584724f35/nova-scheduler-scheduler/0.log" Oct 09 16:55:03 crc kubenswrapper[4719]: I1009 16:55:03.381586 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d6970b67-4ebd-401d-838b-8be92b8ba72f/galera/0.log" Oct 09 16:55:03 crc kubenswrapper[4719]: I1009 16:55:03.385157 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d6970b67-4ebd-401d-838b-8be92b8ba72f/mysql-bootstrap/0.log" Oct 09 16:55:03 crc kubenswrapper[4719]: I1009 16:55:03.590795 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_05ff8a95-a910-4095-930b-e42c575bf4b8/mysql-bootstrap/0.log" Oct 09 16:55:03 crc kubenswrapper[4719]: I1009 16:55:03.767695 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_05ff8a95-a910-4095-930b-e42c575bf4b8/mysql-bootstrap/0.log" Oct 09 16:55:03 crc kubenswrapper[4719]: I1009 16:55:03.807333 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_05ff8a95-a910-4095-930b-e42c575bf4b8/galera/0.log" Oct 09 16:55:03 crc kubenswrapper[4719]: I1009 16:55:03.993385 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_893b05af-4bf3-4c76-940c-3ed1cceb7e18/openstackclient/0.log" Oct 09 16:55:04 crc kubenswrapper[4719]: I1009 16:55:04.111580 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-skb56_a6539f12-5508-4c6d-870a-d19815ba3120/openstack-network-exporter/0.log" Oct 09 16:55:04 crc kubenswrapper[4719]: I1009 
16:55:04.284401 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hqfvq_07b112ef-0e6a-4927-93e4-d5fc023e495f/ovsdb-server-init/0.log" Oct 09 16:55:04 crc kubenswrapper[4719]: I1009 16:55:04.446149 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hqfvq_07b112ef-0e6a-4927-93e4-d5fc023e495f/ovsdb-server-init/0.log" Oct 09 16:55:04 crc kubenswrapper[4719]: I1009 16:55:04.489410 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hqfvq_07b112ef-0e6a-4927-93e4-d5fc023e495f/ovsdb-server/0.log" Oct 09 16:55:04 crc kubenswrapper[4719]: I1009 16:55:04.698931 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-p4t6l_f0151a18-0608-47b9-b58a-7eef9dfaf31b/ovn-controller/0.log" Oct 09 16:55:04 crc kubenswrapper[4719]: I1009 16:55:04.942675 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hqfvq_07b112ef-0e6a-4927-93e4-d5fc023e495f/ovs-vswitchd/0.log" Oct 09 16:55:04 crc kubenswrapper[4719]: I1009 16:55:04.971194 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-7f2xg_a768f51e-2990-40f5-84df-13c410d05385/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 16:55:05 crc kubenswrapper[4719]: I1009 16:55:05.196516 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_4276fa06-e8dc-40e0-8276-eaf58420e0ca/openstack-network-exporter/0.log" Oct 09 16:55:05 crc kubenswrapper[4719]: I1009 16:55:05.197432 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_c1cdf53c-cd57-4e4c-85b0-178a7bc15043/nova-metadata-metadata/0.log" Oct 09 16:55:05 crc kubenswrapper[4719]: I1009 16:55:05.227423 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_4276fa06-e8dc-40e0-8276-eaf58420e0ca/ovn-northd/0.log" Oct 09 16:55:05 crc 
kubenswrapper[4719]: I1009 16:55:05.409614 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_29d7fec9-be2c-4fa8-9191-5ffaf287f825/openstack-network-exporter/0.log" Oct 09 16:55:05 crc kubenswrapper[4719]: I1009 16:55:05.422565 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_29d7fec9-be2c-4fa8-9191-5ffaf287f825/ovsdbserver-nb/0.log" Oct 09 16:55:05 crc kubenswrapper[4719]: I1009 16:55:05.667258 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c/openstack-network-exporter/0.log" Oct 09 16:55:05 crc kubenswrapper[4719]: I1009 16:55:05.667283 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_dbbe3e0c-44f1-4ad5-89a9-70d73acfc81c/ovsdbserver-sb/0.log" Oct 09 16:55:05 crc kubenswrapper[4719]: I1009 16:55:05.957936 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_9cebaae5-69d4-4429-a062-aef6cafb9f4a/init-config-reloader/0.log" Oct 09 16:55:05 crc kubenswrapper[4719]: I1009 16:55:05.977564 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5db5d6b746-l6xlx_494a5aaa-f833-4429-bb35-d745fcdf4ad1/placement-api/0.log" Oct 09 16:55:06 crc kubenswrapper[4719]: I1009 16:55:06.149190 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5db5d6b746-l6xlx_494a5aaa-f833-4429-bb35-d745fcdf4ad1/placement-log/0.log" Oct 09 16:55:06 crc kubenswrapper[4719]: I1009 16:55:06.181414 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_9cebaae5-69d4-4429-a062-aef6cafb9f4a/init-config-reloader/0.log" Oct 09 16:55:06 crc kubenswrapper[4719]: I1009 16:55:06.187187 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_9cebaae5-69d4-4429-a062-aef6cafb9f4a/config-reloader/0.log" Oct 09 
16:55:06 crc kubenswrapper[4719]: I1009 16:55:06.202556 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_9cebaae5-69d4-4429-a062-aef6cafb9f4a/prometheus/0.log" Oct 09 16:55:06 crc kubenswrapper[4719]: I1009 16:55:06.398405 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8/setup-container/0.log" Oct 09 16:55:06 crc kubenswrapper[4719]: I1009 16:55:06.412598 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_9cebaae5-69d4-4429-a062-aef6cafb9f4a/thanos-sidecar/0.log" Oct 09 16:55:06 crc kubenswrapper[4719]: I1009 16:55:06.572403 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8/setup-container/0.log" Oct 09 16:55:06 crc kubenswrapper[4719]: I1009 16:55:06.640618 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_1df540c9-8b54-44a5-9c5d-03cf736ee67a/setup-container/0.log" Oct 09 16:55:06 crc kubenswrapper[4719]: I1009 16:55:06.674932 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_256b6f76-4a1f-43d5-bbe4-fcc45d0a59b8/rabbitmq/0.log" Oct 09 16:55:06 crc kubenswrapper[4719]: I1009 16:55:06.819309 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_1df540c9-8b54-44a5-9c5d-03cf736ee67a/setup-container/0.log" Oct 09 16:55:06 crc kubenswrapper[4719]: I1009 16:55:06.840598 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_1df540c9-8b54-44a5-9c5d-03cf736ee67a/rabbitmq/0.log" Oct 09 16:55:06 crc kubenswrapper[4719]: I1009 16:55:06.918560 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_cf67e8f5-acbb-4033-bcca-d9c86d2be88c/setup-container/0.log" Oct 09 
16:55:07 crc kubenswrapper[4719]: I1009 16:55:07.141645 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_cf67e8f5-acbb-4033-bcca-d9c86d2be88c/rabbitmq/0.log" Oct 09 16:55:07 crc kubenswrapper[4719]: I1009 16:55:07.200944 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_cf67e8f5-acbb-4033-bcca-d9c86d2be88c/setup-container/0.log" Oct 09 16:55:07 crc kubenswrapper[4719]: I1009 16:55:07.228259 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-s5j4v_39e39eb0-02e7-46b7-82be-38cbb9e1bf19/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 16:55:07 crc kubenswrapper[4719]: I1009 16:55:07.349591 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-g2mb5_2cbe17ac-7862-4175-9d90-10fe6c51cfb4/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 16:55:07 crc kubenswrapper[4719]: I1009 16:55:07.454543 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-48j65_35cce4cf-e1ff-44fb-9f62-887951a77275/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 16:55:07 crc kubenswrapper[4719]: I1009 16:55:07.676989 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-h4kn5_39fd920e-4d39-4926-b9d2-3c3c02ebb9ed/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 16:55:07 crc kubenswrapper[4719]: I1009 16:55:07.700851 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-hfwbm_7df6a296-587c-407c-b2b4-ec923cd05cda/ssh-known-hosts-edpm-deployment/0.log" Oct 09 16:55:07 crc kubenswrapper[4719]: I1009 16:55:07.924303 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-556bc79449-9bdkb_6cbef595-0a78-4655-85ca-b329f51067af/proxy-server/0.log" Oct 
09 16:55:08 crc kubenswrapper[4719]: I1009 16:55:08.043089 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-dd7hf_5685c463-d342-436a-a619-f809a2559691/swift-ring-rebalance/0.log" Oct 09 16:55:08 crc kubenswrapper[4719]: I1009 16:55:08.132667 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-556bc79449-9bdkb_6cbef595-0a78-4655-85ca-b329f51067af/proxy-httpd/0.log" Oct 09 16:55:08 crc kubenswrapper[4719]: I1009 16:55:08.180975 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0358ac72-8c58-4e63-843e-b9eaa35aefdf/account-auditor/0.log" Oct 09 16:55:08 crc kubenswrapper[4719]: I1009 16:55:08.248004 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0358ac72-8c58-4e63-843e-b9eaa35aefdf/account-reaper/0.log" Oct 09 16:55:08 crc kubenswrapper[4719]: I1009 16:55:08.396846 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0358ac72-8c58-4e63-843e-b9eaa35aefdf/account-server/0.log" Oct 09 16:55:08 crc kubenswrapper[4719]: I1009 16:55:08.409996 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0358ac72-8c58-4e63-843e-b9eaa35aefdf/account-replicator/0.log" Oct 09 16:55:08 crc kubenswrapper[4719]: I1009 16:55:08.419560 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0358ac72-8c58-4e63-843e-b9eaa35aefdf/container-auditor/0.log" Oct 09 16:55:08 crc kubenswrapper[4719]: I1009 16:55:08.518177 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0358ac72-8c58-4e63-843e-b9eaa35aefdf/container-replicator/0.log" Oct 09 16:55:08 crc kubenswrapper[4719]: I1009 16:55:08.579162 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0358ac72-8c58-4e63-843e-b9eaa35aefdf/container-server/0.log" Oct 09 16:55:08 crc kubenswrapper[4719]: I1009 
16:55:08.615190 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0358ac72-8c58-4e63-843e-b9eaa35aefdf/container-updater/0.log" Oct 09 16:55:08 crc kubenswrapper[4719]: I1009 16:55:08.657176 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0358ac72-8c58-4e63-843e-b9eaa35aefdf/object-auditor/0.log" Oct 09 16:55:08 crc kubenswrapper[4719]: I1009 16:55:08.769830 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0358ac72-8c58-4e63-843e-b9eaa35aefdf/object-expirer/0.log" Oct 09 16:55:08 crc kubenswrapper[4719]: I1009 16:55:08.777111 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0358ac72-8c58-4e63-843e-b9eaa35aefdf/object-server/0.log" Oct 09 16:55:08 crc kubenswrapper[4719]: I1009 16:55:08.864171 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0358ac72-8c58-4e63-843e-b9eaa35aefdf/object-replicator/0.log" Oct 09 16:55:08 crc kubenswrapper[4719]: I1009 16:55:08.880720 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0358ac72-8c58-4e63-843e-b9eaa35aefdf/object-updater/0.log" Oct 09 16:55:08 crc kubenswrapper[4719]: I1009 16:55:08.961389 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0358ac72-8c58-4e63-843e-b9eaa35aefdf/swift-recon-cron/0.log" Oct 09 16:55:09 crc kubenswrapper[4719]: I1009 16:55:09.015080 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0358ac72-8c58-4e63-843e-b9eaa35aefdf/rsync/0.log" Oct 09 16:55:09 crc kubenswrapper[4719]: I1009 16:55:09.126343 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-hr6gt_0d3cdd16-36d0-40d7-8f12-62c79d0e0c9a/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 16:55:09 crc kubenswrapper[4719]: I1009 16:55:09.282050 4719 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_8905824a-8f15-4df7-b938-b63f2a5aebb1/tempest-tests-tempest-tests-runner/0.log" Oct 09 16:55:09 crc kubenswrapper[4719]: I1009 16:55:09.328032 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_0677443b-23bd-4727-a28c-34f602835052/test-operator-logs-container/0.log" Oct 09 16:55:09 crc kubenswrapper[4719]: I1009 16:55:09.485519 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-mbmpb_506813a5-78ae-4083-8d8f-27f6a46858c8/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 16:55:10 crc kubenswrapper[4719]: I1009 16:55:10.348844 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_b9629af3-81da-4d90-a2a8-735ac9bdaeb2/watcher-applier/0.log" Oct 09 16:55:10 crc kubenswrapper[4719]: I1009 16:55:10.793343 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_ead85b36-611c-47d5-8eb2-cddfecffaa77/watcher-api-log/0.log" Oct 09 16:55:13 crc kubenswrapper[4719]: I1009 16:55:13.874581 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_c5d4f8ef-6b73-4d97-9899-49865bf6d744/watcher-decision-engine/0.log" Oct 09 16:55:15 crc kubenswrapper[4719]: I1009 16:55:15.053225 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_ead85b36-611c-47d5-8eb2-cddfecffaa77/watcher-api/0.log" Oct 09 16:55:23 crc kubenswrapper[4719]: I1009 16:55:23.286163 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_c7495027-5c56-46e2-9947-1ad2d6bcaf28/memcached/0.log" Oct 09 16:55:37 crc kubenswrapper[4719]: I1009 16:55:37.573091 4719 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt_57fba553-2ed7-4d57-94af-f8322ebd87d3/util/0.log" Oct 09 16:55:37 crc kubenswrapper[4719]: I1009 16:55:37.758478 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt_57fba553-2ed7-4d57-94af-f8322ebd87d3/pull/0.log" Oct 09 16:55:37 crc kubenswrapper[4719]: I1009 16:55:37.774007 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt_57fba553-2ed7-4d57-94af-f8322ebd87d3/util/0.log" Oct 09 16:55:37 crc kubenswrapper[4719]: I1009 16:55:37.810861 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt_57fba553-2ed7-4d57-94af-f8322ebd87d3/pull/0.log" Oct 09 16:55:37 crc kubenswrapper[4719]: I1009 16:55:37.929168 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt_57fba553-2ed7-4d57-94af-f8322ebd87d3/util/0.log" Oct 09 16:55:37 crc kubenswrapper[4719]: I1009 16:55:37.969113 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt_57fba553-2ed7-4d57-94af-f8322ebd87d3/pull/0.log" Oct 09 16:55:38 crc kubenswrapper[4719]: I1009 16:55:38.002198 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_748a94f4c7250aeb81698430451b20334fd83217ab20309854025d820dt7tgt_57fba553-2ed7-4d57-94af-f8322ebd87d3/extract/0.log" Oct 09 16:55:38 crc kubenswrapper[4719]: I1009 16:55:38.123677 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-kxkjp_ab93ff28-c8ec-4514-bd82-dbab0fe25cee/kube-rbac-proxy/0.log" Oct 09 16:55:38 crc 
kubenswrapper[4719]: I1009 16:55:38.198614 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-kxkjp_ab93ff28-c8ec-4514-bd82-dbab0fe25cee/manager/0.log" Oct 09 16:55:38 crc kubenswrapper[4719]: I1009 16:55:38.241398 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-pnb2g_f4b78ea6-51d8-4a7a-b5d3-cc4bdc3b5ba4/kube-rbac-proxy/0.log" Oct 09 16:55:38 crc kubenswrapper[4719]: I1009 16:55:38.346847 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-pnb2g_f4b78ea6-51d8-4a7a-b5d3-cc4bdc3b5ba4/manager/0.log" Oct 09 16:55:38 crc kubenswrapper[4719]: I1009 16:55:38.455269 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-wzn6r_f013ff43-3cb6-47f5-bc35-a4bf02143db0/kube-rbac-proxy/0.log" Oct 09 16:55:38 crc kubenswrapper[4719]: I1009 16:55:38.525459 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-wzn6r_f013ff43-3cb6-47f5-bc35-a4bf02143db0/manager/0.log" Oct 09 16:55:38 crc kubenswrapper[4719]: I1009 16:55:38.621723 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-pvjzc_d5e1695b-e7fb-4c23-9848-c6abacde588c/kube-rbac-proxy/0.log" Oct 09 16:55:38 crc kubenswrapper[4719]: I1009 16:55:38.695772 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-pvjzc_d5e1695b-e7fb-4c23-9848-c6abacde588c/manager/0.log" Oct 09 16:55:38 crc kubenswrapper[4719]: I1009 16:55:38.806534 4719 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-z4mpg_582c5c2a-a5b2-43bf-bbdb-4c3fb1b21c09/kube-rbac-proxy/0.log" Oct 09 16:55:38 crc kubenswrapper[4719]: I1009 16:55:38.854184 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-z4mpg_582c5c2a-a5b2-43bf-bbdb-4c3fb1b21c09/manager/0.log" Oct 09 16:55:38 crc kubenswrapper[4719]: I1009 16:55:38.940528 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-jg6r2_64ce70f3-641d-4dfd-811e-c786365c9859/kube-rbac-proxy/0.log" Oct 09 16:55:39 crc kubenswrapper[4719]: I1009 16:55:39.060190 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-jg6r2_64ce70f3-641d-4dfd-811e-c786365c9859/manager/0.log" Oct 09 16:55:39 crc kubenswrapper[4719]: I1009 16:55:39.099949 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-jzwqf_3bc5e8dd-bc95-4b65-afda-a821512a89dd/kube-rbac-proxy/0.log" Oct 09 16:55:39 crc kubenswrapper[4719]: I1009 16:55:39.235832 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-j945k_8b59c5dc-f309-48cc-9c66-7a5c42050f8e/kube-rbac-proxy/0.log" Oct 09 16:55:39 crc kubenswrapper[4719]: I1009 16:55:39.305013 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-jzwqf_3bc5e8dd-bc95-4b65-afda-a821512a89dd/manager/0.log" Oct 09 16:55:39 crc kubenswrapper[4719]: I1009 16:55:39.359789 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-j945k_8b59c5dc-f309-48cc-9c66-7a5c42050f8e/manager/0.log" Oct 09 16:55:39 crc kubenswrapper[4719]: I1009 16:55:39.735137 4719 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-6zjhd_651b9dd5-bce9-4ca0-b6f7-cca0c3fb30eb/kube-rbac-proxy/0.log" Oct 09 16:55:39 crc kubenswrapper[4719]: I1009 16:55:39.803291 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-6zjhd_651b9dd5-bce9-4ca0-b6f7-cca0c3fb30eb/manager/0.log" Oct 09 16:55:39 crc kubenswrapper[4719]: I1009 16:55:39.836148 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-tfk7f_2292f494-d606-40b2-bb8b-7dcc6e9dfeb4/kube-rbac-proxy/0.log" Oct 09 16:55:39 crc kubenswrapper[4719]: I1009 16:55:39.918876 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-tfk7f_2292f494-d606-40b2-bb8b-7dcc6e9dfeb4/manager/0.log" Oct 09 16:55:39 crc kubenswrapper[4719]: I1009 16:55:39.997287 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-zj22f_bc8d9b2a-7a74-40f1-8a70-e8f0013fad38/kube-rbac-proxy/0.log" Oct 09 16:55:40 crc kubenswrapper[4719]: I1009 16:55:40.091541 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-zj22f_bc8d9b2a-7a74-40f1-8a70-e8f0013fad38/manager/0.log" Oct 09 16:55:40 crc kubenswrapper[4719]: I1009 16:55:40.175080 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-qbtsl_14a3f87a-25c5-476e-8379-0b15d3511315/kube-rbac-proxy/0.log" Oct 09 16:55:40 crc kubenswrapper[4719]: I1009 16:55:40.304582 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-qbtsl_14a3f87a-25c5-476e-8379-0b15d3511315/manager/0.log" Oct 09 16:55:40 crc 
kubenswrapper[4719]: I1009 16:55:40.334717 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-tmxgv_6b82d858-736f-487f-ba35-c1478301b229/kube-rbac-proxy/0.log" Oct 09 16:55:40 crc kubenswrapper[4719]: I1009 16:55:40.521570 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-tmxgv_6b82d858-736f-487f-ba35-c1478301b229/manager/0.log" Oct 09 16:55:40 crc kubenswrapper[4719]: I1009 16:55:40.541566 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-d284h_308fe096-8aff-4a3b-a83a-bb2b1ef8c5df/manager/0.log" Oct 09 16:55:40 crc kubenswrapper[4719]: I1009 16:55:40.567648 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-d284h_308fe096-8aff-4a3b-a83a-bb2b1ef8c5df/kube-rbac-proxy/0.log" Oct 09 16:55:40 crc kubenswrapper[4719]: I1009 16:55:40.732028 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757dx5hrk_5584dd28-d59b-41bf-b24a-ec18d01029e1/kube-rbac-proxy/0.log" Oct 09 16:55:40 crc kubenswrapper[4719]: I1009 16:55:40.767906 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757dx5hrk_5584dd28-d59b-41bf-b24a-ec18d01029e1/manager/0.log" Oct 09 16:55:41 crc kubenswrapper[4719]: I1009 16:55:41.016300 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-67f69c4d95-5p5fq_406c7514-3092-45dc-abde-352acbfa0108/kube-rbac-proxy/0.log" Oct 09 16:55:41 crc kubenswrapper[4719]: I1009 16:55:41.021458 4719 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6cf9d6bfd4-rw5j8_a64c087b-46a2-4c1b-abf9-ce21ce6f9688/kube-rbac-proxy/0.log" Oct 09 16:55:41 crc kubenswrapper[4719]: I1009 16:55:41.259564 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-cmjrg_21fdeec4-a518-4f1f-a27d-50d49e078d3d/registry-server/0.log" Oct 09 16:55:41 crc kubenswrapper[4719]: I1009 16:55:41.396327 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6cf9d6bfd4-rw5j8_a64c087b-46a2-4c1b-abf9-ce21ce6f9688/operator/0.log" Oct 09 16:55:41 crc kubenswrapper[4719]: I1009 16:55:41.514126 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-869cc7797f-k8dck_607972ec-63ef-43a7-a1ed-0aab9fffc680/kube-rbac-proxy/0.log" Oct 09 16:55:41 crc kubenswrapper[4719]: I1009 16:55:41.566460 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-869cc7797f-k8dck_607972ec-63ef-43a7-a1ed-0aab9fffc680/manager/0.log" Oct 09 16:55:41 crc kubenswrapper[4719]: I1009 16:55:41.632997 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-2td8x_08380711-65b1-4957-80ba-36c2f064e618/kube-rbac-proxy/0.log" Oct 09 16:55:41 crc kubenswrapper[4719]: I1009 16:55:41.862906 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-2td8x_08380711-65b1-4957-80ba-36c2f064e618/manager/0.log" Oct 09 16:55:41 crc kubenswrapper[4719]: I1009 16:55:41.898266 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-r4m97_6776ccc8-9114-46e5-a2a2-699f8917bfac/operator/0.log" Oct 09 16:55:42 crc kubenswrapper[4719]: I1009 16:55:42.091960 4719 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-lh9st_6d60ce50-53c4-47c1-b222-88b92c43fd4d/kube-rbac-proxy/0.log" Oct 09 16:55:42 crc kubenswrapper[4719]: I1009 16:55:42.159007 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-578874c84d-w6fj7_c56b1641-8023-4761-a55f-763dfe5f7c4f/kube-rbac-proxy/0.log" Oct 09 16:55:42 crc kubenswrapper[4719]: I1009 16:55:42.165268 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-lh9st_6d60ce50-53c4-47c1-b222-88b92c43fd4d/manager/0.log" Oct 09 16:55:42 crc kubenswrapper[4719]: I1009 16:55:42.251361 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-67f69c4d95-5p5fq_406c7514-3092-45dc-abde-352acbfa0108/manager/0.log" Oct 09 16:55:42 crc kubenswrapper[4719]: I1009 16:55:42.431680 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-ffcdd6c94-q9cm2_288a232e-38ff-44b7-9fda-738becefc8d7/kube-rbac-proxy/0.log" Oct 09 16:55:42 crc kubenswrapper[4719]: I1009 16:55:42.500312 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-ffcdd6c94-q9cm2_288a232e-38ff-44b7-9fda-738becefc8d7/manager/0.log" Oct 09 16:55:42 crc kubenswrapper[4719]: I1009 16:55:42.604748 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-cc79478c-885gj_44ee4b27-7bdd-4a5e-98ed-1b8b5f01b54f/kube-rbac-proxy/0.log" Oct 09 16:55:42 crc kubenswrapper[4719]: I1009 16:55:42.606232 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-578874c84d-w6fj7_c56b1641-8023-4761-a55f-763dfe5f7c4f/manager/0.log" Oct 09 16:55:42 crc 
kubenswrapper[4719]: I1009 16:55:42.689014 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-cc79478c-885gj_44ee4b27-7bdd-4a5e-98ed-1b8b5f01b54f/manager/0.log" Oct 09 16:55:57 crc kubenswrapper[4719]: I1009 16:55:57.159984 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-2d4s7_4b4724c8-6007-4df3-b822-42d08ea33fde/control-plane-machine-set-operator/0.log" Oct 09 16:55:57 crc kubenswrapper[4719]: I1009 16:55:57.331718 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-75q5v_f07d2126-7037-4b5c-aa67-4d09bf873e07/kube-rbac-proxy/0.log" Oct 09 16:55:57 crc kubenswrapper[4719]: I1009 16:55:57.357127 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-75q5v_f07d2126-7037-4b5c-aa67-4d09bf873e07/machine-api-operator/0.log" Oct 09 16:56:08 crc kubenswrapper[4719]: I1009 16:56:08.113611 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-2kc72_419868d8-7886-45fb-be57-2c476ba8d305/cert-manager-controller/0.log" Oct 09 16:56:08 crc kubenswrapper[4719]: I1009 16:56:08.326659 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-mr6c9_a2b1f94e-9754-4aeb-9d99-a5c2258290ca/cert-manager-cainjector/0.log" Oct 09 16:56:08 crc kubenswrapper[4719]: I1009 16:56:08.395255 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-s42jm_79d0f7b5-f165-44ee-8220-f31bcc6df1fd/cert-manager-webhook/0.log" Oct 09 16:56:19 crc kubenswrapper[4719]: I1009 16:56:19.142742 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-jz9ct_db938ad9-d041-4874-855a-83d6fa385b3e/nmstate-console-plugin/0.log" Oct 09 
16:56:19 crc kubenswrapper[4719]: I1009 16:56:19.330045 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-wgq7f_07cbbe5f-3176-4cfa-97e0-a7b3e6613c7b/kube-rbac-proxy/0.log" Oct 09 16:56:19 crc kubenswrapper[4719]: I1009 16:56:19.349639 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-xwghc_feb926a1-9332-41f0-80b7-b100b62f8664/nmstate-handler/0.log" Oct 09 16:56:19 crc kubenswrapper[4719]: I1009 16:56:19.389162 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-wgq7f_07cbbe5f-3176-4cfa-97e0-a7b3e6613c7b/nmstate-metrics/0.log" Oct 09 16:56:19 crc kubenswrapper[4719]: I1009 16:56:19.550697 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-l4s9n_15fe7687-ec6e-42eb-9131-980871159a78/nmstate-operator/0.log" Oct 09 16:56:19 crc kubenswrapper[4719]: I1009 16:56:19.592123 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-m6lsq_036064c5-e3a3-49a7-b457-5e64df820401/nmstate-webhook/0.log" Oct 09 16:56:32 crc kubenswrapper[4719]: I1009 16:56:32.405921 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-42kr4_bee3449b-a86a-4db6-9e57-233f95dfbad0/kube-rbac-proxy/0.log" Oct 09 16:56:32 crc kubenswrapper[4719]: I1009 16:56:32.608860 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-42kr4_bee3449b-a86a-4db6-9e57-233f95dfbad0/controller/0.log" Oct 09 16:56:32 crc kubenswrapper[4719]: I1009 16:56:32.650772 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r8hz9_f1c10c76-5d7a-4dbc-8688-2017821c1872/cp-frr-files/0.log" Oct 09 16:56:32 crc kubenswrapper[4719]: I1009 16:56:32.795695 4719 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-r8hz9_f1c10c76-5d7a-4dbc-8688-2017821c1872/cp-metrics/0.log" Oct 09 16:56:32 crc kubenswrapper[4719]: I1009 16:56:32.858367 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r8hz9_f1c10c76-5d7a-4dbc-8688-2017821c1872/cp-reloader/0.log" Oct 09 16:56:32 crc kubenswrapper[4719]: I1009 16:56:32.884362 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r8hz9_f1c10c76-5d7a-4dbc-8688-2017821c1872/cp-frr-files/0.log" Oct 09 16:56:32 crc kubenswrapper[4719]: I1009 16:56:32.884399 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r8hz9_f1c10c76-5d7a-4dbc-8688-2017821c1872/cp-reloader/0.log" Oct 09 16:56:33 crc kubenswrapper[4719]: I1009 16:56:33.045335 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r8hz9_f1c10c76-5d7a-4dbc-8688-2017821c1872/cp-frr-files/0.log" Oct 09 16:56:33 crc kubenswrapper[4719]: I1009 16:56:33.046833 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r8hz9_f1c10c76-5d7a-4dbc-8688-2017821c1872/cp-reloader/0.log" Oct 09 16:56:33 crc kubenswrapper[4719]: I1009 16:56:33.058318 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r8hz9_f1c10c76-5d7a-4dbc-8688-2017821c1872/cp-metrics/0.log" Oct 09 16:56:33 crc kubenswrapper[4719]: I1009 16:56:33.103968 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r8hz9_f1c10c76-5d7a-4dbc-8688-2017821c1872/cp-metrics/0.log" Oct 09 16:56:33 crc kubenswrapper[4719]: I1009 16:56:33.271609 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r8hz9_f1c10c76-5d7a-4dbc-8688-2017821c1872/cp-frr-files/0.log" Oct 09 16:56:33 crc kubenswrapper[4719]: I1009 16:56:33.282303 4719 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-r8hz9_f1c10c76-5d7a-4dbc-8688-2017821c1872/cp-metrics/0.log" Oct 09 16:56:33 crc kubenswrapper[4719]: I1009 16:56:33.292878 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r8hz9_f1c10c76-5d7a-4dbc-8688-2017821c1872/cp-reloader/0.log" Oct 09 16:56:33 crc kubenswrapper[4719]: I1009 16:56:33.295809 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r8hz9_f1c10c76-5d7a-4dbc-8688-2017821c1872/controller/0.log" Oct 09 16:56:33 crc kubenswrapper[4719]: I1009 16:56:33.450199 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r8hz9_f1c10c76-5d7a-4dbc-8688-2017821c1872/frr-metrics/0.log" Oct 09 16:56:33 crc kubenswrapper[4719]: I1009 16:56:33.472791 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r8hz9_f1c10c76-5d7a-4dbc-8688-2017821c1872/kube-rbac-proxy/0.log" Oct 09 16:56:33 crc kubenswrapper[4719]: I1009 16:56:33.505303 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r8hz9_f1c10c76-5d7a-4dbc-8688-2017821c1872/kube-rbac-proxy-frr/0.log" Oct 09 16:56:33 crc kubenswrapper[4719]: I1009 16:56:33.730097 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-wpvvd_1f525824-038f-49b3-9410-10b49819ee01/frr-k8s-webhook-server/0.log" Oct 09 16:56:33 crc kubenswrapper[4719]: I1009 16:56:33.743666 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r8hz9_f1c10c76-5d7a-4dbc-8688-2017821c1872/reloader/0.log" Oct 09 16:56:33 crc kubenswrapper[4719]: I1009 16:56:33.979467 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-54fb4b8cc7-c4j9j_7d280e60-020e-43c0-a430-fb220b1d8354/manager/0.log" Oct 09 16:56:34 crc kubenswrapper[4719]: I1009 16:56:34.135211 4719 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-webhook-server-645dbdc857-kl4lw_e3a6e576-c324-4600-b0e9-4a83cd64d478/webhook-server/0.log" Oct 09 16:56:34 crc kubenswrapper[4719]: I1009 16:56:34.196475 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-9qhkk_6298dd42-080d-4d5e-bf61-c798382943a7/kube-rbac-proxy/0.log" Oct 09 16:56:34 crc kubenswrapper[4719]: I1009 16:56:34.950494 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-9qhkk_6298dd42-080d-4d5e-bf61-c798382943a7/speaker/0.log" Oct 09 16:56:35 crc kubenswrapper[4719]: I1009 16:56:35.308901 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r8hz9_f1c10c76-5d7a-4dbc-8688-2017821c1872/frr/0.log" Oct 09 16:56:45 crc kubenswrapper[4719]: I1009 16:56:45.981435 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46_4b460c47-e24c-46c7-bf23-0e5b5d6819bd/util/0.log" Oct 09 16:56:46 crc kubenswrapper[4719]: I1009 16:56:46.129318 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46_4b460c47-e24c-46c7-bf23-0e5b5d6819bd/util/0.log" Oct 09 16:56:46 crc kubenswrapper[4719]: I1009 16:56:46.157076 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46_4b460c47-e24c-46c7-bf23-0e5b5d6819bd/pull/0.log" Oct 09 16:56:46 crc kubenswrapper[4719]: I1009 16:56:46.252453 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46_4b460c47-e24c-46c7-bf23-0e5b5d6819bd/pull/0.log" Oct 09 16:56:46 crc kubenswrapper[4719]: I1009 16:56:46.352897 4719 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46_4b460c47-e24c-46c7-bf23-0e5b5d6819bd/util/0.log" Oct 09 16:56:46 crc kubenswrapper[4719]: I1009 16:56:46.401526 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46_4b460c47-e24c-46c7-bf23-0e5b5d6819bd/pull/0.log" Oct 09 16:56:46 crc kubenswrapper[4719]: I1009 16:56:46.409464 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2zwm46_4b460c47-e24c-46c7-bf23-0e5b5d6819bd/extract/0.log" Oct 09 16:56:46 crc kubenswrapper[4719]: I1009 16:56:46.544339 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4_2c118d4a-6a5b-4138-90ff-2270ea2dabd9/util/0.log" Oct 09 16:56:46 crc kubenswrapper[4719]: I1009 16:56:46.762720 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4_2c118d4a-6a5b-4138-90ff-2270ea2dabd9/util/0.log" Oct 09 16:56:46 crc kubenswrapper[4719]: I1009 16:56:46.770949 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4_2c118d4a-6a5b-4138-90ff-2270ea2dabd9/pull/0.log" Oct 09 16:56:46 crc kubenswrapper[4719]: I1009 16:56:46.794570 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4_2c118d4a-6a5b-4138-90ff-2270ea2dabd9/pull/0.log" Oct 09 16:56:46 crc kubenswrapper[4719]: I1009 16:56:46.949795 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4_2c118d4a-6a5b-4138-90ff-2270ea2dabd9/pull/0.log" Oct 09 
16:56:46 crc kubenswrapper[4719]: I1009 16:56:46.972814 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4_2c118d4a-6a5b-4138-90ff-2270ea2dabd9/extract/0.log" Oct 09 16:56:46 crc kubenswrapper[4719]: I1009 16:56:46.985955 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dg2mt4_2c118d4a-6a5b-4138-90ff-2270ea2dabd9/util/0.log" Oct 09 16:56:47 crc kubenswrapper[4719]: I1009 16:56:47.188607 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zjkt4_430dd9b6-25a9-482d-8fa6-d2dec5d84507/extract-utilities/0.log" Oct 09 16:56:47 crc kubenswrapper[4719]: I1009 16:56:47.509529 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zjkt4_430dd9b6-25a9-482d-8fa6-d2dec5d84507/extract-content/0.log" Oct 09 16:56:47 crc kubenswrapper[4719]: I1009 16:56:47.532925 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zjkt4_430dd9b6-25a9-482d-8fa6-d2dec5d84507/extract-content/0.log" Oct 09 16:56:47 crc kubenswrapper[4719]: I1009 16:56:47.554333 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zjkt4_430dd9b6-25a9-482d-8fa6-d2dec5d84507/extract-utilities/0.log" Oct 09 16:56:47 crc kubenswrapper[4719]: I1009 16:56:47.743732 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zjkt4_430dd9b6-25a9-482d-8fa6-d2dec5d84507/extract-utilities/0.log" Oct 09 16:56:47 crc kubenswrapper[4719]: I1009 16:56:47.750695 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zjkt4_430dd9b6-25a9-482d-8fa6-d2dec5d84507/extract-content/0.log" Oct 09 16:56:47 crc kubenswrapper[4719]: I1009 16:56:47.970634 
4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f56vs_400debb1-678f-4731-84d3-8d0b3c455305/extract-utilities/0.log" Oct 09 16:56:48 crc kubenswrapper[4719]: I1009 16:56:48.326070 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f56vs_400debb1-678f-4731-84d3-8d0b3c455305/extract-content/0.log" Oct 09 16:56:48 crc kubenswrapper[4719]: I1009 16:56:48.326089 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f56vs_400debb1-678f-4731-84d3-8d0b3c455305/extract-utilities/0.log" Oct 09 16:56:48 crc kubenswrapper[4719]: I1009 16:56:48.375547 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f56vs_400debb1-678f-4731-84d3-8d0b3c455305/extract-content/0.log" Oct 09 16:56:48 crc kubenswrapper[4719]: I1009 16:56:48.446628 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zjkt4_430dd9b6-25a9-482d-8fa6-d2dec5d84507/registry-server/0.log" Oct 09 16:56:48 crc kubenswrapper[4719]: I1009 16:56:48.567584 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f56vs_400debb1-678f-4731-84d3-8d0b3c455305/extract-utilities/0.log" Oct 09 16:56:48 crc kubenswrapper[4719]: I1009 16:56:48.603071 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f56vs_400debb1-678f-4731-84d3-8d0b3c455305/extract-content/0.log" Oct 09 16:56:48 crc kubenswrapper[4719]: I1009 16:56:48.860245 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz_57298b91-7d64-40fd-be0e-c400cdfd9b93/util/0.log" Oct 09 16:56:48 crc kubenswrapper[4719]: I1009 16:56:48.997047 4719 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz_57298b91-7d64-40fd-be0e-c400cdfd9b93/util/0.log" Oct 09 16:56:49 crc kubenswrapper[4719]: I1009 16:56:49.032219 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz_57298b91-7d64-40fd-be0e-c400cdfd9b93/pull/0.log" Oct 09 16:56:49 crc kubenswrapper[4719]: I1009 16:56:49.104872 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz_57298b91-7d64-40fd-be0e-c400cdfd9b93/pull/0.log" Oct 09 16:56:49 crc kubenswrapper[4719]: I1009 16:56:49.393188 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz_57298b91-7d64-40fd-be0e-c400cdfd9b93/pull/0.log" Oct 09 16:56:49 crc kubenswrapper[4719]: I1009 16:56:49.405531 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz_57298b91-7d64-40fd-be0e-c400cdfd9b93/extract/0.log" Oct 09 16:56:49 crc kubenswrapper[4719]: I1009 16:56:49.424383 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cbn5tz_57298b91-7d64-40fd-be0e-c400cdfd9b93/util/0.log" Oct 09 16:56:49 crc kubenswrapper[4719]: I1009 16:56:49.520180 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f56vs_400debb1-678f-4731-84d3-8d0b3c455305/registry-server/0.log" Oct 09 16:56:49 crc kubenswrapper[4719]: I1009 16:56:49.620670 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-gj4pz_9080569c-497b-4281-a120-7c538380a16c/marketplace-operator/0.log" Oct 09 16:56:49 crc kubenswrapper[4719]: 
I1009 16:56:49.721069 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8d5wd_f4c82774-ac3f-4330-b575-1cfb72f5dbf7/extract-utilities/0.log" Oct 09 16:56:49 crc kubenswrapper[4719]: I1009 16:56:49.892010 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8d5wd_f4c82774-ac3f-4330-b575-1cfb72f5dbf7/extract-content/0.log" Oct 09 16:56:49 crc kubenswrapper[4719]: I1009 16:56:49.949515 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8d5wd_f4c82774-ac3f-4330-b575-1cfb72f5dbf7/extract-content/0.log" Oct 09 16:56:49 crc kubenswrapper[4719]: I1009 16:56:49.953517 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8d5wd_f4c82774-ac3f-4330-b575-1cfb72f5dbf7/extract-utilities/0.log" Oct 09 16:56:50 crc kubenswrapper[4719]: I1009 16:56:50.109910 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8d5wd_f4c82774-ac3f-4330-b575-1cfb72f5dbf7/extract-content/0.log" Oct 09 16:56:50 crc kubenswrapper[4719]: I1009 16:56:50.182528 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vk29b_0ad43ecb-75f5-4453-89e5-2c7891c537a7/extract-utilities/0.log" Oct 09 16:56:50 crc kubenswrapper[4719]: I1009 16:56:50.188328 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8d5wd_f4c82774-ac3f-4330-b575-1cfb72f5dbf7/extract-utilities/0.log" Oct 09 16:56:50 crc kubenswrapper[4719]: I1009 16:56:50.351560 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vk29b_0ad43ecb-75f5-4453-89e5-2c7891c537a7/extract-content/0.log" Oct 09 16:56:50 crc kubenswrapper[4719]: I1009 16:56:50.370048 4719 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-8d5wd_f4c82774-ac3f-4330-b575-1cfb72f5dbf7/registry-server/0.log" Oct 09 16:56:50 crc kubenswrapper[4719]: I1009 16:56:50.414458 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vk29b_0ad43ecb-75f5-4453-89e5-2c7891c537a7/extract-content/0.log" Oct 09 16:56:50 crc kubenswrapper[4719]: I1009 16:56:50.424201 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vk29b_0ad43ecb-75f5-4453-89e5-2c7891c537a7/extract-utilities/0.log" Oct 09 16:56:50 crc kubenswrapper[4719]: I1009 16:56:50.594477 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vk29b_0ad43ecb-75f5-4453-89e5-2c7891c537a7/extract-utilities/0.log" Oct 09 16:56:50 crc kubenswrapper[4719]: I1009 16:56:50.594828 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vk29b_0ad43ecb-75f5-4453-89e5-2c7891c537a7/extract-content/0.log" Oct 09 16:56:51 crc kubenswrapper[4719]: I1009 16:56:51.243274 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vk29b_0ad43ecb-75f5-4453-89e5-2c7891c537a7/registry-server/0.log" Oct 09 16:57:02 crc kubenswrapper[4719]: I1009 16:57:02.534313 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-z6j76_50a5cc44-22d4-4ef1-a265-800ebc36afd4/prometheus-operator/0.log" Oct 09 16:57:02 crc kubenswrapper[4719]: I1009 16:57:02.781694 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5bfc464c98-2f97q_1ae73acd-0d93-4281-b807-4798a207506b/prometheus-operator-admission-webhook/0.log" Oct 09 16:57:02 crc kubenswrapper[4719]: I1009 16:57:02.783295 4719 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5bfc464c98-6px5r_db3811fa-89b7-44a6-94e8-92ca398d8d2c/prometheus-operator-admission-webhook/0.log" Oct 09 16:57:03 crc kubenswrapper[4719]: I1009 16:57:03.028516 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-ff5rt_efab9597-d673-43a0-bedd-f1ec483ae194/operator/0.log" Oct 09 16:57:03 crc kubenswrapper[4719]: I1009 16:57:03.066591 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-kc8zw_4eb8b96a-c47f-424d-bcbc-20ff193b8d7f/perses-operator/0.log" Oct 09 16:57:06 crc kubenswrapper[4719]: I1009 16:57:06.976910 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 16:57:06 crc kubenswrapper[4719]: I1009 16:57:06.977530 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 16:57:36 crc kubenswrapper[4719]: I1009 16:57:36.156707 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n2j5g"] Oct 09 16:57:36 crc kubenswrapper[4719]: E1009 16:57:36.157808 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a651056-92ee-471d-b978-0fff9a179a8f" containerName="container-00" Oct 09 16:57:36 crc kubenswrapper[4719]: I1009 16:57:36.157826 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a651056-92ee-471d-b978-0fff9a179a8f" containerName="container-00" Oct 09 16:57:36 crc 
kubenswrapper[4719]: I1009 16:57:36.158103 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a651056-92ee-471d-b978-0fff9a179a8f" containerName="container-00" Oct 09 16:57:36 crc kubenswrapper[4719]: I1009 16:57:36.160517 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n2j5g" Oct 09 16:57:36 crc kubenswrapper[4719]: I1009 16:57:36.188148 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n2j5g"] Oct 09 16:57:36 crc kubenswrapper[4719]: I1009 16:57:36.245178 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlr7c\" (UniqueName: \"kubernetes.io/projected/95ed4cf1-e01a-4647-9916-22a6497f829c-kube-api-access-jlr7c\") pod \"redhat-operators-n2j5g\" (UID: \"95ed4cf1-e01a-4647-9916-22a6497f829c\") " pod="openshift-marketplace/redhat-operators-n2j5g" Oct 09 16:57:36 crc kubenswrapper[4719]: I1009 16:57:36.245556 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95ed4cf1-e01a-4647-9916-22a6497f829c-catalog-content\") pod \"redhat-operators-n2j5g\" (UID: \"95ed4cf1-e01a-4647-9916-22a6497f829c\") " pod="openshift-marketplace/redhat-operators-n2j5g" Oct 09 16:57:36 crc kubenswrapper[4719]: I1009 16:57:36.245652 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95ed4cf1-e01a-4647-9916-22a6497f829c-utilities\") pod \"redhat-operators-n2j5g\" (UID: \"95ed4cf1-e01a-4647-9916-22a6497f829c\") " pod="openshift-marketplace/redhat-operators-n2j5g" Oct 09 16:57:36 crc kubenswrapper[4719]: I1009 16:57:36.348844 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/95ed4cf1-e01a-4647-9916-22a6497f829c-catalog-content\") pod \"redhat-operators-n2j5g\" (UID: \"95ed4cf1-e01a-4647-9916-22a6497f829c\") " pod="openshift-marketplace/redhat-operators-n2j5g" Oct 09 16:57:36 crc kubenswrapper[4719]: I1009 16:57:36.348887 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95ed4cf1-e01a-4647-9916-22a6497f829c-utilities\") pod \"redhat-operators-n2j5g\" (UID: \"95ed4cf1-e01a-4647-9916-22a6497f829c\") " pod="openshift-marketplace/redhat-operators-n2j5g" Oct 09 16:57:36 crc kubenswrapper[4719]: I1009 16:57:36.348977 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlr7c\" (UniqueName: \"kubernetes.io/projected/95ed4cf1-e01a-4647-9916-22a6497f829c-kube-api-access-jlr7c\") pod \"redhat-operators-n2j5g\" (UID: \"95ed4cf1-e01a-4647-9916-22a6497f829c\") " pod="openshift-marketplace/redhat-operators-n2j5g" Oct 09 16:57:36 crc kubenswrapper[4719]: I1009 16:57:36.349688 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95ed4cf1-e01a-4647-9916-22a6497f829c-utilities\") pod \"redhat-operators-n2j5g\" (UID: \"95ed4cf1-e01a-4647-9916-22a6497f829c\") " pod="openshift-marketplace/redhat-operators-n2j5g" Oct 09 16:57:36 crc kubenswrapper[4719]: I1009 16:57:36.350751 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95ed4cf1-e01a-4647-9916-22a6497f829c-catalog-content\") pod \"redhat-operators-n2j5g\" (UID: \"95ed4cf1-e01a-4647-9916-22a6497f829c\") " pod="openshift-marketplace/redhat-operators-n2j5g" Oct 09 16:57:36 crc kubenswrapper[4719]: I1009 16:57:36.413936 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlr7c\" (UniqueName: 
\"kubernetes.io/projected/95ed4cf1-e01a-4647-9916-22a6497f829c-kube-api-access-jlr7c\") pod \"redhat-operators-n2j5g\" (UID: \"95ed4cf1-e01a-4647-9916-22a6497f829c\") " pod="openshift-marketplace/redhat-operators-n2j5g" Oct 09 16:57:36 crc kubenswrapper[4719]: I1009 16:57:36.490576 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n2j5g" Oct 09 16:57:36 crc kubenswrapper[4719]: I1009 16:57:36.976339 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 16:57:36 crc kubenswrapper[4719]: I1009 16:57:36.976713 4719 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 16:57:37 crc kubenswrapper[4719]: I1009 16:57:37.090650 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n2j5g"] Oct 09 16:57:37 crc kubenswrapper[4719]: I1009 16:57:37.221206 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2j5g" event={"ID":"95ed4cf1-e01a-4647-9916-22a6497f829c","Type":"ContainerStarted","Data":"643305a0741511797d4a9bbd349cd7b552fdd5dafe990fae18ab1dfccf30956f"} Oct 09 16:57:38 crc kubenswrapper[4719]: I1009 16:57:38.243187 4719 generic.go:334] "Generic (PLEG): container finished" podID="95ed4cf1-e01a-4647-9916-22a6497f829c" containerID="029d348da4490b97a6e73121cfa2908dd92d32b5fdda706ca458297d51c6d087" exitCode=0 Oct 09 16:57:38 crc kubenswrapper[4719]: I1009 16:57:38.243656 4719 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2j5g" event={"ID":"95ed4cf1-e01a-4647-9916-22a6497f829c","Type":"ContainerDied","Data":"029d348da4490b97a6e73121cfa2908dd92d32b5fdda706ca458297d51c6d087"} Oct 09 16:57:40 crc kubenswrapper[4719]: I1009 16:57:40.273463 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2j5g" event={"ID":"95ed4cf1-e01a-4647-9916-22a6497f829c","Type":"ContainerStarted","Data":"a5566f242f29967bf3a2864f3a923dba2e78b7b3211c98b321255e15d51ff02f"} Oct 09 16:57:42 crc kubenswrapper[4719]: I1009 16:57:42.291134 4719 generic.go:334] "Generic (PLEG): container finished" podID="95ed4cf1-e01a-4647-9916-22a6497f829c" containerID="a5566f242f29967bf3a2864f3a923dba2e78b7b3211c98b321255e15d51ff02f" exitCode=0 Oct 09 16:57:42 crc kubenswrapper[4719]: I1009 16:57:42.291703 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2j5g" event={"ID":"95ed4cf1-e01a-4647-9916-22a6497f829c","Type":"ContainerDied","Data":"a5566f242f29967bf3a2864f3a923dba2e78b7b3211c98b321255e15d51ff02f"} Oct 09 16:57:43 crc kubenswrapper[4719]: I1009 16:57:43.303888 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2j5g" event={"ID":"95ed4cf1-e01a-4647-9916-22a6497f829c","Type":"ContainerStarted","Data":"de3326e98617f896d55c4945901d44c5874999a7b54293e56f1b6edd302d87df"} Oct 09 16:57:43 crc kubenswrapper[4719]: I1009 16:57:43.331391 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n2j5g" podStartSLOduration=2.880762084 podStartE2EDuration="7.331371282s" podCreationTimestamp="2025-10-09 16:57:36 +0000 UTC" firstStartedPulling="2025-10-09 16:57:38.248177477 +0000 UTC m=+5963.757888762" lastFinishedPulling="2025-10-09 16:57:42.698786665 +0000 UTC m=+5968.208497960" observedRunningTime="2025-10-09 16:57:43.322892022 +0000 UTC m=+5968.832603307" 
watchObservedRunningTime="2025-10-09 16:57:43.331371282 +0000 UTC m=+5968.841082567" Oct 09 16:57:46 crc kubenswrapper[4719]: I1009 16:57:46.493294 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n2j5g" Oct 09 16:57:46 crc kubenswrapper[4719]: I1009 16:57:46.493926 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n2j5g" Oct 09 16:57:47 crc kubenswrapper[4719]: I1009 16:57:47.545297 4719 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n2j5g" podUID="95ed4cf1-e01a-4647-9916-22a6497f829c" containerName="registry-server" probeResult="failure" output=< Oct 09 16:57:47 crc kubenswrapper[4719]: timeout: failed to connect service ":50051" within 1s Oct 09 16:57:47 crc kubenswrapper[4719]: > Oct 09 16:57:53 crc kubenswrapper[4719]: I1009 16:57:53.057745 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vcf44"] Oct 09 16:57:53 crc kubenswrapper[4719]: I1009 16:57:53.061313 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vcf44" Oct 09 16:57:53 crc kubenswrapper[4719]: I1009 16:57:53.083952 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vcf44"] Oct 09 16:57:53 crc kubenswrapper[4719]: I1009 16:57:53.104143 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrtvr\" (UniqueName: \"kubernetes.io/projected/85de7f80-2aec-49cf-b646-e7639e5a5be0-kube-api-access-qrtvr\") pod \"community-operators-vcf44\" (UID: \"85de7f80-2aec-49cf-b646-e7639e5a5be0\") " pod="openshift-marketplace/community-operators-vcf44" Oct 09 16:57:53 crc kubenswrapper[4719]: I1009 16:57:53.104384 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85de7f80-2aec-49cf-b646-e7639e5a5be0-utilities\") pod \"community-operators-vcf44\" (UID: \"85de7f80-2aec-49cf-b646-e7639e5a5be0\") " pod="openshift-marketplace/community-operators-vcf44" Oct 09 16:57:53 crc kubenswrapper[4719]: I1009 16:57:53.104628 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85de7f80-2aec-49cf-b646-e7639e5a5be0-catalog-content\") pod \"community-operators-vcf44\" (UID: \"85de7f80-2aec-49cf-b646-e7639e5a5be0\") " pod="openshift-marketplace/community-operators-vcf44" Oct 09 16:57:53 crc kubenswrapper[4719]: I1009 16:57:53.206717 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85de7f80-2aec-49cf-b646-e7639e5a5be0-catalog-content\") pod \"community-operators-vcf44\" (UID: \"85de7f80-2aec-49cf-b646-e7639e5a5be0\") " pod="openshift-marketplace/community-operators-vcf44" Oct 09 16:57:53 crc kubenswrapper[4719]: I1009 16:57:53.207052 4719 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85de7f80-2aec-49cf-b646-e7639e5a5be0-catalog-content\") pod \"community-operators-vcf44\" (UID: \"85de7f80-2aec-49cf-b646-e7639e5a5be0\") " pod="openshift-marketplace/community-operators-vcf44" Oct 09 16:57:53 crc kubenswrapper[4719]: I1009 16:57:53.207257 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrtvr\" (UniqueName: \"kubernetes.io/projected/85de7f80-2aec-49cf-b646-e7639e5a5be0-kube-api-access-qrtvr\") pod \"community-operators-vcf44\" (UID: \"85de7f80-2aec-49cf-b646-e7639e5a5be0\") " pod="openshift-marketplace/community-operators-vcf44" Oct 09 16:57:53 crc kubenswrapper[4719]: I1009 16:57:53.207806 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85de7f80-2aec-49cf-b646-e7639e5a5be0-utilities\") pod \"community-operators-vcf44\" (UID: \"85de7f80-2aec-49cf-b646-e7639e5a5be0\") " pod="openshift-marketplace/community-operators-vcf44" Oct 09 16:57:53 crc kubenswrapper[4719]: I1009 16:57:53.208182 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85de7f80-2aec-49cf-b646-e7639e5a5be0-utilities\") pod \"community-operators-vcf44\" (UID: \"85de7f80-2aec-49cf-b646-e7639e5a5be0\") " pod="openshift-marketplace/community-operators-vcf44" Oct 09 16:57:53 crc kubenswrapper[4719]: I1009 16:57:53.228343 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrtvr\" (UniqueName: \"kubernetes.io/projected/85de7f80-2aec-49cf-b646-e7639e5a5be0-kube-api-access-qrtvr\") pod \"community-operators-vcf44\" (UID: \"85de7f80-2aec-49cf-b646-e7639e5a5be0\") " pod="openshift-marketplace/community-operators-vcf44" Oct 09 16:57:53 crc kubenswrapper[4719]: I1009 16:57:53.402167 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vcf44" Oct 09 16:57:53 crc kubenswrapper[4719]: I1009 16:57:53.983778 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vcf44"] Oct 09 16:57:54 crc kubenswrapper[4719]: I1009 16:57:54.412135 4719 generic.go:334] "Generic (PLEG): container finished" podID="85de7f80-2aec-49cf-b646-e7639e5a5be0" containerID="84f1559efda0e881df3a352a92c3ade9d20a527266d60f3a27fe935918a8cac5" exitCode=0 Oct 09 16:57:54 crc kubenswrapper[4719]: I1009 16:57:54.412210 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vcf44" event={"ID":"85de7f80-2aec-49cf-b646-e7639e5a5be0","Type":"ContainerDied","Data":"84f1559efda0e881df3a352a92c3ade9d20a527266d60f3a27fe935918a8cac5"} Oct 09 16:57:54 crc kubenswrapper[4719]: I1009 16:57:54.412244 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vcf44" event={"ID":"85de7f80-2aec-49cf-b646-e7639e5a5be0","Type":"ContainerStarted","Data":"28f95280fa056b764df4f8b443bcb4edf05ada775a3207365652bd3f69221002"} Oct 09 16:57:55 crc kubenswrapper[4719]: I1009 16:57:55.422733 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vcf44" event={"ID":"85de7f80-2aec-49cf-b646-e7639e5a5be0","Type":"ContainerStarted","Data":"1348db1774de2796acb0fc717fb2bf84546de2e744ab415215b5c69c56c6e6a7"} Oct 09 16:57:56 crc kubenswrapper[4719]: I1009 16:57:56.444929 4719 generic.go:334] "Generic (PLEG): container finished" podID="85de7f80-2aec-49cf-b646-e7639e5a5be0" containerID="1348db1774de2796acb0fc717fb2bf84546de2e744ab415215b5c69c56c6e6a7" exitCode=0 Oct 09 16:57:56 crc kubenswrapper[4719]: I1009 16:57:56.444976 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vcf44" 
event={"ID":"85de7f80-2aec-49cf-b646-e7639e5a5be0","Type":"ContainerDied","Data":"1348db1774de2796acb0fc717fb2bf84546de2e744ab415215b5c69c56c6e6a7"} Oct 09 16:57:56 crc kubenswrapper[4719]: I1009 16:57:56.447416 4719 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 16:57:56 crc kubenswrapper[4719]: I1009 16:57:56.551398 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n2j5g" Oct 09 16:57:56 crc kubenswrapper[4719]: I1009 16:57:56.607217 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n2j5g" Oct 09 16:57:57 crc kubenswrapper[4719]: I1009 16:57:57.458515 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vcf44" event={"ID":"85de7f80-2aec-49cf-b646-e7639e5a5be0","Type":"ContainerStarted","Data":"14f9aba5bb6b953b15f6b59c83e96f5acc0d573f16010f8b416b4cea489228ac"} Oct 09 16:57:57 crc kubenswrapper[4719]: I1009 16:57:57.492336 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vcf44" podStartSLOduration=2.068564864 podStartE2EDuration="4.492301878s" podCreationTimestamp="2025-10-09 16:57:53 +0000 UTC" firstStartedPulling="2025-10-09 16:57:54.414818923 +0000 UTC m=+5979.924530208" lastFinishedPulling="2025-10-09 16:57:56.838555937 +0000 UTC m=+5982.348267222" observedRunningTime="2025-10-09 16:57:57.481572626 +0000 UTC m=+5982.991284001" watchObservedRunningTime="2025-10-09 16:57:57.492301878 +0000 UTC m=+5983.002013203" Oct 09 16:57:58 crc kubenswrapper[4719]: I1009 16:57:58.831434 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n2j5g"] Oct 09 16:57:58 crc kubenswrapper[4719]: I1009 16:57:58.831975 4719 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-n2j5g" podUID="95ed4cf1-e01a-4647-9916-22a6497f829c" containerName="registry-server" containerID="cri-o://de3326e98617f896d55c4945901d44c5874999a7b54293e56f1b6edd302d87df" gracePeriod=2 Oct 09 16:57:59 crc kubenswrapper[4719]: I1009 16:57:59.332171 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n2j5g" Oct 09 16:57:59 crc kubenswrapper[4719]: I1009 16:57:59.450846 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95ed4cf1-e01a-4647-9916-22a6497f829c-catalog-content\") pod \"95ed4cf1-e01a-4647-9916-22a6497f829c\" (UID: \"95ed4cf1-e01a-4647-9916-22a6497f829c\") " Oct 09 16:57:59 crc kubenswrapper[4719]: I1009 16:57:59.450917 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlr7c\" (UniqueName: \"kubernetes.io/projected/95ed4cf1-e01a-4647-9916-22a6497f829c-kube-api-access-jlr7c\") pod \"95ed4cf1-e01a-4647-9916-22a6497f829c\" (UID: \"95ed4cf1-e01a-4647-9916-22a6497f829c\") " Oct 09 16:57:59 crc kubenswrapper[4719]: I1009 16:57:59.451008 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95ed4cf1-e01a-4647-9916-22a6497f829c-utilities\") pod \"95ed4cf1-e01a-4647-9916-22a6497f829c\" (UID: \"95ed4cf1-e01a-4647-9916-22a6497f829c\") " Oct 09 16:57:59 crc kubenswrapper[4719]: I1009 16:57:59.453002 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95ed4cf1-e01a-4647-9916-22a6497f829c-utilities" (OuterVolumeSpecName: "utilities") pod "95ed4cf1-e01a-4647-9916-22a6497f829c" (UID: "95ed4cf1-e01a-4647-9916-22a6497f829c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:57:59 crc kubenswrapper[4719]: I1009 16:57:59.458952 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95ed4cf1-e01a-4647-9916-22a6497f829c-kube-api-access-jlr7c" (OuterVolumeSpecName: "kube-api-access-jlr7c") pod "95ed4cf1-e01a-4647-9916-22a6497f829c" (UID: "95ed4cf1-e01a-4647-9916-22a6497f829c"). InnerVolumeSpecName "kube-api-access-jlr7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 16:57:59 crc kubenswrapper[4719]: I1009 16:57:59.483201 4719 generic.go:334] "Generic (PLEG): container finished" podID="95ed4cf1-e01a-4647-9916-22a6497f829c" containerID="de3326e98617f896d55c4945901d44c5874999a7b54293e56f1b6edd302d87df" exitCode=0 Oct 09 16:57:59 crc kubenswrapper[4719]: I1009 16:57:59.483255 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2j5g" event={"ID":"95ed4cf1-e01a-4647-9916-22a6497f829c","Type":"ContainerDied","Data":"de3326e98617f896d55c4945901d44c5874999a7b54293e56f1b6edd302d87df"} Oct 09 16:57:59 crc kubenswrapper[4719]: I1009 16:57:59.483273 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n2j5g" Oct 09 16:57:59 crc kubenswrapper[4719]: I1009 16:57:59.483292 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2j5g" event={"ID":"95ed4cf1-e01a-4647-9916-22a6497f829c","Type":"ContainerDied","Data":"643305a0741511797d4a9bbd349cd7b552fdd5dafe990fae18ab1dfccf30956f"} Oct 09 16:57:59 crc kubenswrapper[4719]: I1009 16:57:59.483317 4719 scope.go:117] "RemoveContainer" containerID="de3326e98617f896d55c4945901d44c5874999a7b54293e56f1b6edd302d87df" Oct 09 16:57:59 crc kubenswrapper[4719]: I1009 16:57:59.524738 4719 scope.go:117] "RemoveContainer" containerID="a5566f242f29967bf3a2864f3a923dba2e78b7b3211c98b321255e15d51ff02f" Oct 09 16:57:59 crc kubenswrapper[4719]: I1009 16:57:59.551247 4719 scope.go:117] "RemoveContainer" containerID="029d348da4490b97a6e73121cfa2908dd92d32b5fdda706ca458297d51c6d087" Oct 09 16:57:59 crc kubenswrapper[4719]: I1009 16:57:59.553465 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95ed4cf1-e01a-4647-9916-22a6497f829c-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 16:57:59 crc kubenswrapper[4719]: I1009 16:57:59.553496 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlr7c\" (UniqueName: \"kubernetes.io/projected/95ed4cf1-e01a-4647-9916-22a6497f829c-kube-api-access-jlr7c\") on node \"crc\" DevicePath \"\"" Oct 09 16:57:59 crc kubenswrapper[4719]: I1009 16:57:59.588686 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95ed4cf1-e01a-4647-9916-22a6497f829c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "95ed4cf1-e01a-4647-9916-22a6497f829c" (UID: "95ed4cf1-e01a-4647-9916-22a6497f829c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:57:59 crc kubenswrapper[4719]: I1009 16:57:59.598420 4719 scope.go:117] "RemoveContainer" containerID="de3326e98617f896d55c4945901d44c5874999a7b54293e56f1b6edd302d87df" Oct 09 16:57:59 crc kubenswrapper[4719]: E1009 16:57:59.598993 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de3326e98617f896d55c4945901d44c5874999a7b54293e56f1b6edd302d87df\": container with ID starting with de3326e98617f896d55c4945901d44c5874999a7b54293e56f1b6edd302d87df not found: ID does not exist" containerID="de3326e98617f896d55c4945901d44c5874999a7b54293e56f1b6edd302d87df" Oct 09 16:57:59 crc kubenswrapper[4719]: I1009 16:57:59.599021 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de3326e98617f896d55c4945901d44c5874999a7b54293e56f1b6edd302d87df"} err="failed to get container status \"de3326e98617f896d55c4945901d44c5874999a7b54293e56f1b6edd302d87df\": rpc error: code = NotFound desc = could not find container \"de3326e98617f896d55c4945901d44c5874999a7b54293e56f1b6edd302d87df\": container with ID starting with de3326e98617f896d55c4945901d44c5874999a7b54293e56f1b6edd302d87df not found: ID does not exist" Oct 09 16:57:59 crc kubenswrapper[4719]: I1009 16:57:59.599040 4719 scope.go:117] "RemoveContainer" containerID="a5566f242f29967bf3a2864f3a923dba2e78b7b3211c98b321255e15d51ff02f" Oct 09 16:57:59 crc kubenswrapper[4719]: E1009 16:57:59.599396 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5566f242f29967bf3a2864f3a923dba2e78b7b3211c98b321255e15d51ff02f\": container with ID starting with a5566f242f29967bf3a2864f3a923dba2e78b7b3211c98b321255e15d51ff02f not found: ID does not exist" containerID="a5566f242f29967bf3a2864f3a923dba2e78b7b3211c98b321255e15d51ff02f" Oct 09 16:57:59 crc kubenswrapper[4719]: I1009 16:57:59.599418 
4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5566f242f29967bf3a2864f3a923dba2e78b7b3211c98b321255e15d51ff02f"} err="failed to get container status \"a5566f242f29967bf3a2864f3a923dba2e78b7b3211c98b321255e15d51ff02f\": rpc error: code = NotFound desc = could not find container \"a5566f242f29967bf3a2864f3a923dba2e78b7b3211c98b321255e15d51ff02f\": container with ID starting with a5566f242f29967bf3a2864f3a923dba2e78b7b3211c98b321255e15d51ff02f not found: ID does not exist" Oct 09 16:57:59 crc kubenswrapper[4719]: I1009 16:57:59.599436 4719 scope.go:117] "RemoveContainer" containerID="029d348da4490b97a6e73121cfa2908dd92d32b5fdda706ca458297d51c6d087" Oct 09 16:57:59 crc kubenswrapper[4719]: E1009 16:57:59.599657 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"029d348da4490b97a6e73121cfa2908dd92d32b5fdda706ca458297d51c6d087\": container with ID starting with 029d348da4490b97a6e73121cfa2908dd92d32b5fdda706ca458297d51c6d087 not found: ID does not exist" containerID="029d348da4490b97a6e73121cfa2908dd92d32b5fdda706ca458297d51c6d087" Oct 09 16:57:59 crc kubenswrapper[4719]: I1009 16:57:59.599676 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"029d348da4490b97a6e73121cfa2908dd92d32b5fdda706ca458297d51c6d087"} err="failed to get container status \"029d348da4490b97a6e73121cfa2908dd92d32b5fdda706ca458297d51c6d087\": rpc error: code = NotFound desc = could not find container \"029d348da4490b97a6e73121cfa2908dd92d32b5fdda706ca458297d51c6d087\": container with ID starting with 029d348da4490b97a6e73121cfa2908dd92d32b5fdda706ca458297d51c6d087 not found: ID does not exist" Oct 09 16:57:59 crc kubenswrapper[4719]: I1009 16:57:59.655189 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/95ed4cf1-e01a-4647-9916-22a6497f829c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 16:57:59 crc kubenswrapper[4719]: I1009 16:57:59.819799 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n2j5g"] Oct 09 16:57:59 crc kubenswrapper[4719]: I1009 16:57:59.831948 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n2j5g"] Oct 09 16:57:59 crc kubenswrapper[4719]: E1009 16:57:59.939758 4719 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95ed4cf1_e01a_4647_9916_22a6497f829c.slice\": RecentStats: unable to find data in memory cache]" Oct 09 16:58:01 crc kubenswrapper[4719]: I1009 16:58:01.175732 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95ed4cf1-e01a-4647-9916-22a6497f829c" path="/var/lib/kubelet/pods/95ed4cf1-e01a-4647-9916-22a6497f829c/volumes" Oct 09 16:58:03 crc kubenswrapper[4719]: I1009 16:58:03.402548 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vcf44" Oct 09 16:58:03 crc kubenswrapper[4719]: I1009 16:58:03.402934 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vcf44" Oct 09 16:58:03 crc kubenswrapper[4719]: I1009 16:58:03.450999 4719 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vcf44" Oct 09 16:58:03 crc kubenswrapper[4719]: I1009 16:58:03.581581 4719 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vcf44" Oct 09 16:58:03 crc kubenswrapper[4719]: I1009 16:58:03.691311 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vcf44"] Oct 09 16:58:05 crc 
kubenswrapper[4719]: I1009 16:58:05.553981 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vcf44" podUID="85de7f80-2aec-49cf-b646-e7639e5a5be0" containerName="registry-server" containerID="cri-o://14f9aba5bb6b953b15f6b59c83e96f5acc0d573f16010f8b416b4cea489228ac" gracePeriod=2 Oct 09 16:58:06 crc kubenswrapper[4719]: I1009 16:58:06.042275 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vcf44" Oct 09 16:58:06 crc kubenswrapper[4719]: I1009 16:58:06.192935 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85de7f80-2aec-49cf-b646-e7639e5a5be0-catalog-content\") pod \"85de7f80-2aec-49cf-b646-e7639e5a5be0\" (UID: \"85de7f80-2aec-49cf-b646-e7639e5a5be0\") " Oct 09 16:58:06 crc kubenswrapper[4719]: I1009 16:58:06.193052 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85de7f80-2aec-49cf-b646-e7639e5a5be0-utilities\") pod \"85de7f80-2aec-49cf-b646-e7639e5a5be0\" (UID: \"85de7f80-2aec-49cf-b646-e7639e5a5be0\") " Oct 09 16:58:06 crc kubenswrapper[4719]: I1009 16:58:06.193123 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrtvr\" (UniqueName: \"kubernetes.io/projected/85de7f80-2aec-49cf-b646-e7639e5a5be0-kube-api-access-qrtvr\") pod \"85de7f80-2aec-49cf-b646-e7639e5a5be0\" (UID: \"85de7f80-2aec-49cf-b646-e7639e5a5be0\") " Oct 09 16:58:06 crc kubenswrapper[4719]: I1009 16:58:06.194728 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85de7f80-2aec-49cf-b646-e7639e5a5be0-utilities" (OuterVolumeSpecName: "utilities") pod "85de7f80-2aec-49cf-b646-e7639e5a5be0" (UID: "85de7f80-2aec-49cf-b646-e7639e5a5be0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:58:06 crc kubenswrapper[4719]: I1009 16:58:06.201301 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85de7f80-2aec-49cf-b646-e7639e5a5be0-kube-api-access-qrtvr" (OuterVolumeSpecName: "kube-api-access-qrtvr") pod "85de7f80-2aec-49cf-b646-e7639e5a5be0" (UID: "85de7f80-2aec-49cf-b646-e7639e5a5be0"). InnerVolumeSpecName "kube-api-access-qrtvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 16:58:06 crc kubenswrapper[4719]: I1009 16:58:06.250891 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85de7f80-2aec-49cf-b646-e7639e5a5be0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "85de7f80-2aec-49cf-b646-e7639e5a5be0" (UID: "85de7f80-2aec-49cf-b646-e7639e5a5be0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:58:06 crc kubenswrapper[4719]: I1009 16:58:06.295512 4719 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85de7f80-2aec-49cf-b646-e7639e5a5be0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 16:58:06 crc kubenswrapper[4719]: I1009 16:58:06.295544 4719 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85de7f80-2aec-49cf-b646-e7639e5a5be0-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 16:58:06 crc kubenswrapper[4719]: I1009 16:58:06.295555 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrtvr\" (UniqueName: \"kubernetes.io/projected/85de7f80-2aec-49cf-b646-e7639e5a5be0-kube-api-access-qrtvr\") on node \"crc\" DevicePath \"\"" Oct 09 16:58:06 crc kubenswrapper[4719]: I1009 16:58:06.598088 4719 generic.go:334] "Generic (PLEG): container finished" podID="85de7f80-2aec-49cf-b646-e7639e5a5be0" 
containerID="14f9aba5bb6b953b15f6b59c83e96f5acc0d573f16010f8b416b4cea489228ac" exitCode=0 Oct 09 16:58:06 crc kubenswrapper[4719]: I1009 16:58:06.598161 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vcf44" Oct 09 16:58:06 crc kubenswrapper[4719]: I1009 16:58:06.598183 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vcf44" event={"ID":"85de7f80-2aec-49cf-b646-e7639e5a5be0","Type":"ContainerDied","Data":"14f9aba5bb6b953b15f6b59c83e96f5acc0d573f16010f8b416b4cea489228ac"} Oct 09 16:58:06 crc kubenswrapper[4719]: I1009 16:58:06.599388 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vcf44" event={"ID":"85de7f80-2aec-49cf-b646-e7639e5a5be0","Type":"ContainerDied","Data":"28f95280fa056b764df4f8b443bcb4edf05ada775a3207365652bd3f69221002"} Oct 09 16:58:06 crc kubenswrapper[4719]: I1009 16:58:06.599431 4719 scope.go:117] "RemoveContainer" containerID="14f9aba5bb6b953b15f6b59c83e96f5acc0d573f16010f8b416b4cea489228ac" Oct 09 16:58:06 crc kubenswrapper[4719]: I1009 16:58:06.633488 4719 scope.go:117] "RemoveContainer" containerID="1348db1774de2796acb0fc717fb2bf84546de2e744ab415215b5c69c56c6e6a7" Oct 09 16:58:06 crc kubenswrapper[4719]: I1009 16:58:06.638750 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vcf44"] Oct 09 16:58:06 crc kubenswrapper[4719]: I1009 16:58:06.647788 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vcf44"] Oct 09 16:58:06 crc kubenswrapper[4719]: I1009 16:58:06.667829 4719 scope.go:117] "RemoveContainer" containerID="84f1559efda0e881df3a352a92c3ade9d20a527266d60f3a27fe935918a8cac5" Oct 09 16:58:06 crc kubenswrapper[4719]: I1009 16:58:06.710648 4719 scope.go:117] "RemoveContainer" containerID="14f9aba5bb6b953b15f6b59c83e96f5acc0d573f16010f8b416b4cea489228ac" Oct 09 
16:58:06 crc kubenswrapper[4719]: E1009 16:58:06.711045 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14f9aba5bb6b953b15f6b59c83e96f5acc0d573f16010f8b416b4cea489228ac\": container with ID starting with 14f9aba5bb6b953b15f6b59c83e96f5acc0d573f16010f8b416b4cea489228ac not found: ID does not exist" containerID="14f9aba5bb6b953b15f6b59c83e96f5acc0d573f16010f8b416b4cea489228ac" Oct 09 16:58:06 crc kubenswrapper[4719]: I1009 16:58:06.711084 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14f9aba5bb6b953b15f6b59c83e96f5acc0d573f16010f8b416b4cea489228ac"} err="failed to get container status \"14f9aba5bb6b953b15f6b59c83e96f5acc0d573f16010f8b416b4cea489228ac\": rpc error: code = NotFound desc = could not find container \"14f9aba5bb6b953b15f6b59c83e96f5acc0d573f16010f8b416b4cea489228ac\": container with ID starting with 14f9aba5bb6b953b15f6b59c83e96f5acc0d573f16010f8b416b4cea489228ac not found: ID does not exist" Oct 09 16:58:06 crc kubenswrapper[4719]: I1009 16:58:06.711110 4719 scope.go:117] "RemoveContainer" containerID="1348db1774de2796acb0fc717fb2bf84546de2e744ab415215b5c69c56c6e6a7" Oct 09 16:58:06 crc kubenswrapper[4719]: E1009 16:58:06.711367 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1348db1774de2796acb0fc717fb2bf84546de2e744ab415215b5c69c56c6e6a7\": container with ID starting with 1348db1774de2796acb0fc717fb2bf84546de2e744ab415215b5c69c56c6e6a7 not found: ID does not exist" containerID="1348db1774de2796acb0fc717fb2bf84546de2e744ab415215b5c69c56c6e6a7" Oct 09 16:58:06 crc kubenswrapper[4719]: I1009 16:58:06.711390 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1348db1774de2796acb0fc717fb2bf84546de2e744ab415215b5c69c56c6e6a7"} err="failed to get container status 
\"1348db1774de2796acb0fc717fb2bf84546de2e744ab415215b5c69c56c6e6a7\": rpc error: code = NotFound desc = could not find container \"1348db1774de2796acb0fc717fb2bf84546de2e744ab415215b5c69c56c6e6a7\": container with ID starting with 1348db1774de2796acb0fc717fb2bf84546de2e744ab415215b5c69c56c6e6a7 not found: ID does not exist" Oct 09 16:58:06 crc kubenswrapper[4719]: I1009 16:58:06.711402 4719 scope.go:117] "RemoveContainer" containerID="84f1559efda0e881df3a352a92c3ade9d20a527266d60f3a27fe935918a8cac5" Oct 09 16:58:06 crc kubenswrapper[4719]: E1009 16:58:06.711696 4719 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84f1559efda0e881df3a352a92c3ade9d20a527266d60f3a27fe935918a8cac5\": container with ID starting with 84f1559efda0e881df3a352a92c3ade9d20a527266d60f3a27fe935918a8cac5 not found: ID does not exist" containerID="84f1559efda0e881df3a352a92c3ade9d20a527266d60f3a27fe935918a8cac5" Oct 09 16:58:06 crc kubenswrapper[4719]: I1009 16:58:06.711715 4719 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84f1559efda0e881df3a352a92c3ade9d20a527266d60f3a27fe935918a8cac5"} err="failed to get container status \"84f1559efda0e881df3a352a92c3ade9d20a527266d60f3a27fe935918a8cac5\": rpc error: code = NotFound desc = could not find container \"84f1559efda0e881df3a352a92c3ade9d20a527266d60f3a27fe935918a8cac5\": container with ID starting with 84f1559efda0e881df3a352a92c3ade9d20a527266d60f3a27fe935918a8cac5 not found: ID does not exist" Oct 09 16:58:06 crc kubenswrapper[4719]: I1009 16:58:06.976734 4719 patch_prober.go:28] interesting pod/machine-config-daemon-p9kwh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 16:58:06 crc kubenswrapper[4719]: I1009 16:58:06.976802 4719 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 16:58:06 crc kubenswrapper[4719]: I1009 16:58:06.976849 4719 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" Oct 09 16:58:06 crc kubenswrapper[4719]: I1009 16:58:06.977759 4719 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"20566c16e7a94ebae587edc494a04ff82794280a90c676730ee03cece4dfb016"} pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 16:58:06 crc kubenswrapper[4719]: I1009 16:58:06.977827 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerName="machine-config-daemon" containerID="cri-o://20566c16e7a94ebae587edc494a04ff82794280a90c676730ee03cece4dfb016" gracePeriod=600 Oct 09 16:58:07 crc kubenswrapper[4719]: E1009 16:58:07.136878 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:58:07 crc kubenswrapper[4719]: I1009 16:58:07.172027 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="85de7f80-2aec-49cf-b646-e7639e5a5be0" path="/var/lib/kubelet/pods/85de7f80-2aec-49cf-b646-e7639e5a5be0/volumes" Oct 09 16:58:07 crc kubenswrapper[4719]: I1009 16:58:07.615080 4719 generic.go:334] "Generic (PLEG): container finished" podID="99353559-5b0b-4a9e-b759-0321ef3a8a71" containerID="20566c16e7a94ebae587edc494a04ff82794280a90c676730ee03cece4dfb016" exitCode=0 Oct 09 16:58:07 crc kubenswrapper[4719]: I1009 16:58:07.615168 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" event={"ID":"99353559-5b0b-4a9e-b759-0321ef3a8a71","Type":"ContainerDied","Data":"20566c16e7a94ebae587edc494a04ff82794280a90c676730ee03cece4dfb016"} Oct 09 16:58:07 crc kubenswrapper[4719]: I1009 16:58:07.615255 4719 scope.go:117] "RemoveContainer" containerID="cb433a3bb385cb61f77433e06e45a3b88f2918e28fef2f3286d20b0b2ec3257e" Oct 09 16:58:07 crc kubenswrapper[4719]: I1009 16:58:07.616135 4719 scope.go:117] "RemoveContainer" containerID="20566c16e7a94ebae587edc494a04ff82794280a90c676730ee03cece4dfb016" Oct 09 16:58:07 crc kubenswrapper[4719]: E1009 16:58:07.616714 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:58:21 crc kubenswrapper[4719]: I1009 16:58:21.161928 4719 scope.go:117] "RemoveContainer" containerID="20566c16e7a94ebae587edc494a04ff82794280a90c676730ee03cece4dfb016" Oct 09 16:58:21 crc kubenswrapper[4719]: E1009 16:58:21.162920 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:58:36 crc kubenswrapper[4719]: I1009 16:58:36.161036 4719 scope.go:117] "RemoveContainer" containerID="20566c16e7a94ebae587edc494a04ff82794280a90c676730ee03cece4dfb016" Oct 09 16:58:36 crc kubenswrapper[4719]: E1009 16:58:36.161793 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:58:50 crc kubenswrapper[4719]: I1009 16:58:50.160814 4719 scope.go:117] "RemoveContainer" containerID="20566c16e7a94ebae587edc494a04ff82794280a90c676730ee03cece4dfb016" Oct 09 16:58:50 crc kubenswrapper[4719]: E1009 16:58:50.161655 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:59:00 crc kubenswrapper[4719]: I1009 16:59:00.149519 4719 generic.go:334] "Generic (PLEG): container finished" podID="86f1c527-f139-431a-ba26-ad6bdc2adefb" containerID="a1d984721b5db8deb4ca01d5bb3c88f4b40ef38da180b1f1f8f0a013f5794d8b" exitCode=0 Oct 09 16:59:00 crc kubenswrapper[4719]: I1009 16:59:00.149634 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-tcx8h/must-gather-f8j9c" event={"ID":"86f1c527-f139-431a-ba26-ad6bdc2adefb","Type":"ContainerDied","Data":"a1d984721b5db8deb4ca01d5bb3c88f4b40ef38da180b1f1f8f0a013f5794d8b"} Oct 09 16:59:00 crc kubenswrapper[4719]: I1009 16:59:00.150916 4719 scope.go:117] "RemoveContainer" containerID="a1d984721b5db8deb4ca01d5bb3c88f4b40ef38da180b1f1f8f0a013f5794d8b" Oct 09 16:59:00 crc kubenswrapper[4719]: I1009 16:59:00.684072 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-tcx8h_must-gather-f8j9c_86f1c527-f139-431a-ba26-ad6bdc2adefb/gather/0.log" Oct 09 16:59:04 crc kubenswrapper[4719]: I1009 16:59:04.161424 4719 scope.go:117] "RemoveContainer" containerID="20566c16e7a94ebae587edc494a04ff82794280a90c676730ee03cece4dfb016" Oct 09 16:59:04 crc kubenswrapper[4719]: E1009 16:59:04.162161 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:59:10 crc kubenswrapper[4719]: I1009 16:59:10.763163 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-tcx8h/must-gather-f8j9c"] Oct 09 16:59:10 crc kubenswrapper[4719]: I1009 16:59:10.763910 4719 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-tcx8h/must-gather-f8j9c" podUID="86f1c527-f139-431a-ba26-ad6bdc2adefb" containerName="copy" containerID="cri-o://e994fff5bdc677790c92bc69e21439b8ee8442aaf37eeb5c6059f1198ab58162" gracePeriod=2 Oct 09 16:59:10 crc kubenswrapper[4719]: I1009 16:59:10.773571 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-tcx8h/must-gather-f8j9c"] Oct 09 16:59:11 crc 
kubenswrapper[4719]: I1009 16:59:11.266590 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-tcx8h_must-gather-f8j9c_86f1c527-f139-431a-ba26-ad6bdc2adefb/copy/0.log" Oct 09 16:59:11 crc kubenswrapper[4719]: I1009 16:59:11.268338 4719 generic.go:334] "Generic (PLEG): container finished" podID="86f1c527-f139-431a-ba26-ad6bdc2adefb" containerID="e994fff5bdc677790c92bc69e21439b8ee8442aaf37eeb5c6059f1198ab58162" exitCode=143 Oct 09 16:59:11 crc kubenswrapper[4719]: I1009 16:59:11.396564 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-tcx8h_must-gather-f8j9c_86f1c527-f139-431a-ba26-ad6bdc2adefb/copy/0.log" Oct 09 16:59:11 crc kubenswrapper[4719]: I1009 16:59:11.396928 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tcx8h/must-gather-f8j9c" Oct 09 16:59:11 crc kubenswrapper[4719]: I1009 16:59:11.443188 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnn8p\" (UniqueName: \"kubernetes.io/projected/86f1c527-f139-431a-ba26-ad6bdc2adefb-kube-api-access-cnn8p\") pod \"86f1c527-f139-431a-ba26-ad6bdc2adefb\" (UID: \"86f1c527-f139-431a-ba26-ad6bdc2adefb\") " Oct 09 16:59:11 crc kubenswrapper[4719]: I1009 16:59:11.443380 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/86f1c527-f139-431a-ba26-ad6bdc2adefb-must-gather-output\") pod \"86f1c527-f139-431a-ba26-ad6bdc2adefb\" (UID: \"86f1c527-f139-431a-ba26-ad6bdc2adefb\") " Oct 09 16:59:11 crc kubenswrapper[4719]: I1009 16:59:11.450032 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86f1c527-f139-431a-ba26-ad6bdc2adefb-kube-api-access-cnn8p" (OuterVolumeSpecName: "kube-api-access-cnn8p") pod "86f1c527-f139-431a-ba26-ad6bdc2adefb" (UID: "86f1c527-f139-431a-ba26-ad6bdc2adefb"). 
InnerVolumeSpecName "kube-api-access-cnn8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 16:59:11 crc kubenswrapper[4719]: I1009 16:59:11.455266 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnn8p\" (UniqueName: \"kubernetes.io/projected/86f1c527-f139-431a-ba26-ad6bdc2adefb-kube-api-access-cnn8p\") on node \"crc\" DevicePath \"\"" Oct 09 16:59:11 crc kubenswrapper[4719]: I1009 16:59:11.661409 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86f1c527-f139-431a-ba26-ad6bdc2adefb-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "86f1c527-f139-431a-ba26-ad6bdc2adefb" (UID: "86f1c527-f139-431a-ba26-ad6bdc2adefb"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 16:59:11 crc kubenswrapper[4719]: I1009 16:59:11.761455 4719 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/86f1c527-f139-431a-ba26-ad6bdc2adefb-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 09 16:59:12 crc kubenswrapper[4719]: I1009 16:59:12.280885 4719 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-tcx8h_must-gather-f8j9c_86f1c527-f139-431a-ba26-ad6bdc2adefb/copy/0.log" Oct 09 16:59:12 crc kubenswrapper[4719]: I1009 16:59:12.281542 4719 scope.go:117] "RemoveContainer" containerID="e994fff5bdc677790c92bc69e21439b8ee8442aaf37eeb5c6059f1198ab58162" Oct 09 16:59:12 crc kubenswrapper[4719]: I1009 16:59:12.281716 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tcx8h/must-gather-f8j9c" Oct 09 16:59:12 crc kubenswrapper[4719]: I1009 16:59:12.311575 4719 scope.go:117] "RemoveContainer" containerID="a1d984721b5db8deb4ca01d5bb3c88f4b40ef38da180b1f1f8f0a013f5794d8b" Oct 09 16:59:13 crc kubenswrapper[4719]: I1009 16:59:13.171724 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86f1c527-f139-431a-ba26-ad6bdc2adefb" path="/var/lib/kubelet/pods/86f1c527-f139-431a-ba26-ad6bdc2adefb/volumes" Oct 09 16:59:18 crc kubenswrapper[4719]: I1009 16:59:18.161962 4719 scope.go:117] "RemoveContainer" containerID="20566c16e7a94ebae587edc494a04ff82794280a90c676730ee03cece4dfb016" Oct 09 16:59:18 crc kubenswrapper[4719]: E1009 16:59:18.162837 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:59:29 crc kubenswrapper[4719]: I1009 16:59:29.162805 4719 scope.go:117] "RemoveContainer" containerID="20566c16e7a94ebae587edc494a04ff82794280a90c676730ee03cece4dfb016" Oct 09 16:59:29 crc kubenswrapper[4719]: E1009 16:59:29.163672 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:59:42 crc kubenswrapper[4719]: I1009 16:59:42.162637 4719 scope.go:117] "RemoveContainer" 
containerID="20566c16e7a94ebae587edc494a04ff82794280a90c676730ee03cece4dfb016" Oct 09 16:59:42 crc kubenswrapper[4719]: E1009 16:59:42.163271 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 16:59:57 crc kubenswrapper[4719]: I1009 16:59:57.161465 4719 scope.go:117] "RemoveContainer" containerID="20566c16e7a94ebae587edc494a04ff82794280a90c676730ee03cece4dfb016" Oct 09 16:59:57 crc kubenswrapper[4719]: E1009 16:59:57.162273 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 17:00:00 crc kubenswrapper[4719]: I1009 17:00:00.147946 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333820-8m98l"] Oct 09 17:00:00 crc kubenswrapper[4719]: E1009 17:00:00.148975 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85de7f80-2aec-49cf-b646-e7639e5a5be0" containerName="registry-server" Oct 09 17:00:00 crc kubenswrapper[4719]: I1009 17:00:00.148990 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="85de7f80-2aec-49cf-b646-e7639e5a5be0" containerName="registry-server" Oct 09 17:00:00 crc kubenswrapper[4719]: E1009 17:00:00.149031 4719 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="95ed4cf1-e01a-4647-9916-22a6497f829c" containerName="registry-server" Oct 09 17:00:00 crc kubenswrapper[4719]: I1009 17:00:00.149045 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="95ed4cf1-e01a-4647-9916-22a6497f829c" containerName="registry-server" Oct 09 17:00:00 crc kubenswrapper[4719]: E1009 17:00:00.149062 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86f1c527-f139-431a-ba26-ad6bdc2adefb" containerName="gather" Oct 09 17:00:00 crc kubenswrapper[4719]: I1009 17:00:00.149070 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="86f1c527-f139-431a-ba26-ad6bdc2adefb" containerName="gather" Oct 09 17:00:00 crc kubenswrapper[4719]: E1009 17:00:00.149089 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85de7f80-2aec-49cf-b646-e7639e5a5be0" containerName="extract-utilities" Oct 09 17:00:00 crc kubenswrapper[4719]: I1009 17:00:00.149098 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="85de7f80-2aec-49cf-b646-e7639e5a5be0" containerName="extract-utilities" Oct 09 17:00:00 crc kubenswrapper[4719]: E1009 17:00:00.149121 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95ed4cf1-e01a-4647-9916-22a6497f829c" containerName="extract-content" Oct 09 17:00:00 crc kubenswrapper[4719]: I1009 17:00:00.149129 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="95ed4cf1-e01a-4647-9916-22a6497f829c" containerName="extract-content" Oct 09 17:00:00 crc kubenswrapper[4719]: E1009 17:00:00.149138 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95ed4cf1-e01a-4647-9916-22a6497f829c" containerName="extract-utilities" Oct 09 17:00:00 crc kubenswrapper[4719]: I1009 17:00:00.149146 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="95ed4cf1-e01a-4647-9916-22a6497f829c" containerName="extract-utilities" Oct 09 17:00:00 crc kubenswrapper[4719]: E1009 17:00:00.149158 4719 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="85de7f80-2aec-49cf-b646-e7639e5a5be0" containerName="extract-content" Oct 09 17:00:00 crc kubenswrapper[4719]: I1009 17:00:00.149165 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="85de7f80-2aec-49cf-b646-e7639e5a5be0" containerName="extract-content" Oct 09 17:00:00 crc kubenswrapper[4719]: E1009 17:00:00.149176 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86f1c527-f139-431a-ba26-ad6bdc2adefb" containerName="copy" Oct 09 17:00:00 crc kubenswrapper[4719]: I1009 17:00:00.149182 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="86f1c527-f139-431a-ba26-ad6bdc2adefb" containerName="copy" Oct 09 17:00:00 crc kubenswrapper[4719]: I1009 17:00:00.149447 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="95ed4cf1-e01a-4647-9916-22a6497f829c" containerName="registry-server" Oct 09 17:00:00 crc kubenswrapper[4719]: I1009 17:00:00.149476 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="86f1c527-f139-431a-ba26-ad6bdc2adefb" containerName="copy" Oct 09 17:00:00 crc kubenswrapper[4719]: I1009 17:00:00.149492 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="85de7f80-2aec-49cf-b646-e7639e5a5be0" containerName="registry-server" Oct 09 17:00:00 crc kubenswrapper[4719]: I1009 17:00:00.149510 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="86f1c527-f139-431a-ba26-ad6bdc2adefb" containerName="gather" Oct 09 17:00:00 crc kubenswrapper[4719]: I1009 17:00:00.150429 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333820-8m98l" Oct 09 17:00:00 crc kubenswrapper[4719]: I1009 17:00:00.153542 4719 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 09 17:00:00 crc kubenswrapper[4719]: I1009 17:00:00.153972 4719 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 09 17:00:00 crc kubenswrapper[4719]: I1009 17:00:00.156841 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333820-8m98l"] Oct 09 17:00:00 crc kubenswrapper[4719]: I1009 17:00:00.250031 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km55r\" (UniqueName: \"kubernetes.io/projected/e22da0ac-371b-41a9-ac26-05cde7b91d7d-kube-api-access-km55r\") pod \"collect-profiles-29333820-8m98l\" (UID: \"e22da0ac-371b-41a9-ac26-05cde7b91d7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333820-8m98l" Oct 09 17:00:00 crc kubenswrapper[4719]: I1009 17:00:00.250109 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e22da0ac-371b-41a9-ac26-05cde7b91d7d-config-volume\") pod \"collect-profiles-29333820-8m98l\" (UID: \"e22da0ac-371b-41a9-ac26-05cde7b91d7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333820-8m98l" Oct 09 17:00:00 crc kubenswrapper[4719]: I1009 17:00:00.250172 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e22da0ac-371b-41a9-ac26-05cde7b91d7d-secret-volume\") pod \"collect-profiles-29333820-8m98l\" (UID: \"e22da0ac-371b-41a9-ac26-05cde7b91d7d\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29333820-8m98l" Oct 09 17:00:00 crc kubenswrapper[4719]: I1009 17:00:00.352626 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km55r\" (UniqueName: \"kubernetes.io/projected/e22da0ac-371b-41a9-ac26-05cde7b91d7d-kube-api-access-km55r\") pod \"collect-profiles-29333820-8m98l\" (UID: \"e22da0ac-371b-41a9-ac26-05cde7b91d7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333820-8m98l" Oct 09 17:00:00 crc kubenswrapper[4719]: I1009 17:00:00.352806 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e22da0ac-371b-41a9-ac26-05cde7b91d7d-config-volume\") pod \"collect-profiles-29333820-8m98l\" (UID: \"e22da0ac-371b-41a9-ac26-05cde7b91d7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333820-8m98l" Oct 09 17:00:00 crc kubenswrapper[4719]: I1009 17:00:00.353019 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e22da0ac-371b-41a9-ac26-05cde7b91d7d-secret-volume\") pod \"collect-profiles-29333820-8m98l\" (UID: \"e22da0ac-371b-41a9-ac26-05cde7b91d7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333820-8m98l" Oct 09 17:00:00 crc kubenswrapper[4719]: I1009 17:00:00.354108 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e22da0ac-371b-41a9-ac26-05cde7b91d7d-config-volume\") pod \"collect-profiles-29333820-8m98l\" (UID: \"e22da0ac-371b-41a9-ac26-05cde7b91d7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333820-8m98l" Oct 09 17:00:00 crc kubenswrapper[4719]: I1009 17:00:00.359856 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e22da0ac-371b-41a9-ac26-05cde7b91d7d-secret-volume\") pod \"collect-profiles-29333820-8m98l\" (UID: \"e22da0ac-371b-41a9-ac26-05cde7b91d7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333820-8m98l" Oct 09 17:00:00 crc kubenswrapper[4719]: I1009 17:00:00.382915 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km55r\" (UniqueName: \"kubernetes.io/projected/e22da0ac-371b-41a9-ac26-05cde7b91d7d-kube-api-access-km55r\") pod \"collect-profiles-29333820-8m98l\" (UID: \"e22da0ac-371b-41a9-ac26-05cde7b91d7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333820-8m98l" Oct 09 17:00:00 crc kubenswrapper[4719]: I1009 17:00:00.513383 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333820-8m98l" Oct 09 17:00:00 crc kubenswrapper[4719]: I1009 17:00:00.972926 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333820-8m98l"] Oct 09 17:00:01 crc kubenswrapper[4719]: I1009 17:00:01.748549 4719 generic.go:334] "Generic (PLEG): container finished" podID="e22da0ac-371b-41a9-ac26-05cde7b91d7d" containerID="bd2ac9ba2f2cefdbd0cefbeeb5ec5a2752a8896e953ad4ed7d419a3df6d42b1e" exitCode=0 Oct 09 17:00:01 crc kubenswrapper[4719]: I1009 17:00:01.748630 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333820-8m98l" event={"ID":"e22da0ac-371b-41a9-ac26-05cde7b91d7d","Type":"ContainerDied","Data":"bd2ac9ba2f2cefdbd0cefbeeb5ec5a2752a8896e953ad4ed7d419a3df6d42b1e"} Oct 09 17:00:01 crc kubenswrapper[4719]: I1009 17:00:01.748871 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333820-8m98l" 
event={"ID":"e22da0ac-371b-41a9-ac26-05cde7b91d7d","Type":"ContainerStarted","Data":"ea197ace4710d341e68109d555ce5f53732ee50bdd55108171cde422c90b51d3"} Oct 09 17:00:03 crc kubenswrapper[4719]: I1009 17:00:03.121601 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333820-8m98l" Oct 09 17:00:03 crc kubenswrapper[4719]: I1009 17:00:03.317315 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e22da0ac-371b-41a9-ac26-05cde7b91d7d-secret-volume\") pod \"e22da0ac-371b-41a9-ac26-05cde7b91d7d\" (UID: \"e22da0ac-371b-41a9-ac26-05cde7b91d7d\") " Oct 09 17:00:03 crc kubenswrapper[4719]: I1009 17:00:03.317468 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e22da0ac-371b-41a9-ac26-05cde7b91d7d-config-volume\") pod \"e22da0ac-371b-41a9-ac26-05cde7b91d7d\" (UID: \"e22da0ac-371b-41a9-ac26-05cde7b91d7d\") " Oct 09 17:00:03 crc kubenswrapper[4719]: I1009 17:00:03.317617 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km55r\" (UniqueName: \"kubernetes.io/projected/e22da0ac-371b-41a9-ac26-05cde7b91d7d-kube-api-access-km55r\") pod \"e22da0ac-371b-41a9-ac26-05cde7b91d7d\" (UID: \"e22da0ac-371b-41a9-ac26-05cde7b91d7d\") " Oct 09 17:00:03 crc kubenswrapper[4719]: I1009 17:00:03.318319 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e22da0ac-371b-41a9-ac26-05cde7b91d7d-config-volume" (OuterVolumeSpecName: "config-volume") pod "e22da0ac-371b-41a9-ac26-05cde7b91d7d" (UID: "e22da0ac-371b-41a9-ac26-05cde7b91d7d"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 17:00:03 crc kubenswrapper[4719]: I1009 17:00:03.319059 4719 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e22da0ac-371b-41a9-ac26-05cde7b91d7d-config-volume\") on node \"crc\" DevicePath \"\"" Oct 09 17:00:03 crc kubenswrapper[4719]: I1009 17:00:03.332158 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e22da0ac-371b-41a9-ac26-05cde7b91d7d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e22da0ac-371b-41a9-ac26-05cde7b91d7d" (UID: "e22da0ac-371b-41a9-ac26-05cde7b91d7d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 17:00:03 crc kubenswrapper[4719]: I1009 17:00:03.332559 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e22da0ac-371b-41a9-ac26-05cde7b91d7d-kube-api-access-km55r" (OuterVolumeSpecName: "kube-api-access-km55r") pod "e22da0ac-371b-41a9-ac26-05cde7b91d7d" (UID: "e22da0ac-371b-41a9-ac26-05cde7b91d7d"). InnerVolumeSpecName "kube-api-access-km55r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 17:00:03 crc kubenswrapper[4719]: I1009 17:00:03.420887 4719 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e22da0ac-371b-41a9-ac26-05cde7b91d7d-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 09 17:00:03 crc kubenswrapper[4719]: I1009 17:00:03.420931 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-km55r\" (UniqueName: \"kubernetes.io/projected/e22da0ac-371b-41a9-ac26-05cde7b91d7d-kube-api-access-km55r\") on node \"crc\" DevicePath \"\"" Oct 09 17:00:03 crc kubenswrapper[4719]: I1009 17:00:03.789685 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333820-8m98l" event={"ID":"e22da0ac-371b-41a9-ac26-05cde7b91d7d","Type":"ContainerDied","Data":"ea197ace4710d341e68109d555ce5f53732ee50bdd55108171cde422c90b51d3"} Oct 09 17:00:03 crc kubenswrapper[4719]: I1009 17:00:03.789731 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea197ace4710d341e68109d555ce5f53732ee50bdd55108171cde422c90b51d3" Oct 09 17:00:03 crc kubenswrapper[4719]: I1009 17:00:03.789773 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333820-8m98l" Oct 09 17:00:04 crc kubenswrapper[4719]: I1009 17:00:04.204136 4719 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333775-ks2hw"] Oct 09 17:00:04 crc kubenswrapper[4719]: I1009 17:00:04.213991 4719 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333775-ks2hw"] Oct 09 17:00:05 crc kubenswrapper[4719]: I1009 17:00:05.176367 4719 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="044dad3e-af09-4af6-bc36-5e0d89f1c7b3" path="/var/lib/kubelet/pods/044dad3e-af09-4af6-bc36-5e0d89f1c7b3/volumes" Oct 09 17:00:12 crc kubenswrapper[4719]: I1009 17:00:12.162331 4719 scope.go:117] "RemoveContainer" containerID="20566c16e7a94ebae587edc494a04ff82794280a90c676730ee03cece4dfb016" Oct 09 17:00:12 crc kubenswrapper[4719]: E1009 17:00:12.166017 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 17:00:27 crc kubenswrapper[4719]: I1009 17:00:27.167070 4719 scope.go:117] "RemoveContainer" containerID="20566c16e7a94ebae587edc494a04ff82794280a90c676730ee03cece4dfb016" Oct 09 17:00:27 crc kubenswrapper[4719]: E1009 17:00:27.167824 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 17:00:28 crc kubenswrapper[4719]: I1009 17:00:28.816142 4719 scope.go:117] "RemoveContainer" containerID="1c114b175c938b52c7de9d20c67d12f5490dcbf54941ccdd8b69aa032233db68" Oct 09 17:00:40 crc kubenswrapper[4719]: I1009 17:00:40.161159 4719 scope.go:117] "RemoveContainer" containerID="20566c16e7a94ebae587edc494a04ff82794280a90c676730ee03cece4dfb016" Oct 09 17:00:40 crc kubenswrapper[4719]: E1009 17:00:40.162063 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 17:00:51 crc kubenswrapper[4719]: I1009 17:00:51.161964 4719 scope.go:117] "RemoveContainer" containerID="20566c16e7a94ebae587edc494a04ff82794280a90c676730ee03cece4dfb016" Oct 09 17:00:51 crc kubenswrapper[4719]: E1009 17:00:51.164111 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 17:01:00 crc kubenswrapper[4719]: I1009 17:01:00.144748 4719 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29333821-vrztl"] Oct 09 17:01:00 crc kubenswrapper[4719]: E1009 17:01:00.145941 4719 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e22da0ac-371b-41a9-ac26-05cde7b91d7d" 
containerName="collect-profiles" Oct 09 17:01:00 crc kubenswrapper[4719]: I1009 17:01:00.145962 4719 state_mem.go:107] "Deleted CPUSet assignment" podUID="e22da0ac-371b-41a9-ac26-05cde7b91d7d" containerName="collect-profiles" Oct 09 17:01:00 crc kubenswrapper[4719]: I1009 17:01:00.146556 4719 memory_manager.go:354] "RemoveStaleState removing state" podUID="e22da0ac-371b-41a9-ac26-05cde7b91d7d" containerName="collect-profiles" Oct 09 17:01:00 crc kubenswrapper[4719]: I1009 17:01:00.147435 4719 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29333821-vrztl" Oct 09 17:01:00 crc kubenswrapper[4719]: I1009 17:01:00.161044 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29333821-vrztl"] Oct 09 17:01:00 crc kubenswrapper[4719]: I1009 17:01:00.331854 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/410649bd-c5e1-456c-99fa-cdfe37419905-config-data\") pod \"keystone-cron-29333821-vrztl\" (UID: \"410649bd-c5e1-456c-99fa-cdfe37419905\") " pod="openstack/keystone-cron-29333821-vrztl" Oct 09 17:01:00 crc kubenswrapper[4719]: I1009 17:01:00.332256 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/410649bd-c5e1-456c-99fa-cdfe37419905-combined-ca-bundle\") pod \"keystone-cron-29333821-vrztl\" (UID: \"410649bd-c5e1-456c-99fa-cdfe37419905\") " pod="openstack/keystone-cron-29333821-vrztl" Oct 09 17:01:00 crc kubenswrapper[4719]: I1009 17:01:00.332333 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78xzl\" (UniqueName: \"kubernetes.io/projected/410649bd-c5e1-456c-99fa-cdfe37419905-kube-api-access-78xzl\") pod \"keystone-cron-29333821-vrztl\" (UID: \"410649bd-c5e1-456c-99fa-cdfe37419905\") " 
pod="openstack/keystone-cron-29333821-vrztl" Oct 09 17:01:00 crc kubenswrapper[4719]: I1009 17:01:00.332417 4719 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/410649bd-c5e1-456c-99fa-cdfe37419905-fernet-keys\") pod \"keystone-cron-29333821-vrztl\" (UID: \"410649bd-c5e1-456c-99fa-cdfe37419905\") " pod="openstack/keystone-cron-29333821-vrztl" Oct 09 17:01:00 crc kubenswrapper[4719]: I1009 17:01:00.434070 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78xzl\" (UniqueName: \"kubernetes.io/projected/410649bd-c5e1-456c-99fa-cdfe37419905-kube-api-access-78xzl\") pod \"keystone-cron-29333821-vrztl\" (UID: \"410649bd-c5e1-456c-99fa-cdfe37419905\") " pod="openstack/keystone-cron-29333821-vrztl" Oct 09 17:01:00 crc kubenswrapper[4719]: I1009 17:01:00.434158 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/410649bd-c5e1-456c-99fa-cdfe37419905-fernet-keys\") pod \"keystone-cron-29333821-vrztl\" (UID: \"410649bd-c5e1-456c-99fa-cdfe37419905\") " pod="openstack/keystone-cron-29333821-vrztl" Oct 09 17:01:00 crc kubenswrapper[4719]: I1009 17:01:00.434261 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/410649bd-c5e1-456c-99fa-cdfe37419905-config-data\") pod \"keystone-cron-29333821-vrztl\" (UID: \"410649bd-c5e1-456c-99fa-cdfe37419905\") " pod="openstack/keystone-cron-29333821-vrztl" Oct 09 17:01:00 crc kubenswrapper[4719]: I1009 17:01:00.434296 4719 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/410649bd-c5e1-456c-99fa-cdfe37419905-combined-ca-bundle\") pod \"keystone-cron-29333821-vrztl\" (UID: \"410649bd-c5e1-456c-99fa-cdfe37419905\") " 
pod="openstack/keystone-cron-29333821-vrztl" Oct 09 17:01:00 crc kubenswrapper[4719]: I1009 17:01:00.447861 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/410649bd-c5e1-456c-99fa-cdfe37419905-config-data\") pod \"keystone-cron-29333821-vrztl\" (UID: \"410649bd-c5e1-456c-99fa-cdfe37419905\") " pod="openstack/keystone-cron-29333821-vrztl" Oct 09 17:01:00 crc kubenswrapper[4719]: I1009 17:01:00.452316 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/410649bd-c5e1-456c-99fa-cdfe37419905-fernet-keys\") pod \"keystone-cron-29333821-vrztl\" (UID: \"410649bd-c5e1-456c-99fa-cdfe37419905\") " pod="openstack/keystone-cron-29333821-vrztl" Oct 09 17:01:00 crc kubenswrapper[4719]: I1009 17:01:00.455605 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/410649bd-c5e1-456c-99fa-cdfe37419905-combined-ca-bundle\") pod \"keystone-cron-29333821-vrztl\" (UID: \"410649bd-c5e1-456c-99fa-cdfe37419905\") " pod="openstack/keystone-cron-29333821-vrztl" Oct 09 17:01:00 crc kubenswrapper[4719]: I1009 17:01:00.456198 4719 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78xzl\" (UniqueName: \"kubernetes.io/projected/410649bd-c5e1-456c-99fa-cdfe37419905-kube-api-access-78xzl\") pod \"keystone-cron-29333821-vrztl\" (UID: \"410649bd-c5e1-456c-99fa-cdfe37419905\") " pod="openstack/keystone-cron-29333821-vrztl" Oct 09 17:01:00 crc kubenswrapper[4719]: I1009 17:01:00.484540 4719 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29333821-vrztl" Oct 09 17:01:00 crc kubenswrapper[4719]: I1009 17:01:00.961172 4719 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29333821-vrztl"] Oct 09 17:01:01 crc kubenswrapper[4719]: I1009 17:01:01.376966 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29333821-vrztl" event={"ID":"410649bd-c5e1-456c-99fa-cdfe37419905","Type":"ContainerStarted","Data":"9458f976c9b342eb5b898df20eebfd71009f09bdf4dbb28acd68902dae4b6b04"} Oct 09 17:01:01 crc kubenswrapper[4719]: I1009 17:01:01.377322 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29333821-vrztl" event={"ID":"410649bd-c5e1-456c-99fa-cdfe37419905","Type":"ContainerStarted","Data":"c8b1d3eb6ecbfb233cb31ece24cb5f95ceef523ee295caf3ffc6f75eaf9d0eb4"} Oct 09 17:01:01 crc kubenswrapper[4719]: I1009 17:01:01.395241 4719 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29333821-vrztl" podStartSLOduration=1.39522493 podStartE2EDuration="1.39522493s" podCreationTimestamp="2025-10-09 17:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 17:01:01.390712817 +0000 UTC m=+6166.900424102" watchObservedRunningTime="2025-10-09 17:01:01.39522493 +0000 UTC m=+6166.904936215" Oct 09 17:01:05 crc kubenswrapper[4719]: I1009 17:01:05.420327 4719 generic.go:334] "Generic (PLEG): container finished" podID="410649bd-c5e1-456c-99fa-cdfe37419905" containerID="9458f976c9b342eb5b898df20eebfd71009f09bdf4dbb28acd68902dae4b6b04" exitCode=0 Oct 09 17:01:05 crc kubenswrapper[4719]: I1009 17:01:05.421503 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29333821-vrztl" event={"ID":"410649bd-c5e1-456c-99fa-cdfe37419905","Type":"ContainerDied","Data":"9458f976c9b342eb5b898df20eebfd71009f09bdf4dbb28acd68902dae4b6b04"} 
Oct 09 17:01:06 crc kubenswrapper[4719]: I1009 17:01:06.161472 4719 scope.go:117] "RemoveContainer" containerID="20566c16e7a94ebae587edc494a04ff82794280a90c676730ee03cece4dfb016" Oct 09 17:01:06 crc kubenswrapper[4719]: E1009 17:01:06.162474 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 17:01:06 crc kubenswrapper[4719]: I1009 17:01:06.877225 4719 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29333821-vrztl" Oct 09 17:01:07 crc kubenswrapper[4719]: I1009 17:01:07.000723 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/410649bd-c5e1-456c-99fa-cdfe37419905-combined-ca-bundle\") pod \"410649bd-c5e1-456c-99fa-cdfe37419905\" (UID: \"410649bd-c5e1-456c-99fa-cdfe37419905\") " Oct 09 17:01:07 crc kubenswrapper[4719]: I1009 17:01:07.000911 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/410649bd-c5e1-456c-99fa-cdfe37419905-config-data\") pod \"410649bd-c5e1-456c-99fa-cdfe37419905\" (UID: \"410649bd-c5e1-456c-99fa-cdfe37419905\") " Oct 09 17:01:07 crc kubenswrapper[4719]: I1009 17:01:07.001689 4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78xzl\" (UniqueName: \"kubernetes.io/projected/410649bd-c5e1-456c-99fa-cdfe37419905-kube-api-access-78xzl\") pod \"410649bd-c5e1-456c-99fa-cdfe37419905\" (UID: \"410649bd-c5e1-456c-99fa-cdfe37419905\") " Oct 09 17:01:07 crc kubenswrapper[4719]: I1009 17:01:07.001768 
4719 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/410649bd-c5e1-456c-99fa-cdfe37419905-fernet-keys\") pod \"410649bd-c5e1-456c-99fa-cdfe37419905\" (UID: \"410649bd-c5e1-456c-99fa-cdfe37419905\") " Oct 09 17:01:07 crc kubenswrapper[4719]: I1009 17:01:07.007066 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/410649bd-c5e1-456c-99fa-cdfe37419905-kube-api-access-78xzl" (OuterVolumeSpecName: "kube-api-access-78xzl") pod "410649bd-c5e1-456c-99fa-cdfe37419905" (UID: "410649bd-c5e1-456c-99fa-cdfe37419905"). InnerVolumeSpecName "kube-api-access-78xzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 17:01:07 crc kubenswrapper[4719]: I1009 17:01:07.007320 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/410649bd-c5e1-456c-99fa-cdfe37419905-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "410649bd-c5e1-456c-99fa-cdfe37419905" (UID: "410649bd-c5e1-456c-99fa-cdfe37419905"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 17:01:07 crc kubenswrapper[4719]: I1009 17:01:07.037799 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/410649bd-c5e1-456c-99fa-cdfe37419905-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "410649bd-c5e1-456c-99fa-cdfe37419905" (UID: "410649bd-c5e1-456c-99fa-cdfe37419905"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 17:01:07 crc kubenswrapper[4719]: I1009 17:01:07.060784 4719 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/410649bd-c5e1-456c-99fa-cdfe37419905-config-data" (OuterVolumeSpecName: "config-data") pod "410649bd-c5e1-456c-99fa-cdfe37419905" (UID: "410649bd-c5e1-456c-99fa-cdfe37419905"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 17:01:07 crc kubenswrapper[4719]: I1009 17:01:07.104176 4719 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/410649bd-c5e1-456c-99fa-cdfe37419905-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 17:01:07 crc kubenswrapper[4719]: I1009 17:01:07.104205 4719 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78xzl\" (UniqueName: \"kubernetes.io/projected/410649bd-c5e1-456c-99fa-cdfe37419905-kube-api-access-78xzl\") on node \"crc\" DevicePath \"\"" Oct 09 17:01:07 crc kubenswrapper[4719]: I1009 17:01:07.104215 4719 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/410649bd-c5e1-456c-99fa-cdfe37419905-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 09 17:01:07 crc kubenswrapper[4719]: I1009 17:01:07.104223 4719 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/410649bd-c5e1-456c-99fa-cdfe37419905-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 17:01:07 crc kubenswrapper[4719]: I1009 17:01:07.439770 4719 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29333821-vrztl" event={"ID":"410649bd-c5e1-456c-99fa-cdfe37419905","Type":"ContainerDied","Data":"c8b1d3eb6ecbfb233cb31ece24cb5f95ceef523ee295caf3ffc6f75eaf9d0eb4"} Oct 09 17:01:07 crc kubenswrapper[4719]: I1009 17:01:07.439810 4719 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8b1d3eb6ecbfb233cb31ece24cb5f95ceef523ee295caf3ffc6f75eaf9d0eb4" Oct 09 17:01:07 crc kubenswrapper[4719]: I1009 17:01:07.439831 4719 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29333821-vrztl" Oct 09 17:01:18 crc kubenswrapper[4719]: I1009 17:01:18.164241 4719 scope.go:117] "RemoveContainer" containerID="20566c16e7a94ebae587edc494a04ff82794280a90c676730ee03cece4dfb016" Oct 09 17:01:18 crc kubenswrapper[4719]: E1009 17:01:18.165774 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 17:01:29 crc kubenswrapper[4719]: I1009 17:01:29.161790 4719 scope.go:117] "RemoveContainer" containerID="20566c16e7a94ebae587edc494a04ff82794280a90c676730ee03cece4dfb016" Oct 09 17:01:29 crc kubenswrapper[4719]: E1009 17:01:29.162846 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71" Oct 09 17:01:44 crc kubenswrapper[4719]: I1009 17:01:44.160969 4719 scope.go:117] "RemoveContainer" containerID="20566c16e7a94ebae587edc494a04ff82794280a90c676730ee03cece4dfb016" Oct 09 17:01:44 crc kubenswrapper[4719]: E1009 17:01:44.161850 4719 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p9kwh_openshift-machine-config-operator(99353559-5b0b-4a9e-b759-0321ef3a8a71)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-p9kwh" podUID="99353559-5b0b-4a9e-b759-0321ef3a8a71"